Calculates the residual sum of squares \(\left( \mathrm{RSS} \right)\) using $$ \mathrm{RSS} = \sum_{i = 1}^{n} \left( Y_i - \hat{Y}_i \right)^2 \\ = \sum_{i = 1}^{n} \left( Y_i - \left[ \hat{\beta}_{1} + \hat{\beta}_{2} X_{2i} + \hat{\beta}_{3} X_{3i} + \dots + \hat{\beta}_{k} X_{ki} \right] \right)^2 \\ = \sum_{i = 1}^{n} \left( Y_i - \hat{\beta}_{1} - \hat{\beta}_{2} X_{2i} - \hat{\beta}_{3} X_{3i} - \dots - \hat{\beta}_{k} X_{ki} \right)^2 . $$

In matrix form, $$ \mathrm{RSS} = \left( \mathbf{y} - \mathbf{\hat{y}} \right)^{\prime} \left( \mathbf{y} - \mathbf{\hat{y}} \right) = \left( \mathbf{y} - \mathbf{X} \boldsymbol{\hat{\beta}} \right)^{\prime} \left( \mathbf{y} - \mathbf{X} \boldsymbol{\hat{\beta}} \right) , $$ or simply $$ \mathrm{RSS} = \sum_{i = 1}^{n} \hat{\varepsilon}_{i}^{2} = \boldsymbol{\hat{\varepsilon}}^{\prime} \boldsymbol{\hat{\varepsilon}} , $$ where \(\boldsymbol{\hat{\varepsilon}}\) is the \(n \times 1\) vector of residuals, that is, the difference between the observed and predicted values of \(\mathbf{y}\) \(\left( \boldsymbol{\hat{\varepsilon}} = \mathbf{y} - \mathbf{\hat{y}} \right)\).

An equivalent computational matrix formula is $$ \mathrm{RSS} = \mathbf{y}^{\prime} \mathbf{y} - 2 \boldsymbol{\hat{\beta}}^{\prime} \mathbf{X}^{\prime} \mathbf{y} + \boldsymbol{\hat{\beta}}^{\prime} \mathbf{X}^{\prime} \mathbf{X} \boldsymbol{\hat{\beta}} . $$

Note that $$ \mathrm{TSS} = \mathrm{ESS} + \mathrm{RSS} . $$
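As a quick numerical check of these formulas, here is a minimal R sketch (the simulated data and object names such as `betahat` are illustrative only, not part of the package):

```r
# Minimal sketch: compute RSS from simulated data via ordinary least squares.
set.seed(42)
n <- 100
X <- cbind(1, rnorm(n), rnorm(n))      # data matrix with a constant column
beta <- c(1, 0.5, -0.25)               # true coefficients (illustrative)
y <- as.vector(X %*% beta + rnorm(n))  # regressand

betahat <- solve(crossprod(X), crossprod(X, y))  # OLS estimates
epsilonhat <- y - as.vector(X %*% betahat)       # residuals

# RSS as the sum of squared residuals
rss1 <- sum(epsilonhat^2)

# RSS via the computational matrix formula
rss2 <- drop(
  crossprod(y) - 2 * t(betahat) %*% crossprod(X, y) +
    t(betahat) %*% crossprod(X) %*% betahat
)

all.equal(rss1, rss2)  # both forms agree
```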
.RSS(epsilonhat = NULL, X, y, betahat = NULL)
| Argument | Description |
|---|---|
| epsilonhat | Numeric vector of length \(n\). Residuals \(\left( \boldsymbol{\hat{\varepsilon}} = \mathbf{y} - \mathbf{\hat{y}} \right)\). |
| X | Matrix of dimensions \(n \times k\). The data matrix \(\mathbf{X}\), that is, \(n\) observations on \(k\) regressors, including a column of ones for the intercept. |
| y | Numeric vector of length \(n\). The regressand \(\mathbf{y}\). |
| betahat | Numeric vector of length \(k\). Estimated regression coefficients \(\boldsymbol{\hat{\beta}}\). |
Returns the residual sum of squares \(\left( \mathrm{RSS} \right)\).

If `epsilonhat = NULL`, \(\mathrm{RSS}\) is computed with `X` and `y` as required arguments and `betahat` as an optional argument.
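A hedged usage sketch of `.RSS()`, assuming the objects (`epsilonhat`, `X`, `y`, `betahat`) from the sketch above are in the workspace; all three calls are expected to return the same value:

```r
# Supplying the residual vector directly
.RSS(epsilonhat = epsilonhat)

# Letting the function work from X and y; betahat is optional
# (when omitted, it is presumably estimated from X and y)
.RSS(X = X, y = y, betahat = betahat)
.RSS(X = X, y = y)
```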
Wikipedia: Residual Sum of Squares
Wikipedia: Explained Sum of Squares
Wikipedia: Total Sum of Squares
Wikipedia: Coefficient of Determination
Ivan Jacob Agaloos Pesigan