R/SS.R
dot-ESS.Rd

Calculates the explained sum of squares \(\left( \mathrm{ESS} \right)\) using $$ \mathrm{ESS} = \sum_{i = 1}^{n} \left( \hat{Y}_{i} - \bar{Y} \right)^2 \\ = \sum_{i = 1}^{n} \left( \hat{\beta}_{1} + \hat{\beta}_{2} X_{2i} + \hat{\beta}_{3} X_{3i} + \dots + \hat{\beta}_{k} X_{ki} - \bar{Y} \right)^2 $$ In matrix form $$ \mathrm{ESS} = \sum_{i = 1}^{n} \left( \mathbf{\hat{y}} - \mathbf{\bar{Y}} \right)^2 \\ = \sum_{i = 1}^{n} \left( \mathbf{X} \boldsymbol{\hat{\beta}} - \mathbf{\bar{Y}} \right)^2 $$ where \(\mathbf{\hat{y}}\) \(\left( \mathbf{X} \boldsymbol{\hat{\beta}} \right)\) is an \(n \times 1\) matrix of predicted values of \(\mathbf{y}\), and \(\mathbf{\bar{Y}}\) is the mean of \(\mathbf{y}\). An equivalent computational matrix formula is $$ \mathrm{ESS} = \boldsymbol{\hat{\beta}}^{\prime} \mathbf{X}^{\prime} \mathbf{X} \boldsymbol{\hat{\beta}} - n \mathbf{\bar{Y}}^{2}. $$ Note that $$ \mathrm{TSS} = \mathrm{ESS} + \mathrm{RSS} . $$
.ESS(yhat = NULL, ybar = NULL, X, y, betahat = NULL)
| Argument | Description |
|---|---|
| yhat | Numeric vector of length \(n\). Predicted values of \(\mathbf{y}\) \(\left( \mathbf{\hat{y}} = \mathbf{X} \boldsymbol{\hat{\beta}} \right)\). Optional. |
| ybar | Numeric. Mean of \(\mathbf{y}\) \(\left( \mathbf{\bar{Y}} \right)\). Optional. |
| X | \(n \times k\) numeric matrix. The data matrix \(\mathbf{X}\) of \(n\) observations on \(k\) regressors, the first column being a column of ones when an intercept \(\hat{\beta}_{1}\) is included. |
| y | Numeric vector of length \(n\). The regressand \(\mathbf{y}\). |
| betahat | Numeric vector of length \(k\). Estimated regression coefficients \(\boldsymbol{\hat{\beta}}\). Optional. |
Returns the explained sum of squares \(\left( \mathrm{ESS} \right)\).
If `yhat = NULL`, it is computed using `yhat()`
with `X` and `y` as required arguments and `betahat` as an optional argument.
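The two formulas above can be checked against each other in a few lines of R. This is a minimal sketch, not the package's implementation: the data are simulated, and it assumes `X` carries a leading column of ones so that the fitted values sum to \(n \bar{Y}\), which is what makes the computational formula equivalent to the definitional one.

```r
# Sketch: compute ESS by definition and by the computational matrix formula.
# Assumes X includes a leading column of ones (intercept model).
set.seed(42)
n <- 100
x2 <- rnorm(n)
x3 <- rnorm(n)
X <- cbind(1, x2, x3)
y <- 1 + 2 * x2 - 0.5 * x3 + rnorm(n)

# OLS estimates: betahat = (X'X)^{-1} X'y
betahat <- solve(crossprod(X), crossprod(X, y))
yhat <- X %*% betahat
ybar <- mean(y)

# Definition: sum of squared deviations of fitted values from the mean
ess_def <- sum((yhat - ybar)^2)

# Computational formula: betahat' X'X betahat - n * ybar^2
ess_mat <- drop(t(betahat) %*% crossprod(X) %*% betahat - n * ybar^2)

isTRUE(all.equal(ess_def, ess_mat))

# Decomposition check: TSS = ESS + RSS
tss <- sum((y - ybar)^2)
rss <- sum((y - yhat)^2)
isTRUE(all.equal(tss, ess_def + rss))
```

Both checks hold up to floating-point tolerance; without the intercept column the shortcut \(\boldsymbol{\hat{\beta}}^{\prime} \mathbf{X}^{\prime} \mathbf{X} \boldsymbol{\hat{\beta}} - n \mathbf{\bar{Y}}^{2}\) is not guaranteed to equal the definitional sum.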
Wikipedia: Residual Sum of Squares
Wikipedia: Explained Sum of Squares
Wikipedia: Total Sum of Squares
Wikipedia: Coefficient of Determination
Ivan Jacob Agaloos Pesigan