Estimates of Regression Slopes \(\boldsymbol{\hat{\beta}}_{2, \cdots, k}\)

slopeshat(X, y)

Arguments

X

n by k numeric matrix. The data matrix \(\mathbf{X}\) (also known as the design matrix, model matrix, or regressor matrix) is an \(n \times k\) matrix of \(n\) observations of \(k\) regressors, whose first column is a constant regressor equal to 1 for every observation.

y

Numeric vector of length n or n by 1 matrix. The vector \(\mathbf{y}\) is an \(n \times 1\) vector of observations on the regressand variable.

Value

Returns the estimated slopes \(\boldsymbol{\hat{\beta}}_{2, \cdots, k}\) of a linear regression model derived from the estimated variance-covariance matrix.

Details

Estimates of the linear regression slopes are calculated using $$ \boldsymbol{\hat{\beta}}_{2, \cdots, k} = \boldsymbol{\hat{\Sigma}}_{\mathbf{X}}^{-1} \boldsymbol{\hat{\sigma}}_{\mathbf{y}, \mathbf{X}} $$

where

  • \(\boldsymbol{\hat{\Sigma}}_{\mathbf{X}}\) is the \(p \times p\) covariance matrix of the regressor variables \(X_2, X_3, \cdots, X_k\), where \(p = k - 1\), and

  • \(\boldsymbol{\hat{\sigma}}_{\mathbf{y}, \mathbf{X}}\) is the \(p \times 1\) column vector of covariances between the regressand variable \(y\) and the regressor variables \(X_2, X_3, \cdots, X_k\).
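The estimator above can be sketched in a few lines of base R. This is a hypothetical illustration (the function name `slopeshat_sketch` is not part of the package), assuming the first column of \(\mathbf{X}\) is the constant regressor; it forms the sample covariances with `cov()` and solves the linear system instead of explicitly inverting \(\boldsymbol{\hat{\Sigma}}_{\mathbf{X}}\).

```r
# Sketch of the covariance-based slope estimator.
# Assumes the first column of X is the constant regressor.
slopeshat_sketch <- function(X, y) {
  X <- X[, -1, drop = FALSE]  # drop the constant column
  Sigma_X <- cov(X)           # p x p covariance matrix of the regressors
  sigma_yX <- cov(X, y)       # p x 1 covariances with the regressand
  solve(Sigma_X, sigma_yX)    # Sigma_X^{-1} %*% sigma_yX
}
```

Because the \(n - 1\) divisors in the sample covariances cancel, these estimates coincide exactly with the slope entries of the ordinary least squares solution `solve(t(X) %*% X, t(X) %*% y)`.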


Author

Ivan Jacob Agaloos Pesigan

Examples

# Simple regression ------------------------------------------------
X <- jeksterslabRdatarepo::wages.matrix[["X"]]
X <- X[, c(1, ncol(X))]
y <- jeksterslabRdatarepo::wages.matrix[["y"]]
slopeshat(X = X, y = y)
#>        slopes
#> age  0.197486

# Multiple regression ----------------------------------------------
X <- jeksterslabRdatarepo::wages.matrix[["X"]]
# age is removed
X <- X[, -ncol(X)]
slopeshat(X = X, y = y)
#>                slopes
#> gender     -3.0748755
#> race       -1.5653133
#> union       1.0959758
#> education   1.3703010
#> experience  0.1666065