Derives the slopes \(\boldsymbol{\beta}_{2, \cdots, k}\) of a linear regression model (\(\boldsymbol{\beta}\) minus the intercept) as a function of covariances.

slopes(X, y)

Arguments

X

n by k numeric matrix. The data matrix \(\mathbf{X}\) (also known as the design matrix, model matrix, or regressor matrix) is an \(n \times k\) matrix of \(n\) observations of \(k\) regressors, which includes a regressor whose value is 1 for each observation in the first column.

y

Numeric vector of length n or n by 1 matrix. The vector \(\mathbf{y}\) is an \(n \times 1\) vector of observations on the regressand variable.

Value

Returns the slopes \(\boldsymbol{\beta}_{2, \cdots, k}\) of a linear regression model derived from the variance-covariance matrix.

Details

The linear regression slopes are calculated using $$ \boldsymbol{\beta}_{2, \cdots, k} = \boldsymbol{\Sigma}_{\mathbf{X}}^{-1} \boldsymbol{\sigma}_{\mathbf{y}, \mathbf{X}} $$

where

  • \(\boldsymbol{\Sigma}_{\mathbf{X}}\) is the \(p \times p\) covariance matrix of the regressor variables \(X_2, X_3, \cdots, X_k\), where \(p = k - 1\), and

  • \(\boldsymbol{\sigma}_{\mathbf{y}, \mathbf{X}}\) is the \(p \times 1\) column vector of the covariances between the regressand variable \(y\) and the regressor variables \(X_2, X_3, \cdots, X_k\).
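The computation above can be sketched in a few lines of base R. This is an illustrative sketch, not the package source: `slopes_sketch` is a hypothetical name, and the example assumes `X` carries the constant regressor in its first column as described under Arguments.

```r
# Sketch: recover the regression slopes from sample covariances.
# Assumes the first column of X is the constant regressor (all ones).
slopes_sketch <- function(X, y) {
  X2 <- X[, -1, drop = FALSE]  # drop the constant; keep X_2, ..., X_k
  # solve(A, b) computes A^{-1} b, i.e. Sigma_X^{-1} sigma_{y, X}
  solve(cov(X2), cov(X2, y))
}

# Check against lm() on simulated data
set.seed(42)
n <- 100
x2 <- rnorm(n)
x3 <- rnorm(n)
X <- cbind(1, x2, x3)
y <- 1 + 2 * x2 - 3 * x3 + rnorm(n)
b <- slopes_sketch(X, y)
```

Because the sample covariances in \(\boldsymbol{\Sigma}_{\mathbf{X}}\) and \(\boldsymbol{\sigma}_{\mathbf{y}, \mathbf{X}}\) share the same \(n - 1\) denominator, it cancels, and the result matches the slope coefficients from `lm(y ~ x2 + x3)` exactly (up to numerical precision).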

See also

Other parameter functions: .intercept(), .slopesprime(), .slopes(), intercept(), sigma2epsilon(), slopesprime()

Author

Ivan Jacob Agaloos Pesigan