Derives the slopes \(\boldsymbol{\beta}_{2, \cdots, k}\) of a linear regression model (that is, \(\boldsymbol{\beta}\) without the intercept) as a function of covariances.

.slopes(SigmaX = NULL, sigmayX = NULL, X, y)

Arguments

SigmaX

p by p numeric matrix. \(p \times p\) matrix of variances and covariances between regressor variables \({X}_{2}, {X}_{3}, \cdots, {X}_{k}\) \(\left( \boldsymbol{\Sigma}_{\mathbf{X}} \right)\).

sigmayX

Numeric vector of length p or p by 1 matrix. \(p \times 1\) vector of covariances between the regressand variable \(y\) and the regressor variables \(X_2, X_3, \cdots, X_k\) \(\left( \boldsymbol{\sigma}_{\mathbf{y}, \mathbf{X}} = \left\{ \sigma_{y, X_2}, \sigma_{y, X_3}, \cdots, \sigma_{y, X_k} \right\}^{T} \right)\).

X

n by k numeric matrix. The data matrix \(\mathbf{X}\) (also known as the design matrix, model matrix, or regressor matrix) is an \(n \times k\) matrix of \(n\) observations of \(k\) regressors, whose first column is a regressor equal to 1 for every observation.

y

Numeric vector of length n or n by 1 matrix. The vector \(\mathbf{y}\) is an \(n \times 1\) vector of observations on the regressand variable.

Value

Returns the slopes \(\boldsymbol{\beta}_{2, \cdots, k}\) of a linear regression model derived from the variance-covariance matrix.

Details

The linear regression slopes are calculated using $$ \boldsymbol{\beta}_{2, \cdots, k} = \boldsymbol{\Sigma}_{\mathbf{X}}^{-1} \boldsymbol{\sigma}_{\mathbf{y}, \mathbf{X}} $$

where

  • \(\boldsymbol{\Sigma}_{\mathbf{X}}\) is the \(p \times p\) covariance matrix of the regressor variables \(X_2, X_3, \cdots, X_k\) and

  • \(\boldsymbol{\sigma}_{\mathbf{y}, \mathbf{X}}\) is the \(p \times 1\) column vector of covariances between the regressand variable \(y\) and the regressor variables \(X_2, X_3, \cdots, X_k\).
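The computation can be sketched directly from the formula above. This is an illustrative example, not the package's implementation: the covariance inputs are built from the `mtcars` data set, with `wt` and `hp` as regressors and `mpg` as the regressand.

```r
# Regressors (intercept column excluded from the covariance inputs)
Xr <- as.matrix(mtcars[, c("wt", "hp")])
y <- mtcars$mpg

SigmaX <- cov(Xr)       # p x p covariance matrix of the regressors
sigmayX <- cov(Xr, y)   # p x 1 vector of covariances with the regressand

# beta_{2, ..., k} = SigmaX^{-1} sigmayX
slopes <- solve(SigmaX, sigmayX)
```

Because both covariance estimates share the same denominator, these slopes coincide with the ordinary least squares slopes from `coef(lm(mpg ~ wt + hp, data = mtcars))[-1]`.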

See also

Other parameter functions: .intercept(), .slopesprime(), intercept(), sigma2epsilon(), slopesprime(), slopes()

Author

Ivan Jacob Agaloos Pesigan