Estimates coefficients of a linear regression model.

betahat(X, y, qr = TRUE)

Arguments

X

n by k numeric matrix. The data matrix \(\mathbf{X}\) (also known as the design matrix, model matrix, or regressor matrix) is an \(n \times k\) matrix of \(n\) observations of \(k\) regressors, whose first column is a regressor equal to 1 for every observation (the constant term). See the sketch after this argument list for one way such a matrix could be assembled.

y

Numeric vector of length n or n by 1 matrix. The vector \(\mathbf{y}\) is an \(n \times 1\) vector of observations on the regressand variable.

qr

Logical. If TRUE, use QR decomposition when the normal equations fail. If FALSE, use singular value decomposition when the normal equations fail.
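
A minimal sketch of how the inputs could be assembled, assuming hypothetical numeric vectors age and wage of equal length (these vectors are not part of this package):

# the constant regressor of ones occupies the first column of X
X <- cbind(constant = 1, age = age)
# the regressand as an n by 1 matrix
y <- matrix(wage, ncol = 1)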

Value

Returns \(\boldsymbol{\hat{\beta}}\), that is, a \(k \times 1\) vector of estimates of \(k\) unknown regression coefficients estimated using ordinary least squares.

Details

Calculates the coefficients by solving the normal equations, that is, \(\boldsymbol{\hat{\beta}} = \left(\mathbf{X}^{\prime}\mathbf{X}\right)^{-1}\mathbf{X}^{\prime}\mathbf{y}\). When that fails (for example, when \(\mathbf{X}^{\prime}\mathbf{X}\) is computationally singular), QR decomposition is used when qr = TRUE, or singular value decomposition when qr = FALSE.
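
A minimal sketch of the strategy described above, not the package's exact implementation; betahat_sketch() is a hypothetical helper, not the exported betahat():

betahat_sketch <- function(X, y, qr = TRUE) {
  tryCatch(
    # normal equations: solve (X'X) b = X'y
    solve(crossprod(X), crossprod(X, y)),
    error = function(e) {
      if (qr) {
        # fall back to the QR decomposition of X
        qr.solve(X, y)
      } else {
        # fall back to the singular value decomposition of X,
        # dropping singular values that are numerically zero
        s <- svd(X)
        keep <- s$d > max(dim(X)) * .Machine$double.eps * s$d[1]
        s$v[, keep, drop = FALSE] %*%
          ((t(s$u[, keep, drop = FALSE]) %*% y) / s$d[keep])
      }
    }
  )
}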

References

Wikipedia: Linear regression

Wikipedia: Ordinary least squares

Wikipedia: Inverting the matrix of the normal equations

Wikipedia: QR decomposition

Wikipedia: Singular value decomposition

Wikipedia: Orthogonal decomposition methods

Wikipedia: Design matrix

Author

Ivan Jacob Agaloos Pesigan

Examples

# Simple regression------------------------------------------------
X <- jeksterslabRdatarepo::wages.matrix[["X"]]
X <- X[, c(1, ncol(X))]
y <- jeksterslabRdatarepo::wages.matrix[["y"]]
betahat(X = X, y = y)
#>           betahat
#> constant 4.874251
#> age      0.197486

# Multiple regression----------------------------------------------
X <- jeksterslabRdatarepo::wages.matrix[["X"]]
# age is removed
X <- X[, -ncol(X)]
betahat(X = X, y = y)
#>               betahat
#> constant   -7.1833382
#> gender     -3.0748755
#> race       -1.5653133
#> union       1.0959758
#> education   1.3703010
#> experience  0.1666065