Ridge Regression in R

Ridge regression can be used to handle two kinds of problems: first, when there are fewer data points than variables; second, when the variables are collinear.

When the variables are collinear, the coefficients obtained by least-squares regression are unstable and have large variance. The reason is that the matrix X'X, formed from the design matrix X and its transpose, is singular or nearly singular and cannot be inverted reliably; ridge regression resolves this by introducing a parameter lambda that is added to the diagonal of X'X. In R, the function lm.ridge() in the MASS package does this conveniently. Its design matrix X is always n x p, whether or not an intercept term is included (the intercept is handled separately by centering).
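To see what the lambda term does, here is a rough sketch of my own (not taken from the lm.ridge() documentation): the ridge estimate is beta = (X'X + lambda*I)^(-1) X'y, and adding lambda to the diagonal makes the matrix invertible even when the columns of X are nearly collinear. The toy X and y below are made up for illustration; note that lm.ridge() additionally centers and scales the data internally, so its output will not match this direct calculation.

# Made-up data: the third column of X is almost a copy of the first,
# so X'X is nearly singular.
set.seed(1)
X <- matrix(rnorm(20), nrow = 10, ncol = 2)
X <- cbind(X, X[, 1] + rnorm(10, sd = 1e-6))
y <- rnorm(10)
lambda <- 0.1
p <- ncol(X)
beta_ols   <- solve(t(X) %*% X) %*% t(X) %*% y                      # numerically unreliable here
beta_ridge <- solve(t(X) %*% X + lambda * diag(p)) %*% t(X) %*% y   # well behaved for lambda > 0
cbind(ols = drop(beta_ols), ridge = drop(beta_ridge))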


Usage
lm.ridge(formula, data, subset, na.action, lambda = 0, model = FALSE,
         x = FALSE, y = FALSE, contrasts = NULL, ...)
 
[Figure 1]


> install.packages("MASS")
> library('MASS')
> longley 
> names(longley)[1] <- "y"
> lm.ridge(y ~ ., longley)
                      GNP  Unemployed  Armed.Forces   Population         Year    Employed 
2946.85636013  0.26352725  0.03648291    0.01116105  -1.73702984  -1.41879853  0.23128785 
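Since lambda defaults to 0, this fit is just ordinary least squares, so the coefficients should agree (up to rounding) with a plain lm() fit. A quick check, added here for illustration:

> coef(lm(y ~ ., longley))   # should match the lm.ridge() coefficients above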
> plot(lm.ridge(y ~ ., longley, lambda = seq(0,0.1,0.001)))
[Figure 2: ridge trace plot from plot(lm.ridge(...)), coefficients against lambda]
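The same trace can also be read numerically instead of from the plot. The lines below are my own addition (the name traces is just illustrative): coef() applied to a multi-lambda ridgelm object returns one row of coefficients, on the original scale, per value of lambda.

> traces <- lm.ridge(y ~ ., longley, lambda = seq(0, 0.1, 0.001))
> head(coef(traces))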

> select(lm.ridge(y ~ ., longley, lambda = seq(0,0.1,0.0001)))
modified HKB estimator is 0.006836982 
modified L-W estimator is 0.05267247 
smallest value of GCV  at 0.0057 
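
All three criteria suggest a small but non-zero lambda. As a follow-up sketch (using the GCV minimum of 0.0057 reported above), one can refit at that value and read off the coefficients on the original scale:

> fit <- lm.ridge(y ~ ., longley, lambda = 0.0057)
> coef(fit)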
