ML / PLiR / Glmnet: Solving an ElasticNet Regression Problem with the Glmnet Algorithm (Real-Valued Score Prediction)
[Abstract] Using the Glmnet coordinate-descent algorithm to solve an ElasticNet regression problem (real-valued score prediction).
Contents
Output
1. The Glmnet algorithm
Implementation code
Output
Per-step trace printed by the code below as `print(iStep, iterStep)` — the lambda step index and the number of inner coordinate-descent passes it took to converge:

```
0 2
1 2
2 2
3 3
4 3
5 3
6 3
7 3
8 3
9 2
10 2
11 2
12 2
13 3
14 3
15 2
16 2
17 2
18 2
19 2
20 2
21 2
22 2
23 2
24 2
25 2
26 2
27 2
28 2
29 3
30 3
31 3
32 2
33 3
34 2
35 2
36 2
37 2
38 2
39 2
40 2
41 2
42 1
43 1
44 2
45 2
46 2
47 1
48 2
49 1
50 1
51 1
52 1
53 1
54 1
55 1
……
95 1
96 1
97 1
98 1
99 1
```

Attributes listed in the order their coefficients become non-zero (the `nzList` ordering):

```
['"alcohol"', '"volatile acidity"', '"sulphates"', '"total sulfur dioxide"', '"chlorides"', '"fixed acidity"', '"pH"', '"free sulfur dioxide"', '"residual sugar"', '"citric acid"', '"density"']
```
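The iteration counts above come from a coordinate-descent update built on the soft-thresholding operator S(z, γ) = sign(z) · max(|z| − γ, 0), followed by the ridge shrinkage 1/(1 + λ(1 − α)). A minimal sketch of just that update (the name `S` matches the implementation code below; the input values are illustrative):

```python
def S(z, gamma):
    """Soft-thresholding: shrink z toward zero by gamma, clipping at zero."""
    if gamma >= abs(z):
        return 0.0
    return z - gamma if z > 0.0 else z + gamma

def elastic_net_update(uncBeta, lam, alpha):
    """One ElasticNet coordinate update: soft-threshold by lam*alpha,
    then shrink by the ridge term lam*(1 - alpha)."""
    return S(uncBeta, lam * alpha) / (1.0 + lam * (1 - alpha))

print(S(0.75, 0.25))                       # -> 0.5
print(S(-0.75, 0.25))                      # -> -0.5
print(S(0.1, 0.25))                        # -> 0.0 (small coefficients are zeroed)
print(elastic_net_update(0.75, 0.25, 1.0)) # -> 0.5 (alpha=1 is pure lasso)
```

The zeroing behavior of `S` is what makes coefficients enter the model one at a time as lambda shrinks, producing the ordering above.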
1. The Glmnet algorithm
Implementation code
The fragment assumes `xNormalized` (nrows × ncols normalized attribute matrix), `labelNormalized`, `nrows`, `ncols`, the ElasticNet mixing parameter `alpha`, and `maxXY` (the largest attribute-label correlation) have already been prepared; `S` is the soft-thresholding operator, defined here so the fragment is self-contained:

```python
#soft-thresholding operator used by the coordinate-descent update
def S(z, gamma):
    if gamma >= abs(z):
        return 0.0
    return z - gamma if z > 0.0 else z + gamma

#calculate starting value for lambda
lam = maxXY / alpha

#this value of lambda corresponds to beta = list of 0's
#initialize a vector of coefficients beta
beta = [0.0] * ncols

#initialize matrix of betas at each step
betaMat = []
betaMat.append(list(beta))

#begin iteration
nSteps = 100
lamMult = 0.93  #100 steps gives reduction by factor of 1000 in
                #lambda (recommended by authors)
nzList = []

for iStep in range(nSteps):
    #make lambda smaller so that some coefficient becomes non-zero
    lam = lam * lamMult

    deltaBeta = 100.0
    eps = 0.01
    iterStep = 0
    betaInner = list(beta)
    while deltaBeta > eps:
        iterStep += 1
        if iterStep > 100:
            break

        #cycle through attributes and update one-at-a-time;
        #record starting value for comparison
        betaStart = list(betaInner)
        for iCol in range(ncols):
            xyj = 0.0
            for i in range(nrows):
                #calculate residual with current value of beta
                labelHat = sum([xNormalized[i][k] * betaInner[k]
                                for k in range(ncols)])
                residual = labelNormalized[i] - labelHat
                xyj += xNormalized[i][iCol] * residual

            uncBeta = xyj / nrows + betaInner[iCol]
            betaInner[iCol] = S(uncBeta, lam * alpha) / (1 +
                                                         lam * (1 - alpha))

        sumDiff = sum([abs(betaInner[n] - betaStart[n])
                       for n in range(ncols)])
        sumBeta = sum([abs(betaInner[n]) for n in range(ncols)])
        #guard against division by zero while all betas are still zero
        deltaBeta = sumDiff / sumBeta if sumBeta > 0.0 else 0.0
    print(iStep, iterStep)
    beta = betaInner

    #add newly determined beta to list
    betaMat.append(beta)

    #keep track of the order in which the betas become non-zero
    nzBeta = [index for index in range(ncols) if beta[index] != 0.0]
    for q in nzBeta:
        if q not in nzList:
            nzList.append(q)
```
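The loop above takes its inputs as given. A sketch of how they can be prepared — the variable names match the code, but the synthetic data and the `normalize_columns` helper are illustrative, not part of the original:

```python
import random

random.seed(42)
nrows, ncols = 50, 3
alpha = 1.0  #ElasticNet mixing parameter: 1.0 = lasso, 0.0 = ridge

#synthetic raw data: the label depends on the first two attributes
x = [[random.gauss(0.0, 1.0) for _ in range(ncols)] for _ in range(nrows)]
labels = [2.0 * row[0] - 1.0 * row[1] + random.gauss(0.0, 0.1) for row in x]

def normalize_columns(rows):
    """Center each column to mean 0 and scale to unit standard deviation."""
    n = len(rows)
    ncol = len(rows[0])
    means = [sum(r[j] for r in rows) / n for j in range(ncol)]
    sds = [(sum((r[j] - means[j]) ** 2 for r in rows) / n) ** 0.5
           for j in range(ncol)]
    return [[(r[j] - means[j]) / sds[j] for j in range(ncol)] for r in rows]

xNormalized = normalize_columns(x)
meanLabel = sum(labels) / nrows
sdLabel = (sum((y - meanLabel) ** 2 for y in labels) / nrows) ** 0.5
labelNormalized = [(y - meanLabel) / sdLabel for y in labels]

#starting lambda: the largest scaled attribute-label correlation --
#the smallest lambda at which every coefficient is still zero
maxXY = max(abs(sum(xNormalized[i][j] * labelNormalized[i]
                    for i in range(nrows)) / nrows)
            for j in range(ncols))
lam = maxXY / alpha
```

With these definitions in scope, the implementation code above runs as written and `nzList` should pick up the two informative attributes first.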
Source: yunyaniu.blog.csdn.net, author 一个处女座的程序猿. Copyright belongs to the original author; contact the author before reprinting.
Original link: yunyaniu.blog.csdn.net/article/details/85058190