
scikit-learn
Victoria1997
scikit-learn: Ridge Regression (2019-01-11)
Basic usage:

    from sklearn import linear_model
    reg = linear_model.Ridge(alpha=0.5)
    reg.fit([[0, 0], [1, 1], [2, 2]], [0, 1, 2])
    reg.coef_       # coefficients
    reg.intercept_

Plot Ridge coefficients as a f…
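The truncated line matches the title of the scikit-learn docs example "Plot Ridge coefficients as a function of the regularization"; a minimal sketch along those lines (the Hilbert-matrix data and alpha range are assumptions of mine, not recovered from the post):

    import numpy as np
    import matplotlib.pyplot as plt
    from sklearn import linear_model

    # an ill-conditioned design matrix makes the coefficients sensitive to alpha
    X = 1.0 / (np.arange(1, 11) + np.arange(0, 10)[:, np.newaxis])
    y = np.ones(10)

    alphas = np.logspace(-10, -2, 200)
    coefs = [linear_model.Ridge(alpha=a, fit_intercept=False).fit(X, y).coef_
             for a in alphas]

    ax = plt.gca()
    ax.plot(alphas, coefs)      # one curve per coefficient
    ax.set_xscale("log")
    ax.set_xlabel("alpha")
    ax.set_ylabel("coefficients")
    plt.show()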
scikit-learn: Lasso Regression (2019-01-18)
Basic syntax for lasso regression:

    from sklearn import linear_model
    reg = linear_model.Lasso(alpha=0.1)
    reg.fit([[0, 0], [1, 1], [2, 2]], [0, 1, 2])
    reg.coef_       # coefficients
    reg.intercept_
    reg.predict([[1, 1]])

This is all very similar to the earlier linear-regression examples. Lass…
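To see what distinguishes Lasso from the Ridge fit above, here is a small sketch comparing the two coefficient vectors (the synthetic data is my own, not from the post):

    import numpy as np
    from sklearn.linear_model import Lasso, Ridge

    rng = np.random.RandomState(0)
    X = rng.randn(50, 10)
    y = 3.0 * X[:, 0] + 0.1 * rng.randn(50)   # only feature 0 is informative

    print(Lasso(alpha=0.1).fit(X, y).coef_)   # mostly exact zeros
    print(Ridge(alpha=0.1).fit(X, y).coef_)   # small but nonzero everywhere

The L1 penalty drives uninformative coefficients to exactly zero, which is why Lasso doubles as a feature-selection tool.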
scikit-learn: Elastic Net (2019-01-25)
    from itertools import cycle
    from sklearn.linear_model import lasso_path, enet_path
    from sklearn import datasets
    import numpy as np
    import matplotlib.pyplot as plt

    diabetes = datasets.load_diabetes()
    …
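The excerpt breaks off after loading the diabetes data; a minimal sketch of how the two paths could be computed and overlaid (the eps and l1_ratio values are my assumptions):

    X, y = diabetes.data, diabetes.target
    X = X / X.std(axis=0)                     # standardize so the penalty acts evenly

    alphas_lasso, coefs_lasso, _ = lasso_path(X, y, eps=5e-3)
    alphas_enet, coefs_enet, _ = enet_path(X, y, eps=5e-3, l1_ratio=0.8)

    colors = cycle(["b", "r", "g", "c", "k"])
    for coef_l, coef_e, c in zip(coefs_lasso, coefs_enet, colors):
        plt.semilogx(alphas_lasso, coef_l, color=c)
        plt.semilogx(alphas_enet, coef_e, color=c, linestyle="--")
    plt.xlabel("alpha")
    plt.ylabel("coefficients")
    plt.title("Lasso (solid) and Elastic-Net (dashed) paths")
    plt.show()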
scikit-learn: OLS (2019-01-10)
I have only recently started learning Python; I was already fairly familiar with machine learning in R, so the mathematical derivations behind the algorithms are manageable, and my focus here is on learning the Python functions. I hope that writing things up will drive my learning! Code:

    from sklearn import linear_model
    reg = linear_model.LinearRegression()
    reg.fit([[0, 0], [1, 1]…
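The excerpt is cut off mid-call; the usual pattern, mirroring the Ridge and Lasso entries above (the toy data after the truncation point is an assumption):

    from sklearn import linear_model

    reg = linear_model.LinearRegression()
    reg.fit([[0, 0], [1, 1], [2, 2]], [0, 1, 2])
    reg.coef_        # array([0.5, 0.5]): the two collinear features share the weight
    reg.intercept_   # ~0.0
    reg.predict([[3, 3]])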
scikit-learn: Logistic Regression (2019-01-29)
L1 Penalty and Sparsity in Logistic Regression

    # Compare the sparsity of the solutions (percentage of zero coefficients)
    # under L1 and L2 penalties for different values of C. The larger C is,
    # the more freedom the model has; conversely, the smaller C is, the more
    # the model is constrained. Under an L1 penalty this leads to sparse
    # solutions. We classify the 8x8 digit images into two classes, 0-4 and
    # 5-9; the visualization shows the model coefficients as C varies.
    import numpy as np
    impor…
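A minimal sketch of the comparison the comments describe, printing the zero-coefficient percentages instead of plotting (the solver choice and C grid are my assumptions):

    import numpy as np
    from sklearn.datasets import load_digits
    from sklearn.linear_model import LogisticRegression
    from sklearn.preprocessing import StandardScaler

    X, y = load_digits(return_X_y=True)
    X = StandardScaler().fit_transform(X)
    y = (y > 4).astype(int)                    # two classes: digits 0-4 vs 5-9

    for C in (1.0, 0.1, 0.01):
        l1 = LogisticRegression(C=C, penalty="l1", solver="liblinear").fit(X, y)
        l2 = LogisticRegression(C=C, penalty="l2", solver="liblinear").fit(X, y)
        print(f"C={C}: L1 zeros {np.mean(l1.coef_ == 0):.0%}, "
              f"L2 zeros {np.mean(l2.coef_ == 0):.0%}")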
scikit-learn: LDA and QDA (2019-02-13)
1.2. Linear and Quadratic Discriminant Analysis
Linear and Quadratic Discriminant Analysis with covariance ellipsoid

    from scipy import linalg
    import numpy as np
    import matplotlib.pyplot as plt
    import…
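The covariance-ellipsoid example itself is truncated; a minimal sketch of fitting both classifiers on data where the class covariances differ (the synthetic data is my assumption):

    import numpy as np
    from sklearn.discriminant_analysis import (
        LinearDiscriminantAnalysis,
        QuadraticDiscriminantAnalysis,
    )

    rng = np.random.RandomState(0)
    # class 1 has a wider covariance, so QDA's per-class covariances pay off
    X = np.vstack([rng.randn(100, 2), 2.0 * rng.randn(100, 2) + [3, 3]])
    y = np.array([0] * 100 + [1] * 100)

    lda = LinearDiscriminantAnalysis(store_covariance=True).fit(X, y)
    qda = QuadraticDiscriminantAnalysis(store_covariance=True).fit(X, y)
    print("LDA accuracy:", lda.score(X, y))
    print("QDA accuracy:", qda.score(X, y))

LDA assumes a single shared covariance matrix, giving a linear decision boundary; QDA fits one covariance per class, giving a quadratic boundary.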