logistic regression:
NumPy's vectorized operations make this kind of slicing very fast.
For example, when constructing a dataset from several conditions, the union of the conditions can be taken by adding the boolean masks and, dually, the intersection by multiplying them (the complement can be taken with logical negation, e.g. ~mask).
We use this in the example below (it replaces the earlier code).
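A quick sketch of the mask arithmetic on a toy array (the names a, small, even are just for illustration):
import numpy as np
a = np.array([0, 1, 2, 3, 4])
small = a < 3           # boolean mask: [ True  True  True False False]
even = a % 2 == 0       # boolean mask: [ True False  True False  True]
print(a[small + even])  # union (OR):          [0 1 2 4]
print(a[small * even])  # intersection (AND):  [0 2]
print(a[~small])        # complement (NOT):    [3 4]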
First we fit with sklearn, then solve the same problem ourselves with a numerical optimizer; both give the same predictions:
from __future__ import division
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
import copy
import numpy as np
from scipy.optimize import minimize

# Keep only the two linearly separable classes (0 and 1); adding the two
# boolean masks takes their union, as described above.
iris = load_iris()
data, target = iris.data, iris.target
X = data[(target == 0) + (target == 1)]
y = target[(target == 0) + (target == 1)]

# Fit sklearn's logistic regression and check that it reproduces every label.
logisticExt = LogisticRegression()
logisticExt.fit(X, y)
y_hat = logisticExt.predict(X)
print("all predictions match y:")
print(np.all(y_hat == y))

# Hand-rolled version: prepend a bias column and recode the labels to {-1, +1}.
X_bar = np.append(np.ones([X.shape[0], 1]), X, axis=1)
y_require = copy.deepcopy(y)
y_require[y_require == 0] = -1

C = 0.1  # regularization weight (sklearn's default is C = 1.0)

def func(w):
    # L2-regularized logistic loss:
    # ||w||^2 / 2 + C * sum_i log(1 + exp(-y_i * <w, x_i>))
    penalty = np.sum(np.log(np.exp(np.dot(X_bar, w) * y_require * -1) + 1))
    return np.dot(w, w) / 2 + C * penalty

w0 = np.ones([X_bar.shape[1]])
res = minimize(func, w0, method='nelder-mead', options={'disp': True})
w_hat = res.x

# Sigmoid of the linear score gives P(y = 1); threshold at 0.5.
predict_1 = 1 / (1 + np.exp(np.dot(X_bar, w_hat) * -1))
print("all predictions match y:")
print(np.all((predict_1 > 0.5) == y))
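The two fits need not produce identical parameters (sklearn uses C = 1.0 by default and does not penalize the intercept), but as a rough sanity check one could compare the two decision rules, roughly along these lines (w_sklearn and agree are just illustrative names):
# Stack sklearn's intercept and coefficients into one vector matching X_bar's layout.
w_sklearn = np.append(logisticExt.intercept_, logisticExt.coef_.ravel())
print("sklearn  [bias, w]: " + str(w_sklearn))
print("minimize [bias, w]: " + str(w_hat))
# Both linear scores should put every sample on the same side of the boundary.
agree = np.all((np.dot(X_bar, w_sklearn) > 0) == (np.dot(X_bar, w_hat) > 0))
print("same decision on every sample: " + str(agree))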