Logistic Regression
1. Overview:
Binary classification: a classification problem with $y \in \{0, 1\}$, where 0 denotes the negative class and 1 the positive class.
Logistic regression is used to solve classification problems.
2. Hypothesis function:
The sigmoid function:
We want $0 \le h_\theta(x) \le 1$.
$h_\theta(x) = g(\theta^T x)$
$g(z) = \frac{1}{1 + e^{-z}}$
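A minimal NumPy sketch of the hypothesis; the parameter values below are made up for illustration:

```python
import numpy as np

def sigmoid(z):
    # g(z) = 1 / (1 + e^(-z)); the output always lies between 0 and 1
    return 1.0 / (1.0 + np.exp(-z))

def hypothesis(theta, x):
    # h_theta(x) = g(theta^T x), read as the estimated probability that y = 1
    return sigmoid(np.dot(theta, x))

# Hypothetical parameters and one example with the bias feature x0 = 1
theta = np.array([-3.0, 1.0])
x = np.array([1.0, 4.0])
print(hypothesis(theta, x))  # approx 0.73 = P(y = 1 | x; theta)
```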
3. Decision boundary:
The decision boundary is where $h_\theta(x)$ is exactly 0.5, i.e. where $\theta^T x = 0$; on one side of it we predict $y = 1$ ($h_\theta(x) \ge 0.5$), on the other $y = 0$, as in the sketch below.
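Since $g(z) = 0.5$ exactly when $z = 0$, thresholding $h_\theta(x)$ at 0.5 is the same as checking the sign of $\theta^T x$. A small sketch with hypothetical parameters:

```python
import numpy as np

def predict(theta, x):
    # h_theta(x) >= 0.5 exactly when theta^T x >= 0, so threshold the linear term
    return 1 if np.dot(theta, x) >= 0 else 0

# With theta = [-3, 1, 1] the boundary theta^T x = 0 is the line x1 + x2 = 3
theta = np.array([-3.0, 1.0, 1.0])
print(predict(theta, np.array([1.0, 2.0, 2.0])))  # x1 + x2 = 4 >= 3 -> predicts 1
print(predict(theta, np.array([1.0, 1.0, 1.0])))  # x1 + x2 = 2 <  3 -> predicts 0
```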
4. Cost function:
Fitting the logistic regression model:
Training set: $\{(x^{(1)}, y^{(1)}), (x^{(2)}, y^{(2)}), \ldots, (x^{(m)}, y^{(m)})\}$
$m$ training examples
$x = (x_0, x_1, \ldots, x_n)^T$, with $x_0 = 1$ and $y \in \{0, 1\}$
$h_\theta(x) = \frac{1}{1 + e^{-\theta^T x}}$
How do we choose the parameters $\theta$?
The logistic regression cost function:
$Cost(h_\theta(x), y) = -\log(h_\theta(x))$ if $y = 1$
$Cost(h_\theta(x), y) = -\log(1 - h_\theta(x))$ if $y = 0$
$J(\theta) = \frac{1}{m} \sum_{i=1}^{m} Cost(h_\theta(x^{(i)}), y^{(i)})$
Simplified into a single expression:
$Cost(h_\theta(x), y) = -y\log(h_\theta(x)) - (1 - y)\log(1 - h_\theta(x))$
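The single expression recovers the two cases above: when $y = 1$ the second term vanishes, and when $y = 0$ the first term does. A minimal NumPy sketch of $J(\theta)$ using this simplified cost; the data and parameter values are made up for illustration:

```python
import numpy as np

def cost(theta, X, y):
    # J(theta) = (1/m) * sum_i [ -y_i*log(h_i) - (1 - y_i)*log(1 - h_i) ]
    # X is m x (n+1) with a leading column of ones (x0 = 1); y holds 0/1 labels
    m = y.shape[0]
    h = 1.0 / (1.0 + np.exp(-X.dot(theta)))  # h_theta(x^(i)) for every example
    return (-y.dot(np.log(h)) - (1 - y).dot(np.log(1 - h))) / m

# Toy training set: two features plus the x0 = 1 column, four examples
X = np.array([[1.0, 2.0, 1.0],
              [1.0, 0.5, 3.0],
              [1.0, 3.0, 2.5],
              [1.0, 1.0, 0.5]])
y = np.array([1.0, 0.0, 1.0, 0.0])
theta = np.array([-3.0, 1.0, 0.5])
print(cost(theta, X, y))
```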