Logistic regression

- This is the perceptron with a sigmoid activation (see the sketch after this list)
- It actually computes the probability that the input belongs to class 1
- Decision boundaries may be obtained by comparing the probability to a threshold
- These boundaries will be lines (hyperplanes in higher dimensions)
- The sigmoid perceptron is a linear classifier
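
A minimal NumPy sketch of these points, with illustrative weights and helper names (`predict_proba`, `predict_class`) that are not from the original notes: the probability of class 1 is the sigmoid of the affine score $w_0 + w^T x$, and thresholding that probability at 0.5 is equivalent to testing which side of the hyperplane $w_0 + w^T x = 0$ the input falls on.

```python
import numpy as np

def sigmoid(z):
    # Logistic function: maps any real-valued activation to a probability in (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def predict_proba(X, w, w0):
    # P(y = 1 | x) for the sigmoid perceptron: sigmoid of the affine score w0 + w^T x
    return sigmoid(X @ w + w0)

def predict_class(X, w, w0, threshold=0.5):
    # Thresholding the probability gives a linear decision boundary:
    # sigmoid(w0 + w^T x) >= 0.5  <=>  w0 + w^T x >= 0, a hyperplane in input space
    return (predict_proba(X, w, w0) >= threshold).astype(int)

# Tiny illustration with made-up weights and inputs
w = np.array([2.0, -1.0])
w0 = 0.5
X = np.array([[1.0, 0.0],
              [0.0, 3.0]])
print(predict_proba(X, w, w0))   # probabilities of class 1
print(predict_class(X, w, w0))   # 0/1 decisions
```
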
Estimating the model
- Given: Training data $(X_1, y_1), (X_2, y_2), \ldots, (X_N, y_N)$
- $X$ are vectors, $y$ are binary class labels; coding them as $\pm 1$ lets a single expression below cover both classes
- Total probability of data
$$
P\left((X_1, y_1), (X_2, y_2), \ldots, (X_N, y_N)\right) = \prod_i P(X_i, y_i)
= \prod_i P(y_i \mid X_i)\, P(X_i)
= \prod_i \frac{1}{1 + e^{-y_i \left(w_0 + w^T X_i\right)}}\, P(X_i)
$$
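
As a hedged sketch of how the conditional part of this product is evaluated in practice (the function name and example data are assumptions, not from the original): the $P(X_i)$ factors do not depend on $w$, so they can be dropped when maximizing over the weights, and the remaining product is computed as a sum of log-probabilities for numerical stability.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def log_likelihood(w, w0, X, y):
    # y is assumed coded as +/-1 so that P(y_i | X_i) = sigmoid(y_i * (w0 + w^T X_i)).
    # Summing logs is the stable form of the product above; the P(X_i) terms are
    # constant in w and are omitted for estimation.
    scores = y * (X @ w + w0)
    return np.sum(np.log(sigmoid(scores)))

# Example with made-up data; 0/1 labels are mapped to -1/+1 first
X = np.array([[1.0, 0.0], [0.0, 3.0], [2.0, 1.0]])
y01 = np.array([1, 0, 1])
y = 2 * y01 - 1
print(log_likelihood(np.array([2.0, -1.0]), 0.5, X, y))
```

Maximizing this quantity over $w_0$ and $w$ (for example by gradient ascent) gives the maximum-likelihood estimate of the model.
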
