李宏毅 (Hung-yi Lee) 2020 Machine Learning, Part 2
- 10. Classification
- 11. Logistic Regression
- 12. Deep Learning
- 13. Backpropagation
- 14. Tips for Deep Learning
- 15. Why Deep?
- 16. Introduction to PyTorch
- 16.1. tensor.view
- 16.2. Broadcasting
- 16.3. Computation graphs
- 16.4. CUDA Semantics
- 16.5. PyTorch as an automatic differentiation framework
- 16.6. Using gradients
- 16.7. Linear Regression
- 16.8. torch.nn.Module
- 16.9. Activation functions
- 16.10. Sequential
- 16.11. Loss functions
- 16.12. optim
- 16.13. Linear regression in PyTorch
- 16.14. Neural networks
- 16.15. CrossEntropyLoss
- 17. Convolutional Neural Network
- 18. Graph Neural Network
10. Classification
Classification has many applications, for example:
- Credit scoring: Input: income, savings, profession, age, …; Output: accept or refuse
- Medical diagnosis: Input: current symptoms, age, gender, …; Output: which kind of disease
- Handwritten character recognition
- Face recognition: Input: image of a face; Output: person
10.1 Generative Model
Key ingredients of the generative approach:
- Gaussian distribution (as the class-conditional density)
- Bayes' rule (to get the posterior probability of each class)
- Maximum likelihood estimation (to fit the Gaussian parameters)
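A minimal sketch of how these three pieces combine, in NumPy (the 2-feature toy data and class counts are made up for illustration): fit one Gaussian per class by maximum likelihood, share a covariance between the classes, and classify with the Bayes posterior $P(C_1 \mid x)$.

```python
import numpy as np

# Toy 2-feature training data for two classes (hypothetical values).
x1 = np.array([[1.0, 2.0], [1.5, 1.8], [1.2, 2.2]])  # class C1
x2 = np.array([[3.0, 3.5], [3.2, 3.0], [2.8, 3.3]])  # class C2

# Maximum likelihood estimates: per-class means and a shared covariance
# (a count-weighted average of the per-class covariances).
mu1, mu2 = x1.mean(axis=0), x2.mean(axis=0)
n1, n2 = len(x1), len(x2)
cov1 = (x1 - mu1).T @ (x1 - mu1) / n1
cov2 = (x2 - mu2).T @ (x2 - mu2) / n2
cov = (n1 * cov1 + n2 * cov2) / (n1 + n2)

def gaussian(x, mu, cov):
    """Multivariate Gaussian density N(x; mu, cov)."""
    d = len(mu)
    diff = x - mu
    norm = np.sqrt((2 * np.pi) ** d * np.linalg.det(cov))
    return np.exp(-0.5 * diff @ np.linalg.inv(cov) @ diff) / norm

def posterior_c1(x):
    """Bayes' rule: P(C1|x) = P(x|C1)P(C1) / (P(x|C1)P(C1) + P(x|C2)P(C2))."""
    p1, p2 = n1 / (n1 + n2), n2 / (n1 + n2)   # class priors from counts
    l1, l2 = gaussian(x, mu1, cov), gaussian(x, mu2, cov)
    return l1 * p1 / (l1 * p1 + l2 * p2)

print(posterior_c1(np.array([1.4, 2.0])))  # close to 1 -> classify as C1
```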
11. Logistic Regression
Step 1: Function Set
$f_{w,b}(x) = P_{w,b}(C_1 \mid x) = \sigma(z)$
$\sigma(z) = \dfrac{1}{1 + e^{-z}}$
$z = w \cdot x + b = \sum_i w_i x_i + b$
The sigmoid form comes from the generative model: when the class-conditional densities are Gaussians with a shared covariance, the posterior $P(C_1 \mid x)$ reduces to $\sigma(w \cdot x + b)$.
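A minimal sketch of this function set in PyTorch (the weights and bias below are arbitrary placeholders, not trained values):

```python
import torch

w = torch.tensor([0.5, -1.0])   # placeholder weights
b = torch.tensor(0.3)           # placeholder bias

def f(x):
    """f_{w,b}(x) = sigma(w . x + b), an estimate of P(C1|x)."""
    z = torch.dot(w, x) + b
    return torch.sigmoid(z)     # 1 / (1 + e^{-z}), always in (0, 1)

print(f(torch.tensor([2.0, 1.0])))  # a probability-like value in (0, 1)
```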
Step 2: Goodness of a Function
A loss function measures how good a candidate function is; for logistic regression, maximum likelihood leads to the cross-entropy loss (see the comparison table at the end of this section).
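A sketch of the cross-entropy computation (the predictions and labels here are hypothetical), both by hand and with PyTorch's built-in:

```python
import torch

# Hypothetical model outputs f(x^n) and labels y_hat^n (1 = class 1, 0 = class 2).
f_x = torch.tensor([0.9, 0.2, 0.7])
y_hat = torch.tensor([1.0, 0.0, 1.0])

# Cross-entropy: L(f) = -sum_n [y ln f(x) + (1 - y) ln(1 - f(x))]
loss = -(y_hat * torch.log(f_x) + (1 - y_hat) * torch.log(1 - f_x)).sum()
print(loss)  # manual computation

# The same quantity via PyTorch's built-in binary cross-entropy.
print(torch.nn.functional.binary_cross_entropy(f_x, y_hat, reduction='sum'))
```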
Step 3: Find the best function
Use gradient descent to find the best function.
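Putting the three steps together, a minimal gradient-descent training loop in PyTorch (the toy data and learning rate are assumptions for illustration):

```python
import torch

# Synthetic linearly separable data (assumed for illustration).
x = torch.tensor([[1.0, 2.0], [2.0, 1.0], [4.0, 5.0], [5.0, 4.0]])
y = torch.tensor([1.0, 1.0, 0.0, 0.0])  # y_hat^n: 1 for class 1, 0 for class 2

w = torch.zeros(2, requires_grad=True)
b = torch.zeros(1, requires_grad=True)
lr = 0.1

for step in range(1000):
    f_x = torch.sigmoid(x @ w + b)                            # Step 1: function set
    loss = torch.nn.functional.binary_cross_entropy(f_x, y)   # Step 2: cross-entropy
    loss.backward()                                           # Step 3: gradient descent
    with torch.no_grad():
        w -= lr * w.grad
        b -= lr * b.grad
        w.grad.zero_()
        b.grad.zero_()

print(w, b)  # learned parameters separating the two classes
```

Note that the resulting gradient update has the same form as linear regression's, differing only in how $f_{w,b}(x)$ is defined; the table below contrasts the two models step by step.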
| | Logistic Regression | Linear Regression |
|---|---|---|
| Step 1 | $f_{w,b}(x) = \sigma\left(\sum_i w_i x_i + b\right)$, output between 0 and 1 | $f_{w,b}(x) = \sum_i w_i x_i + b$, output can be any value |
| Step 2 | Training data: $(x^n, \hat{y}^n)$, where $\hat{y}^n$ is 1 for class 1 and 0 for class 2; $L(f) = \sum_n C(f(x^n), \hat{y}^n)$ (cross entropy) | Training data: $(x^n, \hat{y}^n)$, where $\hat{y}^n$ is a real number; $L(f) = \frac{1}{2}\sum_n \left(f(x^n) - \hat{y}^n\right)^2$ (square error) |