Cross Entropy or Log Loss
In binary classification, the output is a single probability value y. For a target label z ∈ {0, 1}, the loss is

L(x, z) = −[z ln y + (1 − z) ln(1 − y)]
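The binary case can be sketched directly from the formula above. This is a minimal illustration (the function name is my own, not from the text), with no numerical safeguards for y exactly 0 or 1:

```python
import math

def binary_cross_entropy(y, z):
    """Binary cross-entropy for one prediction.

    y: predicted probability of the positive class, 0 < y < 1
    z: true label, 0 or 1
    """
    # L(x, z) = -[z ln y + (1 - z) ln(1 - y)]
    return -(z * math.log(y) + (1 - z) * math.log(1 - y))
```

Note that a confident correct prediction (e.g. y = 0.9, z = 1) yields a small loss, while a confident wrong one (y = 0.9, z = 0) is penalized heavily, which is exactly the behavior the log term provides.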
For a multi-class problem with K classes, the output is a probability vector y, and the loss against the target vector z (typically one-hot) is

L(x, z) = −∑_{k=1}^{K} z_k ln y_k
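The multi-class sum can be sketched the same way. This is an illustrative implementation (names are my own); it skips terms where z_k = 0, which for a one-hot target reduces the sum to −ln of the probability assigned to the true class:

```python
import math

def cross_entropy(y, z):
    """L(x, z) = -sum_k z_k * ln(y_k).

    y: predicted probability vector (entries positive, summing to 1)
    z: target vector, typically one-hot
    """
    # Skip z_k == 0 terms so a one-hot target never evaluates ln(0)
    return -sum(zk * math.log(yk) for yk, zk in zip(y, z) if zk > 0)
```

For example, with target class 1 (one-hot z = [0, 1, 0]) and prediction y = [0.2, 0.7, 0.1], the loss is just −ln(0.7).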