Softmax regression is magic. I spent a whole evening on it without getting it to work, got up at 3 a.m. to keep going, and just now finally got it running. I've now thoroughly understood it, haha. The code implements the softmax regression algorithm mentioned in Lecture 4 of Andrew Ng's course, with reference to http://ufldl.stanford.edu/wiki/index.php/Softmax_Regression
It finally converged to the result below. So satisfying.
sample 0: 0.983690,0.004888,0.011422,likelihood:-0.016445
sample 1: 0.940236,0.047957,0.011807,likelihood:-0.061625
sample 2: 0.818187,0.001651,0.180162,likelihood:-0.200665
sample 3: 0.000187,0.999813,0.000000,likelihood:-0.000187
sample 4: 0.007913,0.992087,0.000000,likelihood:-0.007945
sample 5: 0.001585,0.998415,0.000000,likelihood:-0.001587
sample 6: 0.020159,0.000001,0.979840,likelihood:-0.020366
sample 7: 0.018230,0.000000,0.981770,likelihood:-0.018398
sample 8: 0.025072,0.000000,0.974928,likelihood:-0.025392
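For reference, this is the model the numbers come from (the UFLDL formulation; since theta is declared for only two of the three classes below, class 3 acts as the baseline with theta_3 = 0):

p(y=k | x) = exp(theta_k . x) / (1 + exp(theta_1 . x) + exp(theta_2 . x))   for k = 1, 2
p(y=3 | x) = 1 / (1 + exp(theta_1 . x) + exp(theta_2 . x))

Gradient ascent on the log-likelihood then updates each parameter vector, per sample, by

theta_k += alpha * (1{y = k} - p(y=k | x)) * x

The three probabilities printed for each sample above sum to 1, and the likelihood column is log p of the sample's true class.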
#include "stdio.h"
#include "math.h"
double matrix[9][4]={ // 9 samples, each row is [x0, x1, x2, x3]
{1,47,76,24}, // x0 = 1 is the intercept term
{1,46,77,23},
{1,48,74,22},
{1,34,76,21},
{1,35,75,24},
{1,34,77,25},
{1,55,76,21},
{1,56,74,22},
{1,55,72,22},
};
double result[]={1,1,1,2,2,2,3,3,3}; // class label (1, 2, or 3) for each sample
double theta[2][4]={ // initial parameters for classes 1 and 2; class 3 is the implicit baseline (theta = 0)
{0.3,0.3,0.01,0.01},
{0.5,0.5,0.01,0.01}}; // theta[k][0] is the intercept weight, matching x0
double function_g(double x) // logistic (sigmoid) function: e^x / (1 + e^x)
{
    double ex = exp(x); // exp(x) is more accurate than pow(2.718281828, x)
    return ex/(1+ex);
}
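The listing breaks off right where the training loop would begin. Below is a minimal sketch of how that do-while loop can be completed, following the update rule quoted above. The learning rate, the stopping threshold, the iteration cap, and the probabilities() helper are illustrative assumptions rather than recovered code; the sketch also calls exp() directly instead of going through function_g. Build with something like gcc softmax.c -lm (the file name is assumed).

// p[k] = probability that sample i belongs to class k+1.
// Class 3 is the baseline: its unnormalized score is exp(0) = 1.
void probabilities(int i, double p[3])
{
    double e[2], denom = 1.0;
    for (int k = 0; k < 2; ++k)
    {
        double dot = 0.0;
        for (int t = 0; t < 4; ++t)
            dot += theta[k][t] * matrix[i][t];
        e[k] = exp(dot);
        denom += e[k];
    }
    p[0] = e[0] / denom;
    p[1] = e[1] / denom;
    p[2] = 1.0 / denom;
}

int main(void)
{
    const double alpha = 0.001;   // learning rate: an assumed value
    double likelihood = 0.0, likelihood_prev;
    int iter = 0;
    do
    {
        likelihood_prev = likelihood;
        likelihood = 0.0;
        for (int i = 0; i < 9; ++i)            // stochastic gradient ascent: one update per sample
        {
            double p[3];
            probabilities(i, p);
            likelihood += log(p[(int)result[i] - 1]);   // log-probability of the true class
            for (int k = 0; k < 2; ++k)
            {
                double indicator = ((int)result[i] == k + 1) ? 1.0 : 0.0;
                for (int t = 0; t < 4; ++t)
                    theta[k][t] += alpha * (indicator - p[k]) * matrix[i][t];
            }
        }
    } while (fabs(likelihood - likelihood_prev) > 1e-6 && ++iter < 100000);

    for (int i = 0; i < 9; ++i)                // report per-sample probabilities, as shown above
    {
        double p[3];
        probabilities(i, p);
        printf("sample %d: %f,%f,%f,likelihood:%f\n",
               i, p[0], p[1], p[2], log(p[(int)result[i] - 1]));
    }
    return 0;
}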