In the PPT, Andrew tells us that we can choose new features, which means we can use x, x^2, or x^3 as features. (For example, the hypothesis h(x) = theta0 + theta1*x1 + theta2*x2 + ... can be changed into h(x) = theta0 + theta1*(x1^2) + theta2*(x2^2).)
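As a concrete illustration, here is a minimal sketch (not from the original lecture) of building polynomial features before running ordinary linear regression; the function name and data are made up for the example.

```python
import numpy as np

def polynomial_features(x, degree=3):
    """Map a 1-D feature vector x to the columns [x, x^2, ..., x^degree]."""
    x = np.asarray(x, dtype=float)
    return np.column_stack([x ** d for d in range(1, degree + 1)])

# Toy data: one raw feature expanded into x, x^2, x^3.
x = np.array([1.0, 2.0, 3.0, 4.0])
X_poly = polynomial_features(x)  # shape (4, 3)
print(X_poly)
```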
Besides gradient descent, today I would like to introduce a new method to minimize the cost function, called the normal equation. (It only applies to linear regression problems.) In order to apply this method, we need to set up some notation.
First we define:

X = the m x (n+1) matrix whose i-th row is the feature vector (x0, x1, ..., xn) of the i-th training example, with x0 = 1;
y = the m-dimensional vector of the corresponding labels.

X is called the design matrix; then we can compute theta directly with the following formula:

theta = (X^T X)^(-1) X^T y
The normal equation method is useful when the number of samples and the number of features are both not too large, because computing theta requires inverting the (n+1) x (n+1) matrix X^T X, which becomes expensive as n grows.
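Below is a minimal sketch of the normal equation in NumPy (not code from the lecture; the toy data is made up). np.linalg.solve is used instead of forming the inverse explicitly, which is the numerically safer way to evaluate (X^T X)^(-1) X^T y.

```python
import numpy as np

def normal_equation(X, y):
    """Solve (X^T X) theta = X^T y for theta."""
    return np.linalg.solve(X.T @ X, X.T @ y)

# Toy data: the first column of ones is the intercept term x0.
X = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
y = np.array([2.0, 4.0, 6.0])

theta = normal_equation(X, y)  # expect approximately [0.0, 2.0]
print(theta)
```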