
ML-Stanford-Andrew Ng
Quebradawill
Focused on PRML and CV; happy to exchange ideas.
Stanford ML - Lecture 1 - Linear regression with one variable
Model representation · Cost function · Cost function intuition I · Cost function intuition II · Gradient descent: start with some \theta_0, \theta_1; keep changing \theta_0, \theta_1 to reduce… Original · 2013-03-07 19:57:45 · 808 views · 0 comments
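The gradient-descent outline in this excerpt translates directly to code. Below is a minimal Python sketch of batch gradient descent for the one-variable hypothesis h(x) = \theta_0 + \theta_1 x; the learning rate alpha and iteration count are illustrative choices, not values from the post.

```python
import numpy as np

def gradient_descent(x, y, alpha=0.01, iters=1000):
    """Batch gradient descent for h(x) = theta0 + theta1 * x,
    minimizing the squared-error cost J(theta0, theta1)."""
    m = len(y)
    theta0, theta1 = 0.0, 0.0          # start with some theta0, theta1
    for _ in range(iters):
        h = theta0 + theta1 * x        # current hypothesis
        # keep changing theta0, theta1 to reduce J
        theta0 -= alpha * (1 / m) * np.sum(h - y)
        theta1 -= alpha * (1 / m) * np.sum((h - y) * x)
    return theta0, theta1
```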
Stanford ML - Lecture 11 - Large scale machine learning
1. Learning with large datasets: "It's not who has the best algorithm that wins. It's who has the most data." 2. Stochastic gradient descent: batch gradient descent repeats… Original · 2013-03-21 19:12:05 · 794 views · 0 comments
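Where batch gradient descent sums the gradient over all m examples per update, stochastic gradient descent updates on one example at a time, which is what makes it practical for large datasets. A minimal sketch, assuming a linear hypothesis with squared error; alpha and the epoch count are illustrative:

```python
import numpy as np

def stochastic_gradient_descent(X, y, alpha=0.01, epochs=10):
    """Stochastic gradient descent: update theta on one example at a
    time instead of summing over the whole (large) training set."""
    m, n = X.shape
    theta = np.zeros(n)
    for _ in range(epochs):
        for i in np.random.permutation(m):          # shuffle, then sweep
            grad_i = (X[i] @ theta - y[i]) * X[i]   # gradient on example i
            theta -= alpha * grad_i
    return theta
```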
Stanford ML - Lecture 9 - Clustering
1. Unsupervised learning introduction: supervised learning - training set: …; unsupervised learning - training set: … 2. K-means algorithm: randomly select K cluster centroids; repeat: for every… Original · 2013-03-21 09:15:08 · 711 views · 0 comments
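A compact sketch of the K-means loop the excerpt starts to describe: random centroid initialization, then alternating cluster-assignment and centroid-move steps. It ignores the empty-cluster edge case for brevity.

```python
import numpy as np

def kmeans(X, K, iters=100, seed=0):
    """K-means: randomly select K cluster centroids, then repeat
    the cluster-assignment and centroid-update steps."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), K, replace=False)]  # random init
    for _ in range(iters):
        # for every example, find the index of the closest centroid
        dists = np.linalg.norm(X[:, None] - centroids[None, :], axis=2)
        labels = dists.argmin(axis=1)
        # move each centroid to the mean of the points assigned to it
        centroids = np.array([X[labels == k].mean(axis=0) for k in range(K)])
    return labels, centroids
```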
Stanford ML - Lecture 8 - Support Vector Machines
1. Optimization Objective: logistic regression; let …; support vector machine 2. Large Margin Intuition 3. The mathematics behind large margin classification (optional) Original · 2013-03-17 22:50:52 · 847 views · 0 comments
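In the course's formulation, the SVM objective replaces the two logistic-regression cost terms with hinge-style costs cost1 and cost0 and weights the data term by C. A sketch of evaluating that objective, assuming labels in {0, 1} and a bias term in theta[0]:

```python
import numpy as np

def svm_cost(theta, X, y, C=1.0):
    """SVM optimization objective with hinge costs: cost1/cost0 are
    the piecewise-linear replacements for the logistic cost terms."""
    z = X @ theta
    cost1 = np.maximum(0, 1 - z)   # cost when y = 1 (wants z >= 1)
    cost0 = np.maximum(0, 1 + z)   # cost when y = 0 (wants z <= -1)
    margin_term = C * np.sum(y * cost1 + (1 - y) * cost0)
    reg_term = 0.5 * np.sum(theta[1:] ** 2)   # bias term unregularized
    return margin_term + reg_term
```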
Stanford ML - Lecture 10 - Dimensionality Reduction
1. Motivation I: Data Compression - reduce data from 2D to 1D 2. Motivation II: Data Visualization 3. Principal Component Analysis problem formulation - reduce from 2D to 1D: find a directio… Original · 2013-03-21 10:47:07 · 816 views · 0 comments
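The 2D-to-1D reduction can be sketched with SVD-based PCA: mean-normalize the data, take the top principal direction(s), and project. Function and variable names here are illustrative:

```python
import numpy as np

def pca_reduce(X, k=1):
    """PCA: find the direction(s) onto which to project the data,
    reducing e.g. 2D points to 1D. Rows of X are examples."""
    X_centered = X - X.mean(axis=0)          # mean-normalize first
    # principal directions = top right-singular vectors of the data
    _, _, Vt = np.linalg.svd(X_centered, full_matrices=False)
    U_reduce = Vt[:k].T                      # n x k projection matrix
    Z = X_centered @ U_reduce                # k-dimensional representation
    return Z, U_reduce
```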
Stanford ML - Lecture 5 - Neural Networks: Learning
1. Cost function: Neural Network (Classification); binary classification: 1 output unit; multi-class classification (K classes): K output units; cost function… Original · 2013-03-16 18:08:22 · 857 views · 0 comments
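The cost function the excerpt cuts off is, in the course, a regularized cross-entropy summed over the K output units. A sketch of evaluating it, assuming output activations H, one-hot labels Y, and weight matrices whose first column is the unregularized bias:

```python
import numpy as np

def nn_cost(H, Y, thetas, lam=0.0):
    """Regularized cross-entropy cost for a K-output-unit network.
    H: m x K output activations, Y: m x K one-hot labels,
    thetas: list of weight matrices (column 0 = bias, unregularized)."""
    m = Y.shape[0]
    data_term = -np.sum(Y * np.log(H) + (1 - Y) * np.log(1 - H)) / m
    reg_term = lam / (2 * m) * sum(np.sum(T[:, 1:] ** 2) for T in thetas)
    return data_term + reg_term
```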
Stanford ML - Lecture 6 - Advice for applying machine learning
1. Deciding what to try next: debugging a learning algorithm. Suppose you have implemented regularized linear regression to predict housing prices; when you test your hypothesis on a new set of… Original · 2013-03-17 20:24:29 · 2009 views · 0 comments
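Testing the hypothesis on a new set of examples is the basic diagnostic here: compare training error against error on held-out data. A minimal sketch; the fit and predict callables are hypothetical placeholders for whatever learner is being debugged:

```python
import numpy as np

def train_test_errors(X, y, fit, predict, test_frac=0.3, seed=0):
    """Hold out part of the data: a large gap between training error
    and test error suggests high variance (overfitting)."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(y))
    split = int(len(y) * (1 - test_frac))
    train, test = idx[:split], idx[split:]
    model = fit(X[train], y[train])
    err = lambda s: np.mean((predict(model, X[s]) - y[s]) ** 2) / 2
    return err(train), err(test)
```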
Stanford ML - Lecture 7 - Machine learning system design
1. Prioritizing what to work on: spam classification example - collect lots of data; develop sophisticated features based on email routing information; develop sophisticated features for message bo… Original · 2013-03-17 22:01:22 · 1383 views · 0 comments
Stanford ML - Lecture 4 - Neural Networks: Representation
1. Non-linear hypotheses: why introduce non-linear hypotheses? High-dimensional data; non-linear hypotheses 2. Neurons and the brain: neural networks - origins: algorithms that try to m… Original · 2013-03-13 22:33:54 · 879 views · 0 comments
Stanford ML - Lecture 3 - Logistic regression
1. Classification 2. Hypothesis Representation: logistic regression model; the above function is called the sigmoid function or logistic function. Interpretation of hypothesis output: the estimated probability tha… Original · 2013-03-10 21:59:36 · 1384 views · 0 comments
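The sigmoid (logistic) function and the hypothesis it defines are short enough to state directly; here h_\theta(x) = g(\theta^T x) is read as the estimated probability that y = 1:

```python
import numpy as np

def sigmoid(z):
    """The sigmoid (logistic) function g(z) = 1 / (1 + e^(-z))."""
    return 1.0 / (1.0 + np.exp(-z))

def hypothesis(theta, X):
    """h_theta(x) = g(theta^T x): estimated probability that y = 1."""
    return sigmoid(X @ theta)
```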
Stanford ML - Lecture 2 - Linear regression with multiple variables
Multiple features: for convenience of notation, define …; the new hypothesis is… Gradient descent for multiple variables: the new algorithm is… Gradient descent in practice I: Feature S… Original · 2013-03-08 22:35:15 · 897 views · 0 comments
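A sketch of the multi-variable update together with feature scaling (the topic the excerpt truncates at). It assumes the course's convention that X carries an intercept column x_0 = 1, which the post's elided definition presumably sets up:

```python
import numpy as np

def scale_features(X):
    """Feature scaling: mean-normalize and divide by the standard
    deviation so all features share a similar range."""
    mu, sigma = X.mean(axis=0), X.std(axis=0)
    return (X - mu) / sigma, mu, sigma

def gradient_descent_multi(X, y, alpha=0.01, iters=1000):
    """Gradient descent with multiple variables; X is assumed to
    already include the x0 = 1 intercept column."""
    m, n = X.shape
    theta = np.zeros(n)
    for _ in range(iters):
        theta -= alpha / m * (X.T @ (X @ theta - y))  # simultaneous update
    return theta
```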
Stanford ML Course
1. Linear regression, logistic regression, and general regression: the linear regression function (if …); the loss function or error function; gradient descent - the biggest problem with gradient descent is that the solution may be a local minimum, which depends on the choice of the initial point. Steps: first assign an initial value to …, which can be random or an all-zeros vector; then change … so that it decreases along the direction of the gradient; since what is obtained is a local minimum, … Original · 2013-03-26 19:12:44 · 1141 views · 0 comments
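The local-minimum caveat is easy to demonstrate: run plain gradient descent on a non-convex toy objective from two different starting points and watch it settle into two different minima. The objective J(t) = t^4 - 3t^2 + t below is an invented example, not from the post:

```python
def gradient_descent_1d(grad, theta0, alpha=0.01, iters=500):
    """Plain gradient descent from a chosen starting point; on a
    non-convex objective, different starts reach different minima."""
    theta = theta0
    for _ in range(iters):
        theta -= alpha * grad(theta)
    return theta

# Toy non-convex objective J(t) = t^4 - 3t^2 + t and its derivative.
grad = lambda t: 4 * t**3 - 6 * t + 1
print(gradient_descent_1d(grad, theta0=-2.0))  # settles near t = -1.30
print(gradient_descent_1d(grad, theta0=+2.0))  # settles near t = +1.13
```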