
Machine Learning
Average article quality score: 80
亚尔诺炽焰
Articles in this column
[Coursera Machine Learning] Logistic Regression: Week 3 Programming Assignment
1.1 Visualizing the data: To help you get more familiar with plotting, we have left plotData.m empty so you can try to implement it yourself. However, this is an optional (ungraded) exercise. We also prov… (Original · 2016-09-15 20:27:41 · 2867 views · 1 comment)
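The assignment implements plotData.m in Octave/MATLAB; the core step is splitting the examples by label before plotting each class with its own marker. A minimal NumPy sketch of that label-splitting step (the function name is mine, not from the assignment):

```python
import numpy as np

def split_by_label(X, y):
    """Split a feature matrix into positive and negative examples,
    mirroring the find(y == 1) / find(y == 0) step typically used in plotData.m."""
    pos = X[y == 1]  # rows of X whose label is 1
    neg = X[y == 0]  # rows of X whose label is 0
    return pos, neg
```

In Octave the same selection is usually written with `find`, e.g. `pos = find(y == 1);` followed by `plot(X(pos, 1), X(pos, 2), 'k+')`.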
[Coursera Machine Learning] Linear Regression: Week 2 Programming Assignment
1 Init: In the file warmUpExercise.m, you will find the outline of an Octave/MATLAB function. Modify it to return a 5x5 identity matrix by filling in the following code: A = eye(5); 2.1 Plotting the Data: Be… (Original · 2016-09-05 15:55:38 · 4690 views · 6 comments)
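For comparison, a NumPy equivalent of the Octave one-liner `A = eye(5);` quoted above (the function name is mine, not the assignment's):

```python
import numpy as np

def warm_up_exercise():
    # NumPy counterpart of Octave's `A = eye(5);`: a 5x5 identity matrix
    return np.eye(5)
```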
[Coursera Machine Learning] Regularized Linear Regression and Bias vs. Variance: Week 6 Programming Assignment
1.2 Regularized linear regression cost function: Recall that regularized linear regression has the following cost function: $J(\theta)=\frac{1}{2m}\left(\sum_{i=1}^{m}\left(h_\theta(x^{(i)})-y^{(i)}\right)^2\right)+\frac{\lambda}{2m}\left(\sum_{j=1}^{n}\theta_j^2\right)$ … (Original · 2016-11-13 14:06:38 · 2639 views · 0 comments)
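The assignment implements this cost in Octave/MATLAB; a minimal NumPy sketch of the same formula (function name mine), where the regularization sum runs from j = 1 to n, so the bias term theta[0] is left unregularized:

```python
import numpy as np

def linear_reg_cost(theta, X, y, lam):
    """Regularized linear regression cost J(theta).
    X: (m, n+1) design matrix with bias column; theta: (n+1,); y: (m,)."""
    m = y.size
    h = X @ theta                                   # predictions h_theta(x)
    cost = ((h - y) ** 2).sum() / (2 * m)           # squared-error term
    reg = lam * (theta[1:] ** 2).sum() / (2 * m)    # skip theta[0]
    return cost + reg
```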
[Coursera Machine Learning] Multi-class Classification and Neural Networks: Week 4 Programming Assignment
1.3.3 Vectorizing regularized logistic regression: Now modify your code in lrCostFunction.m to account for regularization. Once again, you should not put any loops into your code. You can simply reuse the previous week's regularized logistic regression code here. … (Original · 2016-10-08 18:58:16 · 2143 views · 0 comments)
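A loop-free NumPy sketch of regularized logistic regression cost and gradient in the spirit of lrCostFunction.m (function names mine; the assignment itself is in Octave/MATLAB, and theta[0] is again unregularized):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lr_cost_function(theta, X, y, lam):
    """Vectorized regularized logistic regression: cost J and gradient,
    with no explicit loops over examples or features."""
    m = y.size
    h = sigmoid(X @ theta)
    J = (-y @ np.log(h) - (1 - y) @ np.log(1 - h)) / m
    J += lam * (theta[1:] ** 2).sum() / (2 * m)   # regularize theta[1:] only
    grad = X.T @ (h - y) / m
    grad[1:] += lam * theta[1:] / m
    return J, grad
```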
[Coursera Machine Learning] Neural Networks Learning: Week 5 Programming Assignment
1.3 Feedforward and cost function: Recall that the cost function for the neural network (without regularization) is $J(\theta)=\frac{1}{m}\sum_{i=1}^{m}\sum_{k=1}^{K}\left[-y_k^{(i)}\log\left((h_\theta(x^{(i)}))_k\right)-(1-y_k^{(i)})\log\left(1-(h_\theta(x^{(i)}))_k\right)\right]$ … (Original · 2016-11-01 22:55:58 · 3187 views · 0 comments)
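Given the output-layer activations and one-hot labels, the double sum above vectorizes to an elementwise product; a minimal NumPy sketch (function name mine, not the assignment's):

```python
import numpy as np

def nn_cost(H, Y):
    """Unregularized neural-network cost from the formula above.
    H: (m, K) output activations (h_theta(x^(i)))_k; Y: (m, K) one-hot labels."""
    m = H.shape[0]
    # elementwise form of the sum over i = 1..m and k = 1..K
    return (-Y * np.log(H) - (1 - Y) * np.log(1 - H)).sum() / m
```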
[Coursera Machine Learning] Support Vector Machines: Week 7 Programming Assignment
1.2 SVM with Gaussian Kernels: You should now complete the code in gaussianKernel.m to compute the Gaussian kernel between two examples, $(x^{(i)}, x^{(j)})$. The Gaussian kernel function is defined as: $K_{\mathrm{gaussian}}($… (Original · 2016-12-10 21:40:58 · 3871 views · 2 comments)
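The Gaussian (RBF) kernel that gaussianKernel.m is asked to compute is $\exp(-\lVert x^{(i)}-x^{(j)}\rVert^2 / (2\sigma^2))$; a minimal NumPy sketch (function name mine):

```python
import numpy as np

def gaussian_kernel(x1, x2, sigma):
    """Gaussian (RBF) kernel between two examples:
    exp(-||x1 - x2||^2 / (2 * sigma^2))."""
    diff = np.asarray(x1, float) - np.asarray(x2, float)
    return np.exp(-(diff @ diff) / (2 * sigma ** 2))
```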
[Coursera Machine Learning] K-means Clustering and Principal Component Analysis: Week 8 Programming Assignment
1.1.1 Finding closest centroids: Your task is to complete the code in findClosestCentroids.m. This function takes the data matrix X and the locations of all centroids inside centroids and should output a… (Original · 2017-01-28 21:51:16 · 2436 views · 0 comments)
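A minimal NumPy sketch of the closest-centroid assignment described above (function name mine; the graded version is Octave/MATLAB), returning for each example the index of the nearest centroid under squared Euclidean distance:

```python
import numpy as np

def find_closest_centroids(X, centroids):
    """For each row of X (m, n), return the index of the nearest
    of the K rows of centroids (K, n)."""
    # (m, K) matrix of squared distances via broadcasting
    d = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
    return d.argmin(axis=1)
```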
A Concise Analysis of Why Convolution Kernels Are Rotated in CNN Error Backpropagation
Reposted from: http://blog.youkuaiyun.com/zy3381/article/details/44409535 . A crucial step in the error back propagation of a CNN (convolutional neural network) is passing the error from a convolution (Convolve) layer back to the preceding pooling (Pool) layer. Because backpropagation in a CNN is 2D, it differs in some details from the 1D backpropagation of a traditional neural network; the following simple example breaks this down in detail… (Repost · 2017-03-15 13:54:07 · 3068 views · 1 comment)
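The rotation in question can be made concrete: if the forward pass is a "valid" 2-D cross-correlation, the error reaching the layer's input is a "full" cross-correlation of the output error with the kernel rotated 180 degrees. A self-contained sketch under those assumptions (function names mine, single channel, stride 1):

```python
import numpy as np

def rot180(W):
    # Rotate a 2-D kernel by 180 degrees (flip both axes)
    return W[::-1, ::-1]

def corr2d_valid(X, W):
    """Forward pass: 'valid' 2-D cross-correlation of input X with kernel W."""
    kh, kw = W.shape
    oh, ow = X.shape[0] - kh + 1, X.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = (X[i:i + kh, j:j + kw] * W).sum()
    return out

def backprop_to_input(delta, W):
    """Propagate the output error delta back to the layer input:
    'full' cross-correlation of delta with the 180-degree-rotated kernel,
    implemented here as zero-padding followed by a 'valid' pass."""
    kh, kw = W.shape
    padded = np.pad(delta, ((kh - 1, kh - 1), (kw - 1, kw - 1)))
    return corr2d_valid(padded, rot180(W))
```

Because the forward map is linear in X, `backprop_to_input(delta, W)` is exactly the gradient of `sum(corr2d_valid(X, W) * delta)` with respect to X, which is why the rotated kernel appears.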
[Coursera Machine Learning] Anomaly Detection and Recommender Systems: Week 9 Programming Assignment
1 Anomaly Detection: Your task is to complete the code in estimateGaussian.m. This function takes as input the data matrix X and should output an n-dimension vector mu that holds the mean of all the n fe… (Original · 2017-07-29 14:42:05 · 1717 views · 0 comments)
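A minimal NumPy sketch of the per-feature Gaussian parameter estimate described above (function name mine; I assume the 1/m-normalized variance, as the Octave assignment uses):

```python
import numpy as np

def estimate_gaussian(X):
    """Return per-feature mean mu (n,) and variance sigma2 (n,)
    for the data matrix X (m, n), using the 1/m normalizer."""
    mu = X.mean(axis=0)
    sigma2 = ((X - mu) ** 2).mean(axis=0)
    return mu, sigma2
```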