
Andrew Ng's Machine Learning
han_hhh
Object Detection Learning Path【Fundamentals + Getting Started with Papers】
After a first pass through machine learning, and before starting on object detection papers, I filled in background knowledge in the following order: a summary of perceptron principles; deep neural network (DNN) models and forward propagation; DNN backpropagation (BP); choosing DNN loss functions and activation functions; DNN regularization; convolutional neural network (CNN) model structure; CNN forward propagation; a very detailed introduction to CNNs... Original · 2020-05-05 00:01:51 · 7905 views · 0 comments
Week 8 Quiz: Anomaly Detection【Machine Learning】
1. For which of the following problems would anomaly detection be a suitable algorithm? A. Given an image of a face, determine whether or not it is the face of a particular famous individual. B. Give... Original · 2020-05-03 17:11:04 · 732 views · 0 comments
Programming Exercise 7: K-means Clustering and Principal Component Analysis【Machine Learning】
1.1.1 Finding closest centroids: n=size(X,1); temp = zeros(K, 1); for i=1:n for j=1:K % temp(j)=(X(i,1)-centroids(j,1))^2+(X(i,2)-centroids(j,2))^2; % the commented line above never got credit on submission, which really puzzled me... I think it... Original · 2020-05-02 12:25:41 · 299 views · 0 comments
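The nested loop the snippet sketches can also be written in vectorized form. Below is a minimal NumPy sketch of the same assignment step (the function name and toy data are mine, not from the exercise); it generalizes the commented two-column distance to any number of features:

```python
import numpy as np

def find_closest_centroids(X, centroids):
    # Squared Euclidean distance from every sample to every centroid,
    # then the index of the nearest centroid for each sample.
    diffs = X[:, None, :] - centroids[None, :, :]
    dists = (diffs ** 2).sum(axis=2)
    return dists.argmin(axis=1)

X = np.array([[1.0, 1.0], [5.0, 5.0], [1.2, 0.8]])
centroids = np.array([[1.0, 1.0], [5.0, 5.0]])
print(find_closest_centroids(X, centroids))  # [0 1 0]
```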
Week 8 Quiz: Principal Component Analysis【Machine Learning】
4. Question 4: Which of the following statements are true? Check all that apply. A. Given only and Ureduce, there is no way to reconstruct any reasonable approximation to — incorrect. B. PCA is susceptible to l... Original · 2020-04-30 23:46:37 · 412 views · 2 comments
Programming Exercise 6: Support Vector Machines
1.2.1 Gaussian Kernel — the Gaussian kernel formula: sim=exp(-(x1-x2)'*(x1-x2)/2/sigma/sigma) Original · 2020-04-23 11:02:59 · 358 views · 0 comments
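The same formula in a short NumPy sketch (function name is mine); the printed value checks against the sanity test the exercise script runs, which, if I recall it correctly, expects about 0.324652 for these inputs:

```python
import numpy as np

def gaussian_kernel(x1, x2, sigma):
    # sim = exp(-||x1 - x2||^2 / (2 * sigma^2)),
    # the same quantity as the Octave one-liner above.
    diff = x1 - x2
    return np.exp(-diff.dot(diff) / (2.0 * sigma ** 2))

print(round(gaussian_kernel(np.array([1.0, 2.0, 1.0]),
                            np.array([0.0, 4.0, -1.0]), 2.0), 6))
```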
Programming Exercise 5: Regularized Linear Regression and Bias v.s. Variance
1.2 Regularized linear regression cost function / 1.3 Regularized linear regression gradient % at first I missed that the X passed in already has the extra column of ones added J=1/2/m*(X*theta-y)'*(X*theta-y)+lambda/2/m*theta(2:end)'*theta(2:end); grad=... Original · 2020-04-14 18:46:21 · 359 views · 0 comments
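A NumPy sketch of the same cost and gradient (names and toy data are mine); as the comment above notes, X is assumed to already include the bias column, and theta[0] is left out of the regularization term:

```python
import numpy as np

def linear_reg_cost(theta, X, y, lam):
    # Regularized linear regression cost J and gradient; theta[0] (the
    # bias term) is not regularized, matching theta(2:end) in the Octave.
    m = len(y)
    err = X @ theta - y
    J = (err @ err) / (2 * m) + lam / (2 * m) * (theta[1:] @ theta[1:])
    grad = X.T @ err / m
    grad[1:] += lam / m * theta[1:]
    return J, grad

X = np.array([[1.0, 1.0], [1.0, 2.0]])  # bias column already included
y = np.array([1.0, 2.0])
J, grad = linear_reg_cost(np.array([0.0, 1.0]), X, y, lam=0.0)
print(J)  # 0.0 -- theta = [0, 1] fits this toy data exactly
```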
Week 6 Quiz: Advice for Applying Machine Learning【Machine Learning】
1. You train a learning algorithm, and find that it has unacceptably high error on the test set. You plot the learning curve, and obtain the figure below. Is the algorithm suffering from high bias, hi... Original · 2020-04-14 16:23:29 · 1237 views · 0 comments
Week 5 Programming Assignment
1.3 Feedforward and cost function — 1. Forward propagation: from the input layer to the hidden layer, then to the output layer, computing step by step to obtain h. 2. The y passed into this function is a 5000x1 matrix; it needs to be converted into a 5000x10 matrix, as below. 3. Compute J using the cost function formula. %nnCostFunction.m %compute h X=[ones(m,1),X]; a1=X; z2=a1*Theta1'... Original · 2020-04-13 23:18:09 · 176 views · 0 comments
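The two steps described above can be sketched in NumPy (all names are mine; labels are 0-based here for simplicity, whereas the Octave exercise uses 1-based labels):

```python
import numpy as np

def one_hot(y, num_labels):
    # Step 2: turn an m-vector of labels into an m x num_labels 0/1 matrix.
    Y = np.zeros((len(y), num_labels))
    Y[np.arange(len(y)), y] = 1.0
    return Y

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def feedforward(X, Theta1, Theta2):
    # Step 1: input -> hidden -> output, mirroring a1/z2/a2/z3 in the snippet.
    m = X.shape[0]
    a1 = np.hstack([np.ones((m, 1)), X])           # add bias column
    a2 = np.hstack([np.ones((m, 1)), sigmoid(a1 @ Theta1.T)])
    return sigmoid(a2 @ Theta2.T)                  # h, shape (m, num_labels)

Y = one_hot(np.array([0, 2, 1]), 3)
print(Y)
```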
Week 4 Programming Assignment
lrCostFunction.m is the same as in ex2, but this time with more explanation: n=size(X,2); h=sigmoid(X*theta); J=-1/m*(y'*log(h)+(1-y)'*log(1-h))+lambda/2/m*theta(2:n)'*theta(2:n); grad=1/m*(X'*(h-y))+lambda/m*theta; grad(1)=1/m*(X(:,1)... Original · 2020-04-05 15:49:30 · 133 views · 0 comments
Week 4 Quiz: Neural Networks: Representation【Machine Learning】
Correct answer: BD. Why C is wrong: at first I thought that, since this is a multi-class problem, h(x) would necessarily end up with exactly one 1 and the rest 0, but in fact the outputs are unrelated. Correct answer: A. Write out the expression for h(x) from the given theta values, plug in 00, 01, 10, 11, compute the results, and see which option matches. 3. You have the following neural network: You'd like to com... Original · 2020-04-05 10:57:18 · 220 views · 0 comments
Week 3 Programming Assignment
sigmoid.m — formula: g=1./(1+exp(-z)). Note the ./: when the argument is a matrix, the sigmoid must be applied to each element. costFunction.m — since figuring out the vectorization back in chapter one, this hasn't been so hard: h=sigmoid(X*theta); J=-1/m*(y'*log(h)+(1-y)'*log(1-h)) grad=1/m*(X'*(h-y)) predi... Original · 2020-04-04 19:50:38 · 187 views · 0 comments
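The same two functions in a NumPy sketch (names are mine). np.exp broadcasts over arrays, which is the NumPy counterpart of the ./ point above; with theta = 0, h is 0.5 everywhere, so J = log(2) regardless of the data, a handy sanity check:

```python
import numpy as np

def sigmoid(z):
    # Elementwise by construction: np.exp broadcasts over arrays.
    return 1.0 / (1.0 + np.exp(-z))

def cost_function(theta, X, y):
    # Vectorized unregularized logistic regression cost and gradient.
    m = len(y)
    h = sigmoid(X @ theta)
    J = -(y @ np.log(h) + (1 - y) @ np.log(1 - h)) / m
    grad = X.T @ (h - y) / m
    return J, grad

X = np.array([[1.0, 0.0], [1.0, 1.0]])
y = np.array([0.0, 1.0])
J, _ = cost_function(np.zeros(2), X, y)
print(J)  # log(2) ~= 0.6931
```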
Week 3 Quiz: Regularization【Machine Learning】
Correct answer: C. Why A is wrong: adding regularization to a model does not always improve results; if λ is set too large it causes underfitting, which is bad for both the training set and new examples. Correct answer: A. Why D is wrong: regularization has nothing to do with whether J(theta) converges... Original · 2020-04-04 13:12:27 · 182 views · 0 comments
Week 3 Quiz: Logistic Regression【Machine Learning】
Correct answers: AC — in B and D, h(x) is the linear-regression form. Correct answers: BC. Correct answers: AB. Original · 2020-04-03 23:11:07 · 321 views · 0 comments
Week 2 Programming Assignment
Single variable — computeCost.m: J=(X*theta-y).^2; J=1/(2*m)*sum(J); just code up the cost function formula directly. gradientDescent.m — code to add: temp0=theta(1)-alpha/m*sum((X*theta-y).*X(:,1)); temp1=theta(2)-alpha/m*sum((X*theta-... Original · 2020-04-03 21:49:20 · 215 views · 0 comments
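A NumPy sketch of both pieces (names and toy data are mine): the cost formula, and one gradient-descent step that updates every theta entry simultaneously, i.e. the vectorized form of the temp0/temp1 pattern quoted above:

```python
import numpy as np

def compute_cost(theta, X, y):
    # Sum of squared errors over 2m, straight from the cost formula.
    err = X @ theta - y
    return (err @ err) / (2 * len(y))

def gradient_descent_step(theta, X, y, alpha):
    # Simultaneous update of all theta entries in one matrix expression.
    return theta - alpha / len(y) * (X.T @ (X @ theta - y))

X = np.array([[1.0, 1.0], [1.0, 2.0]])   # bias column plus one feature
y = np.array([1.0, 2.0])                 # exact fit is theta = [0, 1]
theta = np.zeros(2)
for _ in range(2000):
    theta = gradient_descent_step(theta, X, y, alpha=0.1)
print(np.round(theta, 3))
```

On this toy data the iterates converge to theta = [0, 1], driving compute_cost to zero.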
Machine Learning: Missed Questions Review
Week 2 - Linear Regression with Multiple Variables. Which of the following are reasons for using feature scaling? A. It prevents the matrix X^T X (used in the normal equation) from being non-invertible (sin... Original · 2020-03-31 20:42:11 · 351 views · 0 comments