AI-009: Notes on Professor Andrew Ng's Machine Learning Course, Lectures 38-47

This post covers the neural network material in Andrew Ng's machine learning course: why non-linear hypotheses are needed, neurons and the brain, model representation, multi-class classification, the cost function, the backpropagation algorithm, unrolling parameters, gradient checking, random initialization, and an autonomous driving example.


These are study notes for Andrew Ng's machine learning course series. Course videos:

https://study.163.com/course/introduction.htm?courseId=1004570029#/courseDetail?tab=1

38. Neural Networks - Representation - Non-linear hypotheses

Why neural networks?

Simple linear or logistic regression, even with quadratic or cubic features added in, is not a good way to learn complex non-linear hypotheses when the number of features n is large, because you just end up with too many features.

For example, visual recognition: the inputs are raw pixel intensities, so computing and representing all of these polynomial features would be very expensive.
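A quick back-of-the-envelope count makes this concrete. The sketch below is mine, not from the course materials; it counts the quadratic and cubic feature combinations for a 50x50-pixel image, the scale of the lecture's visual recognition example:

```python
from math import comb

# n raw input features, e.g. the pixel intensities of a 50x50
# grayscale image (the scale of the lecture's example).
n = 50 * 50  # 2500 features

# All quadratic terms x_i * x_j with i <= j: pairs plus squares.
quadratic_terms = comb(n, 2) + n
print(quadratic_terms)  # 3,126,250 -- over 3 million features

# Cubic terms: combinations of size 3 with repetition allowed.
cubic_terms = comb(n + 2, 3)
print(cubic_terms)  # about 2.6 billion
```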

39. Neural Networks - Representation - neurons and the brain
 

The brain can develop different functions depending on which signals are wired into it; for example, feed visual signals into the auditory cortex and it learns to 'see'.

In the neuro-rewiring experiments, the nerve from the ear (or the hand) is cut and the optic nerve is connected to that brain area instead, and that part of the brain learns to see.

Try to figure out the brain's learning algorithm!

You can plug almost any sensor into the brain, and the brain's learning algorithm will just figure out how to learn from that data and deal with it.

40. Neural Networks - Representation - model representation I
 

Nucleus

Dendrite: the input wires

Cell body (soma)

Node of Ranvier

Axon: the output wire

Myelin sheath

Schwann cell

Axon terminal

One neuron sends a little pulse of electricity via its axon to another neuron's dendrite.

Next, the computational steps represented by this diagram.

forward propagation
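As a concrete sketch (my own NumPy rendering, not the course's Octave code), forward propagation through a three-layer network looks like this; the Theta shapes follow the course's convention of one weight matrix per layer transition, with a bias unit prepended to each activation:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward_propagate(x, Theta1, Theta2):
    """One forward pass through a 3-layer network.

    x      : input vector, shape (n,)
    Theta1 : layer 1 -> 2 weights, shape (h, n + 1)  # +1 for bias
    Theta2 : layer 2 -> 3 weights, shape (k, h + 1)
    """
    a1 = np.concatenate(([1.0], x))            # add bias unit a0 = 1
    z2 = Theta1 @ a1
    a2 = np.concatenate(([1.0], sigmoid(z2)))  # hidden activations + bias
    z3 = Theta2 @ a2
    return sigmoid(z3)                         # h_Theta(x)
```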

41. Neural Networks - Representation - examples and intuitions
 

To compute negation (NOT), just put a large negative weight in front of the variable you want to negate.

We end up with a non-linear decision boundary.

Each layer computes progressively more complex functions, which is why neural networks can handle complex problems.
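A small sketch of that logic-gate intuition, using the AND/OR/NOT weight values Ng shows on the slides (the helper function is mine):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def neuron(weights, inputs):
    """Single sigmoid unit; weights[0] is the bias weight."""
    return sigmoid(weights[0] + np.dot(weights[1:], inputs))

AND = [-30.0, 20.0, 20.0]  # ~1 only when x1 = x2 = 1
OR  = [-10.0, 20.0, 20.0]  # ~1 when either input is 1
NOT = [ 10.0, -20.0]       # large negative weight negates the input

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, round(neuron(AND, [x1, x2])))
```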

41. Neural Networks - Representation - Multi-class classification

42. Neural Networks - Learning - Cost function
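For reference, the regularized cost function presented in this lecture, for a network with L layers, s_l units in layer l, and K output units:

```latex
J(\Theta) = -\frac{1}{m} \sum_{i=1}^{m} \sum_{k=1}^{K}
    \left[ y_k^{(i)} \log\!\left( (h_\Theta(x^{(i)}))_k \right)
         + (1 - y_k^{(i)}) \log\!\left( 1 - (h_\Theta(x^{(i)}))_k \right) \right]
  + \frac{\lambda}{2m} \sum_{l=1}^{L-1} \sum_{i=1}^{s_l} \sum_{j=1}^{s_{l+1}}
    \left( \Theta_{ji}^{(l)} \right)^2
```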

42. Neural Networks - Learning - back propagation algorithm
 

backpropagation algorithm

The key is how to compute these partial derivative terms.
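For my own reference, the recurrences from the lecture (here ⊙ denotes element-wise multiplication, and the sigmoid derivative is g'(z^(l)) = a^(l) ⊙ (1 - a^(l)); the regularization term follows the programming-exercise convention):

```latex
\delta^{(L)} = a^{(L)} - y
\qquad
\delta^{(l)} = \left(\Theta^{(l)}\right)^{T} \delta^{(l+1)}
               \odot a^{(l)} \odot \left(1 - a^{(l)}\right)

\Delta^{(l)} := \Delta^{(l)} + \delta^{(l+1)} \left(a^{(l)}\right)^{T}
\qquad
\frac{\partial J(\Theta)}{\partial \Theta^{(l)}_{ij}} = D^{(l)}_{ij}
  = \frac{1}{m} \Delta^{(l)}_{ij} + \frac{\lambda}{m} \Theta^{(l)}_{ij}
  \quad (j \neq 0)
```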

43. Neural Networks - Learning - Implementation note: unrolling parameters 
 

How to convert back and forth between the matrix representation of the parameters and the vector representation.

The advantage of the matrix representation: when the parameters are stored as matrices, forward propagation and backpropagation are more convenient, and it is easier to take advantage of vectorized implementations.

The advantage of the vector representation (thetaVec, DVec): the advanced optimization algorithms tend to assume that all of your parameters are unrolled into one big, long vector.
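A NumPy analogue of the Octave unrolling idiom from the lecture (the 10x11 and 1x11 shapes are just an example):

```python
import numpy as np

# Example shapes only: a 10x11 Theta1 and a 1x11 Theta2.
Theta1 = np.random.rand(10, 11)
Theta2 = np.random.rand(1, 11)

# Unroll: matrices -> one long vector, as the optimizers expect.
thetaVec = np.concatenate([Theta1.ravel(), Theta2.ravel()])

# Roll back: vector -> matrices, for forward and back propagation.
Theta1_back = thetaVec[:110].reshape(10, 11)
Theta2_back = thetaVec[110:].reshape(1, 11)

assert np.array_equal(Theta1, Theta1_back)
assert np.array_equal(Theta2, Theta2_back)
```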

44. Neural Networks - Learning - Gradient checking
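The idea from this lecture: compare the gradient computed by backpropagation against a two-sided numerical approximation. A minimal sketch of that check (mine), with epsilon around 1e-4 as the lecture suggests:

```python
import numpy as np

def numerical_gradient(J, theta, eps=1e-4):
    """Two-sided difference approximation of each partial derivative.

    J     : cost function taking the unrolled parameter vector
    theta : unrolled parameter vector (floats)
    eps   : perturbation size; the lecture suggests about 1e-4
    """
    grad = np.zeros_like(theta)
    for i in range(theta.size):
        bump = np.zeros_like(theta)
        bump[i] = eps
        grad[i] = (J(theta + bump) - J(theta - bump)) / (2 * eps)
    return grad
```

Each entry should be very close to the corresponding backpropagation derivative; remember to turn the check off before actually training, since it is very slow.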

45. Neural Networks - Learning - Random Initialization

Initializing all the parameters to 0 will not work.

After each update, every hidden unit computes the same function of the inputs, so we would effectively get only one feature (the symmetry problem).

The epsilon here has no relationship with the epsilon used in gradient checking.
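A minimal sketch of the symmetry-breaking initialization (the value 0.12 is a common choice from the course's programming exercise, not something fixed by the lecture):

```python
import numpy as np

def random_init(rows, cols, init_eps=0.12):
    """Initialize weights uniformly in [-init_eps, init_eps].

    Distinct random weights break the symmetry: hidden units get
    different gradients and so learn different features. This
    init_eps is unrelated to the epsilon in gradient checking.
    """
    return np.random.rand(rows, cols) * 2 * init_eps - init_eps
```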

46. Neural Network - Learning - Putting it together
 

47. Neural Networks - Learning - Autonomous driving example
 

Top left: the steering direction chosen by the human driver and the direction output by the neural network.

Bottom left: the camera image of the road.

ALVINN

ALVINN watches a human drive, and after about two minutes of training it can drive autonomously.

It can automatically switch between the network weights trained for one-lane roads and those trained for two-lane roads.
