Machine Learning Notes 1: Linear Regression (Week 1)

This article introduces one-variable linear regression as a supervised learning algorithm and explains in detail how gradient descent is used to minimize the cost function and find the optimal parameters.

 

1. Linear Regression with One Variable

Linear regression is a supervised learning algorithm, because the data set gives the "right answer" for each example.

And since we are predicting a real-valued output, it is a regression problem.

Block diagram (image missing): training set → learning algorithm → hypothesis h, which maps an input x to a predicted output y.

2. Cost Function

Idea: choose θ0 and θ1 so that the hypothesis h(x) = θ0 + θ1·x is close to y for our training examples.

Cost function:

J(θ0, θ1) = (1 / (2m)) · Σ_{i=1..m} (h(x^(i)) − y^(i))^2

(it is a bow-shaped function)

So it becomes the mathematical problem of minimizing the cost function (the squared error function).
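The squared-error cost function can be sketched in a few lines of Python (a minimal sketch; the function and variable names are my own, not from the notes):

```python
def compute_cost(theta0, theta1, xs, ys):
    """J(theta0, theta1) = (1 / (2m)) * sum((h(x_i) - y_i)^2)."""
    m = len(xs)
    total = 0.0
    for x, y in zip(xs, ys):
        h = theta0 + theta1 * x          # hypothesis h(x) = theta0 + theta1 * x
        total += (h - y) ** 2
    return total / (2 * m)

# With a perfect fit, every error term is zero, so the cost is zero:
xs = [1, 2, 3]
ys = [2, 4, 6]                           # these points lie exactly on y = 2x
print(compute_cost(0.0, 2.0, xs, ys))    # → 0.0
```

Any other choice of θ0, θ1 gives a strictly larger cost, which is what the minimization problem exploits.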

3. Gradient Descent

We use gradient descent to minimize the cost function.

Process:

1. Start with some random choice of the theta vector

2. Keep changing the theta vector to reduce J(theta), until we end up at a minimum

Repeat until convergence:

θ_j := θ_j − α · ∂J(θ0, θ1)/∂θ_j    (simultaneously for j = 0 and j = 1)

(the derivative term is the slope of the cost function)

Here α (alpha) is the learning rate, and we need a simultaneous update of the theta vector.

1. If alpha is too small, gradient descent takes tiny steps and converges slowly.

2. If alpha is too large, gradient descent will overshoot the minimum, and it may fail to converge.
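The effect of the learning rate can be seen on a toy convex function, J(θ) = θ², whose derivative is 2θ (an illustration of my own, not from the notes):

```python
def descend(theta, alpha, steps):
    """Run gradient descent on J(theta) = theta^2 (derivative: 2 * theta)."""
    for _ in range(steps):
        theta -= alpha * 2 * theta   # theta := theta - alpha * dJ/dtheta
    return theta

small = descend(1.0, 0.1, 10)   # each step shrinks theta toward the minimum at 0
large = descend(1.0, 1.1, 10)   # each step overshoots 0, and |theta| grows
```

With alpha = 0.1 the iterate contracts by a factor of 0.8 per step and approaches the minimum; with alpha = 1.1 each update jumps past zero to a point farther away, so the iterates diverge.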

And taking the derivative, we get:

θ0 := θ0 − α · (1/m) · Σ_{i=1..m} (h(x^(i)) − y^(i))

θ1 := θ1 − α · (1/m) · Σ_{i=1..m} (h(x^(i)) − y^(i)) · x^(i)

Convex function: a bow-shaped function, just like the cost function J(theta); it has a single global minimum, so gradient descent cannot get stuck in a local optimum.

Batch gradient descent: each step of gradient descent uses all the training examples (the sum runs over the entire training set).
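Putting the pieces together, batch gradient descent with the simultaneous update might be sketched like this (the function name and hyperparameter values are illustrative assumptions, not from the notes):

```python
def fit_linear(xs, ys, alpha=0.01, iters=1000):
    """Fit h(x) = theta0 + theta1 * x by batch gradient descent."""
    m = len(xs)
    theta0, theta1 = 0.0, 0.0
    for _ in range(iters):
        # Batch: each step sums the errors over ALL training examples.
        err = [theta0 + theta1 * x - y for x, y in zip(xs, ys)]
        grad0 = sum(err) / m
        grad1 = sum(e * x for e, x in zip(err, xs)) / m
        # Simultaneous update: both gradients were computed from the OLD thetas.
        theta0 -= alpha * grad0
        theta1 -= alpha * grad1
    return theta0, theta1

t0, t1 = fit_linear([1, 2, 3, 4], [3, 5, 7, 9], alpha=0.05, iters=5000)
# The data follow y = 2x + 1, so t0 converges toward 1 and t1 toward 2.
```

Note that both gradients are computed before either theta is changed; updating theta0 first and then reusing it inside grad1 would break the simultaneous-update requirement.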

 

Reposted from: https://www.cnblogs.com/climberclimb/p/6790777.html
