1. Mathematical meaning
y = w * x
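Since the loss used in the code below is the squared error, the gradient that `backward` computes can also be derived by hand, which gives something to check the autograd output against:

$$
L(w) = (\hat{y} - y)^2 = (w x - y)^2, \qquad \frac{\partial L}{\partial w} = 2x\,(w x - y)
$$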
2. Explanation
Use PyTorch's built-in backward function to perform automatic differentiation.
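Before the full training script, here is a minimal sketch of what `backward` does on a single scalar (the value w = 3 is illustrative, not part of the original demo):

```python
import torch

# minimal autograd sketch: z = w^2, so dz/dw = 2w
w = torch.tensor(3.0, requires_grad=True)
z = w ** 2
z.backward()   # populate w.grad with dz/dw
print(w.grad)  # tensor(6.), since 2 * 3.0 = 6.0
```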
3. Code demonstration
```python
import torch

x_data = [1.0, 2.0, 3.0]
y_data = [2.0, 4.0, 6.0]

w = torch.tensor([1.0])  # weight, initial guess w = 1.0
w.requires_grad = True   # track operations on w so its gradient can be computed

def forward(x):
    return x * w

def loss(x, y):
    y_pred = forward(x)
    return (y_pred - y) ** 2

print('Predict (before train)', 4, forward(4).item())

for epoch in range(100):
    l = loss(1, 2)  # pre-bind l so l.item() below is always defined
    for x, y in zip(x_data, y_data):
        l = loss(x, y)
        l.backward()                          # backpropagate: fills w.grad
        print('\t', x, y, w.grad.item())
        w.data = w.data - 0.01 * w.grad.data  # gradient-descent step (lr = 0.01)
        w.grad.data.zero_()                   # reset the gradient for the next step
    print("progress:", epoch, l.item())

print('Predict (after train)', 4, forward(4).item())
```
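As a sanity check (not in the original post), the first gradient printed during training can be reproduced from the hand-derived formula ∂L/∂w = 2x(wx − y):

```python
import torch

# hypothetical check: autograd vs. the analytic gradient 2*x*(w*x - y)
w = torch.tensor([1.0], requires_grad=True)
x, y = 1.0, 2.0
l = (w * x - y) ** 2
l.backward()
analytic = 2 * x * (w.item() * x - y)  # 2*1*(1*1 - 2) = -2.0
print(w.grad.item(), analytic)         # -2.0 -2.0, matching the first line of output below
```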
4. Results
```
Predict (before train) 4 4.0
1.0 2.0 -2.0
2.0 4.0 -7.840000152587891
3.0 6.0 -16.228801727294922
progress: 0 7.315943717956543
1.0 2.0 -1.478623867034912
2.0 4.0 -5.796205520629883
3.0 6.0 -11.998146057128906
progress: 1 3.9987640380859375
1.0 2.0 -1.0931644439697266
2.0 4.0 -4.285204887390137
3.0 6.0 -8.870372772216797
...
progress: 95 9.094947017729282e-13
1.0 2.0 -7.152557373046875e-07
2.0 4.0 -2.86102294921875e-06
3.0 6.0 -5.7220458984375e-06
progress: 96 9.094947017729282e-13
1.0 2.0 -7.152557373046875e-07
2.0 4.0 -2.86102294921875e-06
3.0 6.0 -5.7220458984375e-06
progress: 97 9.094947017729282e-13
1.0 2.0 -7.152557373046875e-07
2.0 4.0 -2.86102294921875e-06
3.0 6.0 -5.7220458984375e-06
progress: 98 9.094947017729282e-13
1.0 2.0 -7.152557373046875e-07
2.0 4.0 -2.86102294921875e-06
3.0 6.0 -5.7220458984375e-06
progress: 99 9.094947017729282e-13
Predict (after train) 4 7.999998569488525
```
This post uses the simple linear model y = w * x to show how PyTorch's automatic differentiation (the backward function) computes gradients, and how gradient descent then updates the weight w. The code initializes w, enables gradient tracking, defines a loss function, and iterates over the training data; the weight converges to w ≈ 2, so the prediction for x = 4 approaches the true value of 8.
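The weight update performed on each step corresponds to plain gradient descent with learning rate 0.01:

$$
w \leftarrow w - \eta \,\frac{\partial L}{\partial w}, \qquad \eta = 0.01
$$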