Abstract
This post reviews and implements the loss functions commonly used in PyTorch, studies the principle of the cross-entropy loss function and derives the related formulas, and shows how an optimizer uses the computed loss to optimize a model. It also covers how to use and modify mature, pretrained network models, and how to save and load network models.
1 Loss Functions and Backpropagation
1.1 The L1Loss Loss Function
import torch
from torch.nn import L1Loss
inputs = torch.tensor([1, 2, 3], dtype=torch.float32)
targets = torch.tensor([1, 2, 5], dtype=torch.float32)
l1 = L1Loss()
result = l1(inputs, targets)
print(result) # tensor(0.6667)
l1 = L1Loss(reduction='mean')
- The default is reduction='mean': take the absolute difference of each element pair, then average over all elements.
- With reduction='sum', the absolute differences are summed instead; the output is then tensor(2.).
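The 'sum' variant described above can be verified directly; a minimal sketch using the same inputs as before:

```python
import torch
from torch.nn import L1Loss

inputs = torch.tensor([1, 2, 3], dtype=torch.float32)
targets = torch.tensor([1, 2, 5], dtype=torch.float32)

# reduction='sum': add up the absolute differences,
# |1-1| + |2-2| + |3-5| = 0 + 0 + 2 = 2
l1_sum = L1Loss(reduction='sum')
print(l1_sum(inputs, targets))  # tensor(2.)
```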
1.2 The MSELoss Loss Function
import torch
from torch.nn import L1Loss, MSELoss
inputs = torch.tensor([1, 2, 3], dtype=torch.float32)
targets = torch.tensor([1, 2, 5], dtype=torch.float32)
# MSELoss (mean squared error) loss function
m1 = MSELoss(reduction='mean')  # 'mean' is the default
result = m1(inputs, targets)
print(result) # tensor(1.3333)
m2 = MSELoss(reduction='sum')
result = m2(inputs, targets)
print(result) # tensor(4.)
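Since this section is about loss functions and backpropagation, it is worth noting that calling `backward()` on the computed loss is what produces the gradients an optimizer later uses. A minimal sketch with the same tensors (`requires_grad=True` is added so the input participates in autograd):

```python
import torch
from torch.nn import MSELoss

# requires_grad=True lets autograd track operations on this tensor
inputs = torch.tensor([1, 2, 3], dtype=torch.float32, requires_grad=True)
targets = torch.tensor([1, 2, 5], dtype=torch.float32)

loss = MSELoss(reduction='mean')(inputs, targets)
loss.backward()  # fills inputs.grad with d(loss)/d(inputs)

# For mean MSE over n=3 elements, the gradient is 2*(inputs - targets)/3,
# so only the third element is nonzero: 2*(3-5)/3 = -4/3
print(inputs.grad)
```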
1.3 Cross-Entropy Loss (CrossEntropyLoss)
The softmax function, also called the normalized exponential function, generalizes the sigmoid function from binary classification to multi-class tasks; in multi-class networks, Softmax is commonly used as the final layer for classification.
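Concretely, softmax maps a vector of raw scores to a probability distribution via softmax(x)_i = exp(x_i) / Σ_j exp(x_j), so every output is positive and they sum to 1. A small sketch comparing the manual formula against `torch.softmax`:

```python
import torch

x = torch.tensor([-0.5, -0.3, 0.0, 0.3, 0.5])

# Manual softmax: exponentiate, then normalize by the sum of exponentials
manual = torch.exp(x) / torch.exp(x).sum()
builtin = torch.softmax(x, dim=0)  # PyTorch's built-in softmax

print(manual)
print(builtin)        # matches the manual computation
print(manual.sum())   # tensor(1.) -- outputs form a probability distribution
```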
import torch
import torch.nn as nn
input1 = torch.tensor([-0.5, -0.3, 0, 0.3, 0.5])
input2 = torch.tensor([-3,