Method for L1 regularization

regularization_loss = 0
for param in net.parameters():
    regularization_loss += torch.sum(torch.abs(param))
loss = loss + 0.5 * regularization_loss
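The loop above can be sketched as a self-contained, runnable example. The tiny `nn.Linear` network, the random data, and the coefficient 0.5 here are placeholders chosen for demonstration, not part of the original text:

```python
import torch
import torch.nn as nn

# Hypothetical tiny network and data, for demonstration only.
net = nn.Linear(4, 2)
criterion = nn.MSELoss()
x = torch.randn(8, 4)
y = torch.randn(8, 2)

base_loss = criterion(net(x), y)

# L1 penalty: sum of absolute values of every parameter tensor.
l1_penalty = sum(torch.sum(torch.abs(p)) for p in net.parameters())

lambda_l1 = 0.5  # regularization strength (assumed value)
loss = base_loss + lambda_l1 * l1_penalty
loss.backward()  # gradients now include the L1 term
```

Because the penalty is built from the parameters with autograd operations, calling `backward()` on the combined loss propagates the regularization gradient into every parameter automatically.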
Method for L2 regularization

regularization_loss = 0
for param in net.parameters():
    regularization_loss += torch.sum(param * param)
loss = loss + 0.5 * regularization_loss
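A runnable sketch of the L2 variant follows. Note that `abs()` around the square is unnecessary, since squares are already non-negative; the network, data, and coefficient 0.5 are again illustrative assumptions:

```python
import torch
import torch.nn as nn

# Hypothetical tiny network and data, for demonstration only.
net = nn.Linear(4, 2)
criterion = nn.MSELoss()
x = torch.randn(8, 4)
y = torch.randn(8, 2)

base_loss = criterion(net(x), y)

# L2 penalty: sum of squared parameter values.
l2_penalty = sum(torch.sum(p * p) for p in net.parameters())

lambda_l2 = 0.5  # regularization strength (assumed value)
loss = base_loss + lambda_l2 * l2_penalty
loss.backward()
```

In practice, L2 regularization is more commonly applied through the optimizer's built-in `weight_decay` argument, e.g. `torch.optim.SGD(net.parameters(), lr=0.01, weight_decay=0.5)`, which adds the same squared-norm penalty without a manual loop.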
This article shows how to implement L1 and L2 regularization manually in PyTorch: iterate over the network's parameters, accumulate the sum of absolute values (L1) or the sum of squares (L2) into a regularization loss, and add that term to the total loss to help prevent overfitting.