【Method 1: ReduceLROnPlateau】
Adjusts the learning rate when the monitored metric has stopped improving over the most recent epochs.
optimizer = torch.optim.SGD(net.parameters(), lr=learning_rate, weight_decay=0.01)
# Reduce the LR by a factor of 0.1 if the monitored metric does not improve for 10 epochs (the defaults)
scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(optimizer, mode='min', factor=0.1, patience=10)
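Unlike most schedulers, ReduceLROnPlateau needs the monitored metric passed to step() each epoch. Below is a minimal sketch of how it is typically wired into a training loop; the names net, criterion, train_loader, val_loader, num_epochs, and the compute_validation_loss helper are assumptions for illustration, not part of the original code.

for epoch in range(num_epochs):
    net.train()
    for inputs, targets in train_loader:
        optimizer.zero_grad()
        loss = criterion(net(inputs), targets)
        loss.backward()
        optimizer.step()

    # Hypothetical helper that evaluates the model on the validation set
    val_loss = compute_validation_loss(net, val_loader)
    # Pass the monitored metric; after `patience` epochs without improvement the LR is multiplied by `factor`
    scheduler.step(val_loss)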