When loading pretrained weights, you may hit an error like the following:
    optimizer.load_state_dict(checkpoint['optimizer_state'])
  File "/opt/conda/lib/python3.8/site-packages/torch/optim/optimizer.py", line 145, in load_state_dict
    raise ValueError("loaded state dict contains a parameter group "
ValueError: loaded state dict contains a parameter group that doesn't match the size of optimizer's group
If you search the web for this error, most answers are vague hand-waving: "the model's parameters don't match the optimizer's parameters". After reading them you are usually none the wiser. When that happens, I find it far more productive to go straight to the source of torch/optim/optimizer.py than to sift through scattered, often flat-out wrong explanations online. For example, the optimizer.py shipped with the PyTorch version I am using starts like this:
class Optimizer:
    r"""Base class for all optimizers.

    .. warning::
        Parameters need to be specified as collections that have a deterministic
        ordering that is consistent between runs. Examples of objects that don't
        satisfy those properties are sets and iterators over values of dictionaries.

    Args:
        params (iterable): an iterable of :class:`torch.Tensor` s or
            :class:`dict` s. Specifies what Tensors should be optimized.
        defaults: (dict): a dict containing default values of optimization
            options (used when a parameter group doesn't specify them).
    """
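
The part of that file that actually raises this error is Optimizer.load_state_dict. In the PyTorch 1.x releases I have checked, its validation logic looks roughly like this (abridged to the relevant checks; your installed version may differ slightly):

    def load_state_dict(self, state_dict):
        # deepcopy, to be consistent with module API
        state_dict = deepcopy(state_dict)
        # Validate the state_dict
        groups = self.param_groups
        saved_groups = state_dict['param_groups']

        if len(groups) != len(saved_groups):
            raise ValueError("loaded state dict has a different number of "
                             "parameter groups")
        param_lens = (len(g['params']) for g in groups)
        saved_lens = (len(g['params']) for g in saved_groups)
        if any(p_len != s_len for p_len, s_len in zip(param_lens, saved_lens)):
            raise ValueError("loaded state dict contains a parameter group "
                             "that doesn't match the size of optimizer's group")
        ...

So there are only two ways to trip these checks: either the checkpoint and the freshly constructed optimizer have a different number of parameter groups, or a group exists in both but contains a different number of tensors. Both mean the optimizer was built over a different parameter list than the one whose state was saved.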

In short, the error means that the optimizer you just constructed and the optimizer whose state was saved in the checkpoint were built over different parameter lists: the model structure changed, or the number of parameters in a group no longer matches. The fix is to make them agree again, typically by rebuilding the model and optimizer exactly as they were when the checkpoint was saved, and only then calling load_state_dict.
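
To make the failure mode concrete, here is a minimal, hypothetical sketch (the model and variable names are invented for illustration) that triggers the exact message from the traceback above, followed by the fix:

import torch
import torch.nn as nn

model = nn.Linear(4, 2)

# The checkpoint was saved from an optimizer covering weight AND bias:
# one parameter group containing two tensors.
opt_saved = torch.optim.SGD(model.parameters(), lr=0.1)
checkpoint = {'optimizer_state': opt_saved.state_dict()}

# The new optimizer only covers the weight: still one group, but with a
# single tensor, so the group sizes no longer match.
opt_new = torch.optim.SGD([model.weight], lr=0.1)
try:
    opt_new.load_state_dict(checkpoint['optimizer_state'])
except ValueError as e:
    print(e)  # loaded state dict contains a parameter group that doesn't match ...

# Fix: rebuild the optimizer over exactly the same parameters (same
# groups, same number of tensors per group) as when the checkpoint was
# saved, then load the state.
opt_fixed = torch.optim.SGD(model.parameters(), lr=0.1)
opt_fixed.load_state_dict(checkpoint['optimizer_state'])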