Write to .data instead? [RuntimeError: a leaf Variable that requires grad has been used in an in-place operation.]
As the title says: when running the following code,
from torch import optim

my_adam = optim.Adam([z], lr=self.lr_shape)  # z is a leaf tensor with requires_grad=True

# ... and this comes in the training loop:
loss.backward()
my_adam.step()
z[some_indices] = z[some_indices] + my_noise_tensor  # in-place write to the leaf z
the following error is raised:
RuntimeError: a leaf Variable that requires grad has been used in an in-place operation.
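The error is not specific to Adam or to this update pattern; any in-place write to a leaf tensor that requires grad raises it, as this stripped-down repro (all names illustrative) shows:

import torch

z = torch.randn(5, requires_grad=True)  # a leaf: created by the user, so it has no grad_fn
z[0] = 1.0  # RuntimeError: a leaf Variable that requires grad is being used in an in-place operation.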
Why it happens: the indexed assignment is itself an autograd-tracked operation, and tracking it would give z a grad_fn; a leaf tensor by definition has none, so autograd refuses the in-place write outright. The fix is to perform the write where autograd cannot see it, either through .data:
z.data[some_indices] = z[some_indices] + my_noise_tensor  # write through .data, invisible to autograd
or inside a torch.no_grad() block:
with torch.no_grad():
    z[some_indices] += my_noise_tensor
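For context, here is a minimal, self-contained sketch of the whole pattern with both workarounds applied; the tensor shapes, the learning rate, and the names some_indices / my_noise_tensor are illustrative stand-ins, not the original values:

import torch
from torch import optim

z = torch.randn(10, requires_grad=True)   # leaf tensor being optimized
my_adam = optim.Adam([z], lr=1e-2)        # illustrative lr
some_indices = torch.tensor([0, 3, 7])
my_noise_tensor = 0.01 * torch.randn(3)

loss = (z ** 2).sum()                     # stand-in loss
loss.backward()
my_adam.step()

# z[some_indices] = z[some_indices] + my_noise_tensor   # -> RuntimeError

# Workaround 1: write through .data, which autograd does not track
z.data[some_indices] = z[some_indices] + my_noise_tensor

# Workaround 2 (preferred): suspend gradient tracking for the write
with torch.no_grad():
    z[some_indices] += my_noise_tensor

Of the two, torch.no_grad() is generally the safer choice: the PyTorch docs discourage .data in new code because writes through it bypass autograd's correctness checks (such as version counting) and can silently corrupt gradients.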