Getting Started
In this article, I learned how to load data and a model. More importantly, I learned how to save a model:
mindspore.save_checkpoint(model, "model.ckpt")
For model training, there are several steps.
First, the forward function:
def forward_fn(data, label):
    logits = model(data)
    loss = loss_fn(logits, label)
    return loss, logits
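The shape of this forward pass can be illustrated with a hand-rolled analogue. Here is a minimal NumPy sketch; the linear model, the weights `W`, and the MSE loss are illustrative assumptions, not MindSpore code:

```python
import numpy as np

# Toy stand-ins for model and loss_fn: a linear layer and mean-squared error.
W = np.array([[2.0], [1.0]])           # assumed 2-in, 1-out weights

def model(data):
    return data @ W                    # logits = data . W

def loss_fn(logits, label):
    return float(np.mean((logits - label) ** 2))

def forward_fn(data, label):
    logits = model(data)
    loss = loss_fn(logits, label)
    return loss, logits                # loss first, extras after

data = np.array([[1.0, 0.0], [0.0, 1.0]])
label = np.array([[2.0], [1.0]])
loss, logits = forward_fn(data, label)
print(loss)                            # 0.0: the toy weights fit exactly
```

Returning the loss first matters: with `has_aux=True` below, only the first output is differentiated, and the rest (here, `logits`) is passed through unchanged.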
Next, a somewhat more complex function, grad_fn, is used to compute gradients:
grad_fn = mindspore.value_and_grad(forward_fn, None, optimizer.parameters, has_aux=True)
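mindspore.value_and_grad returns a new function that computes both the forward value and the gradients with respect to optimizer.parameters: passing None as the second argument means no gradient is taken with respect to the positional inputs, and has_aux=True means only the first output counts as the loss. A hand-rolled NumPy analogue for a toy linear model (the model, weights, and MSE loss are illustrative assumptions) shows the calling convention:

```python
import numpy as np

W = np.array([[0.0], [0.0]])           # toy trainable weights (assumption)

def forward_fn(data, label):
    logits = data @ W
    loss = float(np.mean((logits - label) ** 2))
    return loss, logits

# Hand-rolled analogue of
#   mindspore.value_and_grad(forward_fn, None, [W], has_aux=True):
# it returns ((loss, aux), grads), with one gradient per parameter.
def grad_fn(data, label):
    loss, logits = forward_fn(data, label)
    n = data.shape[0]
    dW = 2.0 / n * data.T @ (data @ W - label)   # analytic dLoss/dW for MSE
    return (loss, logits), (dW,)

data = np.array([[1.0, 0.0], [0.0, 1.0]])
label = np.array([[2.0], [1.0]])
(loss, _), grads = grad_fn(data, label)
print(loss)                            # 2.5 with W at zero
```

The real value_and_grad derives the gradients automatically; only the `((loss, aux), grads)` return shape is what train_step below relies on.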
forward_fn and the optimizer (e.g. SGD) are defined above.
Then we can define a train_step:
def train_step(data, label):
    (loss, _), grads = grad_fn(data, label)
    optimizer(grads)
    return loss
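Calling optimizer(grads) applies one update to the parameters. For plain SGD, the rule is p <- p - lr * g; a minimal sketch, with an assumed learning rate and toy parameter list:

```python
import numpy as np

lr = 0.1                               # illustrative learning rate
params = [np.array([[0.0], [0.0]])]    # toy parameter list

def optimizer(grads):
    # Plain SGD update: p <- p - lr * g, one gradient per parameter.
    for p, g in zip(params, grads):
        p -= lr * g

grads = (np.array([[-2.0], [-1.0]]),)
optimizer(grads)
print(params[0].ravel())               # [0.2 0.1]
```

Other optimizers (Momentum, Adam) keep extra state per parameter, but the call shape from train_step is the same.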
And the final train function:
def train(model, dataset):
    size = dataset.get_dataset_size()
    model.set_train()
    for batch, (data, label) in enumerate(dataset.create_tuple_iterator()):
        loss = train_step(data, label)
        if batch % 100 == 0:
            loss, current = loss.asnumpy(), batch
            print(f"loss:{loss:>7f} [{current:>3d}/{size:>3d}]")
This is the training loop. It is just a beginning.
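Putting the pieces together, the whole loop can be sketched end-to-end in NumPy on a toy linear-regression task. The dataset, model, batch size, and learning rate here are all illustrative assumptions; the structure mirrors forward_fn, grad_fn, train_step, and train above:

```python
import numpy as np

# Toy dataset: inputs drawn from a Gaussian, labels from known true weights.
rng = np.random.default_rng(0)
true_W = np.array([[2.0], [1.0]])
data = rng.normal(size=(256, 2))
label = data @ true_W

W = np.zeros((2, 1))                   # trainable weights, start at zero
lr = 0.1
batch_size = 32

for epoch in range(20):
    for i in range(0, len(data), batch_size):
        x, y = data[i:i + batch_size], label[i:i + batch_size]
        logits = x @ W                              # forward pass
        grad = 2.0 / len(x) * x.T @ (logits - y)    # dLoss/dW for MSE
        W -= lr * grad                              # SGD step

loss = float(np.mean((data @ W - label) ** 2))
print(f"loss:{loss:>7f}")              # prints a loss near zero
```

On this toy problem, W converges to true_W after a few epochs; the real MindSpore loop has the same shape, with autodiff and the optimizer replacing the hand-written gradient and update.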