Question: How do I use a learning rate scheduler with the following optimizer?

optimizer = torch.optim.Adam(optim_params, betas=(args.momentum, args.beta), weight_decay=args.weight_decay)

I have written the following scheduler:

scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=100, gamma=0.9)

I am not clear whether I should step the scheduler or the optimizer first. In which order should I call the following?

optimizer.zero_grad()
scheduler.step()
optimizer.step()

Answer 1: Since 1.3 the behaviour was changed: scheduler.step() should be called after optimizer.step(). If you call them in the other order, PyTorch emits a warning and the first value of the learning rate schedule is skipped.
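A minimal training-loop sketch showing the recommended order. The model, loss, and dummy data below are placeholders, not from the question, and the Adam hyperparameters stand in for the args values:

import torch
import torch.nn as nn

model = nn.Linear(10, 1)  # placeholder model
optimizer = torch.optim.Adam(model.parameters(), betas=(0.9, 0.999), weight_decay=1e-4)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=100, gamma=0.9)

criterion = nn.MSELoss()
inputs = torch.randn(32, 10)   # dummy batch
targets = torch.randn(32, 1)

for epoch in range(300):
    optimizer.zero_grad()                       # 1. clear old gradients
    loss = criterion(model(inputs), targets)
    loss.backward()                             # 2. compute gradients
    optimizer.step()                            # 3. update parameters
    scheduler.step()                            # 4. then advance the learning rate

With step_size=100 and gamma=0.9, the scheduler multiplies the learning rate by 0.9 every 100 scheduler.step() calls; here scheduler.step() is called once per epoch, so the decay happens every 100 epochs.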