I am using the StepLR scheduler with the Adam optimizer:
optimizer = torch.optim.Adam(model.parameters(), lr=LrMax, weight_decay=WeightDecay)  # weight_decay value truncated in the original; WeightDecay is a placeholder
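For context, here is a minimal runnable sketch of this pairing: `StepLR` wraps the `Adam` optimizer and multiplies its learning rate by `gamma` every `step_size` epochs. The model, `LrMax`, and the `step_size`/`gamma`/`weight_decay` values below are illustrative assumptions, not values from the original post.

```python
import torch

# Illustrative assumptions: a toy model and hyperparameters.
model = torch.nn.Linear(10, 1)
LrMax = 1e-3

optimizer = torch.optim.Adam(model.parameters(), lr=LrMax, weight_decay=1e-4)
# StepLR multiplies the learning rate by gamma once every step_size epochs.
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=10, gamma=0.1)

for epoch in range(30):
    # ... forward pass, loss.backward() would go here in a real loop ...
    optimizer.step()   # step the optimizer first (params with no grads are skipped)
    scheduler.step()   # then advance the scheduler at the epoch boundary

# After 30 epochs with step_size=10, the lr has been decayed 3 times:
print(optimizer.param_groups[0]["lr"])  # approximately 1e-3 * 0.1**3 = 1e-6
```

Note the ordering: since PyTorch 1.1, `scheduler.step()` should be called after `optimizer.step()`, otherwise the first decay milestone is skipped and a warning is emitted.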