learning-rate

Pytorch: looking for a function that lets me manually set learning rates for specific epoch intervals

瘦欲@ submitted on 2020-07-30 12:56:08
Question: For example, set lr = 0.01 for the first 100 epochs, lr = 0.001 from epoch 101 to epoch 1000, and lr = 0.0005 for epochs 1001-4000. In other words, my learning-rate plan is not an exponential decay with a fixed step size. I know this can be achieved with a self-defined function; I am just curious whether there are already-built functions to do that.

Answer 1: torch.optim.lr_scheduler.LambdaLR is what you are looking for. It returns a multiplier of the initial learning rate, so you can specify any value for any given epoch.
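A minimal sketch of that LambdaLR approach, assuming a base learning rate of 0.01 set on the optimizer; the model, optimizer choice, and epoch count below are placeholders, not part of the original question:

```python
import torch

# Placeholder model; the optimizer's base lr is 0.01, and LambdaLR scales it
# by whatever multiplier the lambda returns for the current epoch.
model = torch.nn.Linear(10, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

def lr_lambda(epoch):
    # epoch 0-99    -> 0.01 * 1.0  = 0.01
    # epoch 100-999 -> 0.01 * 0.1  = 0.001
    # epoch 1000+   -> 0.01 * 0.05 = 0.0005
    if epoch < 100:
        return 1.0
    elif epoch < 1000:
        return 0.1
    return 0.05

scheduler = torch.optim.lr_scheduler.LambdaLR(optimizer, lr_lambda=lr_lambda)

for epoch in range(4000):
    # ... forward pass, loss.backward(), optimizer.step() for this epoch ...
    scheduler.step()  # advances the epoch counter that lr_lambda receives
```

Because the three target values (0.01, 0.001, 0.0005) are not successive multiples of a single factor, a milestone scheduler with one fixed gamma could not reproduce them, which is why the per-epoch lambda is the more direct fit here.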

PyTorch: Learning rate scheduler

流过昼夜 submitted on 2020-05-13 05:12:50
Question: How do I use a learning rate scheduler with the following optimizer?

optimizer = torch.optim.Adam(optim_params, betas=(args.momentum, args.beta), weight_decay=args.weight_decay)

I have written the following scheduler:

scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=100, gamma=0.9)

It is not clear to me whether I should step the scheduler or the optimizer first. In which order should I perform the following?

optimizer.zero_grad()
scheduler.step()
optimizer.step()

Answer 1: Since 1.3 the behaviour was changed: you should call optimizer.step() before scheduler.step().
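A minimal sketch of that ordering as recommended in recent PyTorch (1.1 and later): step the optimizer first, then the scheduler. The model, data, and the concrete values standing in for args.momentum, args.beta, and args.weight_decay are placeholders:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Placeholder model and synthetic data; only the StepLR settings come from the question.
model = torch.nn.Linear(10, 1)
dataset = TensorDataset(torch.randn(64, 10), torch.randn(64, 1))
dataloader = DataLoader(dataset, batch_size=16)

optimizer = torch.optim.Adam(model.parameters(), betas=(0.9, 0.999), weight_decay=1e-4)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=100, gamma=0.9)

for epoch in range(300):
    for inputs, targets in dataloader:
        optimizer.zero_grad()                                    # clear stale gradients
        loss = torch.nn.functional.mse_loss(model(inputs), targets)
        loss.backward()                                          # compute gradients
        optimizer.step()                                         # update parameters first
    scheduler.step()  # then step the scheduler once per epoch; lr decays by 0.9 every 100 epochs
```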
