Manually update momentum terms in PyTorch optimizers

Asked by 清酒与你 on 2021-02-11 07:59

The Adam optimizer maintains several per-parameter terms that add "momentum" to gradient descent and make the step size for each variable adaptive:
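For reference, the standard Adam update (Kingma & Ba, 2015) is shown below; the first- and second-moment estimates $m_t$ and $v_t$ are the "momentum" terms that PyTorch's Adam implementation keeps per parameter as `exp_avg` and `exp_avg_sq`:

$$
m_t = \beta_1 m_{t-1} + (1-\beta_1)\, g_t, \qquad
v_t = \beta_2 v_{t-1} + (1-\beta_2)\, g_t^2
$$

$$
\hat{m}_t = \frac{m_t}{1-\beta_1^t}, \qquad
\hat{v}_t = \frac{v_t}{1-\beta_2^t}, \qquad
\theta_t = \theta_{t-1} - \frac{\alpha\, \hat{m}_t}{\sqrt{\hat{v}_t} + \epsilon}
$$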

Specifically, …
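Based on the title, here is a minimal sketch of how the momentum buffers can be read and overwritten by hand through `optimizer.state`. The toy model, the single warm-up step, and the choice to zero the buffers are assumptions for illustration only, not part of the original question:

```python
import torch

# A minimal sketch: inspect and manually overwrite Adam's momentum buffers.
# PyTorch's Adam stores per-parameter state under optimizer.state[p] with the
# keys 'step', 'exp_avg' (first moment) and 'exp_avg_sq' (second moment).

model = torch.nn.Linear(4, 2)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# The state buffers are created lazily, so run one step first.
loss = model(torch.randn(8, 4)).sum()
loss.backward()
optimizer.step()

with torch.no_grad():
    for p in model.parameters():
        state = optimizer.state[p]     # dict with 'step', 'exp_avg', 'exp_avg_sq'
        state["exp_avg"].zero_()       # reset the first-moment (momentum) estimate
        state["exp_avg_sq"].zero_()    # reset the second-moment estimate
```

Whether you zero, rescale, or copy the buffers between optimizers, the `optimizer.state[p]` dictionary is the entry point; for `torch.optim.SGD` with momentum the analogous key is `momentum_buffer`.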
