Is Gradient Descent always used during backpropagation for updating weights?

Asked by 滥情空心 on 2021-02-19 21:01

Gradient Descent, RMSprop, and Adam are all optimizers. Assume I have chosen the Adam or RMSprop optimizer when compiling the model, i.e. model.compile(optimizer="adam").

My doubt is: in that case, is gradient descent still used during backpropagation to update the weights, or does the chosen optimizer (Adam/RMSprop) take over the weight updates?
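
For illustration, here is a minimal sketch (assuming TensorFlow/Keras; the toy model, data, and learning rate are made up for this example) showing the two ways the optimizer can be picked at compile time. In both cases backpropagation computes the gradients; the optimizer only decides how those gradients are turned into weight updates.

```python
# Minimal sketch, assuming TensorFlow/Keras. Model shape, data, and
# hyperparameters are arbitrary and only here for illustration.
import numpy as np
import tensorflow as tf

# Toy regression data.
x = np.random.rand(32, 4).astype("float32")
y = np.random.rand(32, 1).astype("float32")

model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(1),
])

# Option 1: plain (stochastic) gradient descent.
# Weights are updated as: w <- w - learning_rate * gradient
model.compile(optimizer=tf.keras.optimizers.SGD(learning_rate=0.01), loss="mse")

# Option 2: Adam. Backpropagation still computes the same gradients,
# but the update rule applied to the weights is Adam's, not vanilla SGD's.
# model.compile(optimizer="adam", loss="mse")

model.fit(x, y, epochs=1, verbose=0)
```

In other words, switching optimizer="adam" does not change how gradients are computed during backpropagation, only the update rule that consumes them.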
