Do Back Propagation and Gradient Descent use the same logic?

Backend · Unresolved · 0 replies · 713 views
慢半拍i · 2021-02-15 13:10

Back Propagation is used in a CNN to update the randomly allotted weights, biases and filters. To update these values, we find the gradient of the loss using the chain rule, working from the end of the network back to the start, and use it to update the parameters.
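To make the two steps concrete, here is a minimal sketch (not from the question itself, and assuming a tiny one-weight linear model with squared-error loss and the illustrative names `w`, `b`, `lr`): back propagation applies the chain rule to compute the gradients, and gradient descent then uses those gradients to update the parameters.

```python
import numpy as np

# Tiny model: y_hat = w * x + b, with squared-error loss.
np.random.seed(0)
w, b = np.random.randn(), np.random.randn()  # randomly allotted weight and bias
x, y = 2.0, 5.0                              # single training example (assumed values)
lr = 0.1                                     # learning rate for gradient descent

for step in range(50):
    # Forward pass
    y_hat = w * x + b
    loss = 0.5 * (y_hat - y) ** 2

    # Back propagation: chain rule from the loss back to the parameters
    dloss_dyhat = y_hat - y        # dL/dy_hat
    dloss_dw = dloss_dyhat * x     # dL/dw = dL/dy_hat * dy_hat/dw
    dloss_db = dloss_dyhat * 1.0   # dL/db = dL/dy_hat * dy_hat/db

    # Gradient descent: use those gradients to update the parameters
    w -= lr * dloss_dw
    b -= lr * dloss_db

print(f"w={w:.3f}, b={b:.3f}, loss={loss:.6f}")
```

In this sketch the chain-rule lines are the back-propagation part, while the two `-= lr * ...` updates are the gradient-descent part; the same split applies to the weights, biases and filters of a CNN, just with many more parameters.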
