Backward propagation in Keras?

Submitted by こ雲淡風輕ζ on 2021-02-06 15:11:41

Question


Can anyone tell me how backpropagation is done in Keras? I read that it is really easy in Torch and complex in Caffe, but I can't find anything about doing it in Keras. I am implementing my own layers in Keras (as a beginner) and would like to know how to do the backward propagation.

Thank you in advance


Answer 1:


You simply don't. (Late edit: except when you are creating custom training loops, which is only for advanced uses.)

Keras does backpropagation automatically. There's absolutely nothing you need to do for that except for training the model with one of the fit methods.
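For the advanced case mentioned in the edit above, here is a minimal sketch of a custom training loop using `tf.GradientTape` (the model, data, and hyperparameters are illustrative placeholders). Even here you never write a backward pass by hand; the tape computes gradients automatically:

```python
import tensorflow as tf

# A minimal sketch of a custom training loop (advanced use only).
# Backpropagation is still automatic: tf.GradientTape records the
# forward pass and derives the gradients for you.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(3,)),
    tf.keras.layers.Dense(1),
])
optimizer = tf.keras.optimizers.SGD(learning_rate=0.01)
loss_fn = tf.keras.losses.MeanSquaredError()

x = tf.random.normal((8, 3))  # dummy batch (placeholder data)
y = tf.random.normal((8, 1))

with tf.GradientTape() as tape:
    predictions = model(x, training=True)
    loss = loss_fn(y, predictions)

# Gradients are computed automatically; no manual backward pass is written.
grads = tape.gradient(loss, model.trainable_variables)
optimizer.apply_gradients(zip(grads, model.trainable_variables))
```

For ordinary training, though, `model.fit()` does all of this internally.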

You just need to take care of a few things:

  • The variables you want to be updated by backpropagation (that is, the weights) must be defined in the custom layer with the self.add_weight() method inside the build method. See "Writing your own Keras layers" in the documentation.
  • All calculations you do must use basic operators such as +, -, *, / or backend functions. Functions from the backend (TensorFlow/Theano/CNTK) are also supported.

This is all you need to have the automatic backpropagation working properly.
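Putting the two points above together, a custom layer might look like the following sketch (the layer name and shapes are illustrative; `tf.matmul` is used here as one example of a differentiable operation):

```python
import tensorflow as tf

# A minimal sketch of a custom layer: weights are declared with
# self.add_weight() inside build(), and call() uses only
# differentiable operations, so Keras can backpropagate automatically.
class SimpleDense(tf.keras.layers.Layer):
    def __init__(self, units, **kwargs):
        super().__init__(**kwargs)
        self.units = units

    def build(self, input_shape):
        # Trainable weights must be created here so backprop updates them.
        self.kernel = self.add_weight(
            name="kernel",
            shape=(input_shape[-1], self.units),
            initializer="glorot_uniform",
            trainable=True,
        )
        super().build(input_shape)

    def call(self, inputs):
        # Basic differentiable ops only: gradients flow through automatically.
        return tf.matmul(inputs, self.kernel)
```

Nothing backward-related appears anywhere; defining the forward computation is enough.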

If your layers don't have trainable weights, you don't need custom layers at all: create Lambda layers instead (calculations only, no trainable weights).
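As a sketch of that alternative, a stateless computation (here, an elementwise square, chosen purely for illustration) can be wrapped in a Lambda layer:

```python
import tensorflow as tf

# A stateless computation wrapped in a Lambda layer: no trainable
# weights, so no custom layer class is needed.
square = tf.keras.layers.Lambda(lambda x: x ** 2)

# It can be used like any other layer, e.g. inside a Sequential model.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(2,)),
    square,
])
```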



Source: https://stackoverflow.com/questions/47416861/backward-propagation-in-keras
