Where can I have a look at TensorFlow gradient descent main loop?

Submitted on 2019-12-02 10:48:19

Question


(Sorry if this sounds a bit naive.) I want to have a look at the meat of the TensorFlow implementation of gradient descent, and see for myself how it handles the termination condition, step-size adaptiveness, and so on. I traced the code down to `training_ops.apply_gradient_descent`, but I can't find the actual implementation.


Answer 1:


The TensorFlow `Optimizer` interface (which `GradientDescentOptimizer` implements) defines a single step of minimization. Termination conditions and step-size adjustment are implemented by the user. In the MNIST for Beginners tutorial, the termination condition is "stop after 1000 steps," which you can see in the `for i in range(1000)` loop.
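This division of labor can be sketched in plain Python (this is not TensorFlow code; the function names here are hypothetical): the optimizer supplies one update step, while termination and step size live in the user's outer loop, just like the tutorial's `for i in range(1000)`.

```python
def gradient_step(x, grad, learning_rate):
    """One minimization step, analogous to what the Optimizer interface defines."""
    return x - learning_rate * grad

def minimize(f_grad, x0, learning_rate=0.1, max_steps=1000, tol=1e-8):
    """Outer training loop: everything here is the user's responsibility."""
    x = x0
    for step in range(max_steps):      # user-chosen termination: step budget
        g = f_grad(x)
        if abs(g) < tol:               # user-chosen termination: small gradient
            break
        x = gradient_step(x, g, learning_rate)
    return x

# Minimize f(x) = (x - 3)^2, whose gradient is 2 * (x - 3).
x_min = minimize(lambda x: 2.0 * (x - 3.0), x0=0.0)
```

Swapping in a schedule for `learning_rate`, or a loss-based stopping rule, changes only this outer loop; the per-step update stays the same.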

`apply_gradient_descent(a, b, c)` is a fused op that multiplies `c` by `b` and subtracts the result from `a`. There are some extra levels of indirection to get from the Python wrapper to the C++ implementation, detailed in the "Adding a New Op" how-to, but as a shortcut you can usually find the C++ implementation by converting the snake_case name to CamelCase and searching for that: `ApplyGradientDescent` in this case. That leads to the implementation in `tensorflow/core/kernels/training_ops.cc`.
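For clarity, here is a hedged plain-Python sketch of the update the fused op computes (`var -= alpha * delta`); the real kernel in `tensorflow/core/kernels/training_ops.cc` does this element-wise on tensors, in C++:

```python
def apply_gradient_descent(var, alpha, delta):
    """Fused update: var <- var - alpha * delta, returned as a new list.
    Illustrative only -- the actual op mutates the variable in place."""
    return [v - alpha * d for v, d in zip(var, delta)]

weights = [1.0, 2.0, 3.0]
grads = [0.5, -0.5, 1.0]
updated = apply_gradient_descent(weights, alpha=0.1, delta=grads)
# updated is approximately [0.95, 2.05, 2.9]
```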



Source: https://stackoverflow.com/questions/35724469/where-can-i-have-a-look-at-tensorflow-gradient-descent-main-loop
