Why does my scipy.optimize.minimize fail?

Submitted by 只愿长相守 on 2019-12-11 12:48:32

Question


I am trying to use fmin_bfgs to find the local minimum of the absolute-value function abs(x). The initial point is set to 100.0; the expected answer is 0.0. However, I get:

In [184]: op.fmin_bfgs(lambda x:np.abs(x),100.0)
Warning: Desired error not necessarily achieved due to precision loss.
         Current function value: 100.000000
         Iterations: 0
         Function evaluations: 64
         Gradient evaluations: 20
Out[184]: array([100.0])

Why?


Answer 1:


Methods like fmin_bfgs and fmin_slsqp require smooth functions (with a continuous derivative) in order to give reliable results. abs(x) has a discontinuous derivative at its minimum: the slope jumps from -1 to +1 at x = 0, so the line search near the minimum cannot satisfy the curvature conditions BFGS relies on, which is what triggers the "precision loss" warning. A derivative-free method like the Nelder-Mead simplex, which doesn't require continuous derivatives, might provide better results in this case.
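A minimal sketch of the suggested fix, using scipy.optimize.minimize with method="Nelder-Mead" on the same problem (same objective and starting point as in the question):

```python
import numpy as np
from scipy import optimize as op

# Nelder-Mead is a derivative-free simplex method, so the kink
# in abs(x) at x = 0 does not break it the way it breaks BFGS.
res = op.minimize(lambda x: np.abs(x), x0=100.0, method="Nelder-Mead")

print(res.x)       # close to 0.0
print(res.success) # True
```

Note that Nelder-Mead only needs function values, so it converges to the vicinity of x = 0 despite the non-smooth derivative there; the trade-off is that it is generally slower than gradient-based methods on smooth problems.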



Source: https://stackoverflow.com/questions/28002091/why-my-scipy-optimize-minimize-fails
