Optimization algorithm (dog-leg trust-region) in Matlab and Python

Submitted by China☆狼群 on 2019-12-06 11:46:11

Question


I'm trying to solve a set of nonlinear equations using the dog-leg trust-region algorithm in Matlab and Python.

In Matlab there is fsolve, where this algorithm is the default, whereas in Python we specify method='dogleg' in scipy.optimize.minimize. I don't need to specify a Jacobian or Hessian for Matlab, whereas Python needs at least one of them to solve the problem.

I don't have the Jacobian/Hessian, so is there a way around this issue in Python? Or is there another Python function that performs the equivalent of Matlab's dog-leg fsolve?
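For concreteness, here is a minimal sketch of this kind of setup (the two-equation system below is made up for illustration). Since scipy.optimize.minimize expects a scalar objective, the system F(x) = 0 is recast as the least-squares problem of minimizing ||F(x)||^2:

import numpy as np

# Made-up example system F(x) = 0:
#   x0**2 + x1 - 3 = 0
#   x0 + x1**2 - 5 = 0
def residuals(x):
    return np.array([x[0]**2 + x[1] - 3.0,
                     x[0] + x[1]**2 - 5.0])

# minimize expects a scalar objective, so minimize ||F(x)||**2 instead.
def fun(x):
    r = residuals(x)
    return r @ r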


Answer 1:


In newer versions of scipy there is the approx_fprime function. It computes a numerical approximation of the Jacobian of a function f at position xk using forward-step finite differences, and returns an ndarray with the partial derivatives of f at xk.
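A minimal usage sketch (the quadratic f below is just a placeholder objective):

import numpy as np
from scipy.optimize import approx_fprime

f = lambda x: x[0]**2 + 3.0 * x[1]**2  # placeholder scalar objective

xk = np.array([1.0, 2.0])
eps = np.sqrt(np.finfo(float).eps)  # standard forward-difference step

print(approx_fprime(xk, f, eps))  # approximately [2., 12.]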

If you can't upgrade your version of scipy, you can always copy the implementation from scipy's source.


Edit:

scipy.optimize.minimize calls approx_fprime internally when it is passed jac=False, so in your case it should be enough to do the following:

scipy.optimize.minimize(fun, x0, args, method='dogleg', jac=False)

Edit:

It turns out scipy does not handle the jac=False case properly for this method, so it is necessary to build a callable jac using approx_fprime yourself, as follows:

epsilon = 1.49e-08  # finite-difference step, ~sqrt of machine epsilon
jac = lambda x, *args: scipy.optimize.approx_fprime(x, fun, epsilon, *args)
scipy.optimize.minimize(fun, x0, args, method='dogleg', jac=jac)
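Putting it all together, here is a self-contained sketch that reuses the made-up system from the question above. One caveat from the scipy documentation: the 'dogleg' method requires a Hessian as well as a gradient, and the Hessian must be positive definite, so this sketch also builds a finite-difference Hessian and starts from a point near a root. The system, step sizes, and starting point are illustrative assumptions, not part of the original answer:

import numpy as np
import scipy.optimize

# Made-up example system F(x) = 0 (same as the sketch in the question).
def residuals(x):
    return np.array([x[0]**2 + x[1] - 3.0,
                     x[0] + x[1]**2 - 5.0])

def fun(x):
    r = residuals(x)
    return r @ r  # scalar objective ||F(x)||**2

eps = np.sqrt(np.finfo(float).eps)        # step for first derivatives
hess_step = np.finfo(float).eps ** (1/3)  # larger step for second differences

# Gradient via forward differences.
def jac(x):
    return scipy.optimize.approx_fprime(x, fun, eps)

# Hessian as the finite-difference Jacobian of the gradient,
# symmetrized so the trust-region solver sees a symmetric matrix.
def hess(x):
    H = np.array([scipy.optimize.approx_fprime(x, lambda y, i=i: jac(y)[i], hess_step)
                  for i in range(x.size)])
    return 0.5 * (H + H.T)

x0 = np.array([1.2, 1.8])  # near the root (1, 2), where the Hessian is positive definite
res = scipy.optimize.minimize(fun, x0, method='dogleg', jac=jac, hess=hess)
print(res.x)  # should be close to [1., 2.]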


Source: https://stackoverflow.com/questions/40975133/optimization-algorithm-dog-leg-trust-region-in-matlab-and-python
