Orthogonal regression fitting in scipy least squares method

Submitted by 时光总嘲笑我的痴心妄想 on 2020-01-01 04:18:11

Question


The leastsq method in the scipy library fits a curve to some data. This method assumes that the Y values in the data depend on some X argument, and it minimizes the vertical distance between the curve and each data point (dy).

But what if I need to minimize the distance along both axes (dy and dx)?

Is there some way to implement this calculation?

Here is a code sample using the one-axis calculation:

import numpy as np
from scipy.optimize import leastsq

xData = [some data...]
yData = [some data...]

def mFunc(p, x, y):
    return y - (p[0]*x**p[1])  # takes only the y axis into account

plsq, pcov = leastsq(mFunc, [1, 1], args=(xData, yData))
print(plsq)

I recently tried the scipy.odr library, and it returns the proper results only for linear functions. For other functions, like y = a*x^b, it returns wrong results. This is how I use it:

from scipy.odr import Model, Data, ODR

def f(p, x):
    return p[0]*x**p[1]

myModel = Model(f)
myData = Data(xData, yData)
myOdr = ODR(myData, myModel, beta0=[1, 1])
myOdr.set_job(fit_type=0)  # with fit_type=2 it returns the same as leastsq
out = myOdr.run()
out.pprint()

This returns wrong, undesired results, and for some input data they are not even close to the real values. Maybe there is some special way of using it; what am I doing wrong?


Answer 1:


scipy.odr implements Orthogonal Distance Regression. See the instructions for basic use in the docstring:

https://github.com/scipy/scipy/blob/master/scipy/odr/odrpack.py#L27
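
For reference, here is a minimal sketch of basic scipy.odr usage in the style of that docstring, with made-up data and a simple linear model (the data values and names here are illustrative, not from the question):

import numpy as np
from scipy.odr import Model, Data, ODR

# Made-up data roughly following y = 2x + 1.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.8, 5.2, 7.1, 8.9])

def linear(beta, x):
    # scipy.odr models take the parameters first: f(beta, x).
    return beta[0] * x + beta[1]

odr = ODR(Data(x, y), Model(linear), beta0=[1.0, 0.0])
output = odr.run()
output.pprint()  # prints fitted beta, standard errors, sum of squares, etc.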




Answer 2:


I've found the solution. Scipy Odrpack works normally, but it needs a good initial guess for correct results, so I divided the process into two steps.

First step: find the initial guess using the ordinary least squares method.

Second step: substitute this initial guess into ODR as the beta0 parameter.

And it works very well with acceptable speed.
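
For concreteness, here is a minimal runnable sketch of this two-step approach for the power-law model y = a*x**b from the question; the data values are made up for illustration:

import numpy as np
from scipy.optimize import leastsq
from scipy.odr import Model, Data, ODR

# Made-up example data roughly following y = 1.0 * x**2.
xData = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
yData = np.array([1.1, 3.9, 9.2, 15.8, 25.3])

def residuals(p, x, y):
    # Vertical residuals only, for the rough initial fit.
    return y - p[0] * x**p[1]

# Step 1: ordinary least squares gives a starting point.
beta_init, _ = leastsq(residuals, [1.0, 1.0], args=(xData, yData))

def f(p, x):
    # Model in the f(beta, x) form that scipy.odr expects.
    return p[0] * x**p[1]

# Step 2: orthogonal distance regression seeded with that guess.
myOdr = ODR(Data(xData, yData), Model(f), beta0=beta_init)
myOdr.set_job(fit_type=0)  # 0 = explicit ODR
out = myOdr.run()
print(out.beta)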

Thank you guys, your advice directed me to the right solution.




Answer 3:


If/when you are able to invert the function described by p, you can include the horizontal residual x - pinverted(y) in mFunc as well, combining the two distances as sqrt(a^2 + b^2). In pseudocode:

return sqrt((y - p[0]*x**p[1])**2 + (x - pinverted(y))**2)

For example, for

y = k*x + m,  p = [m, k]
pinv = [-m/k, 1/k]

return sqrt((y - (p[0] + x*p[1]))**2 + (x - (pinv[0] + y*pinv[1]))**2)
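
A minimal runnable version of this idea for the linear case, with made-up data (the residual combines dy and dx exactly as in the pseudocode above):

import numpy as np
from scipy.optimize import leastsq

# Made-up noisy data around y = 2x + 1.
xData = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
yData = np.array([1.2, 2.9, 5.1, 6.8, 9.1])

def residuals(p, x, y):
    # p = [m, k] for the line y = k*x + m; its inverse is x = (y - m) / k.
    dy = y - (p[0] + x * p[1])
    dx = x - (y - p[0]) / p[1]
    return np.sqrt(dy**2 + dx**2)

p_fit, _ = leastsq(residuals, [0.0, 1.0], args=(xData, yData))
print(p_fit)  # roughly [1.0, 2.0], i.e. m ~ 1, k ~ 2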

But what you are asking for is problematic in some cases. For example, if a polynomial (or your x^b) curve has a minimum value ym and you have a point x, y lower than ym, what kind of value do you want pinverted to return? There is not always a solution.



Source: https://stackoverflow.com/questions/9376886/orthogonal-regression-fitting-in-scipy-least-squares-method
