Orthogonal regression fitting in scipy least squares method

离开以前 2021-02-19 16:21

The leastsq method in the scipy library fits a curve to some data, and it assumes that the Y values in the data depend on some X argument. It then minimizes the distance between the curve and the data points along the Y axis only. How can I fit the curve so that the distance is minimized along both axes, i.e. the orthogonal distance?
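
For reference, a minimal sketch of this vertical-distance fit with scipy.optimize.leastsq (the power-law model and the data are made up for illustration):

    import numpy as np
    from scipy.optimize import leastsq

    # Synthetic data for illustration: y = 2 * x**1.5 plus noise
    rng = np.random.default_rng(0)
    x = np.linspace(1.0, 10.0, 50)
    y = 2.0 * x**1.5 + rng.normal(scale=1.0, size=x.size)

    def residuals(p, x, y):
        # Only the vertical offset y - f(x) enters the fit
        a, b = p
        return y - a * x**b

    p_fit, ier = leastsq(residuals, [1.0, 1.0], args=(x, y))
    print(p_fit)  # fitted [a, b], minimizing distance along Y only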

3 Answers
  • 2021-02-19 16:46

    scipy.odr implements Orthogonal Distance Regression. See the docstring and the documentation for basic usage instructions.
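
    A minimal sketch of basic scipy.odr usage, assuming a power-law model y = a * x**b (both the model and the data here are only illustrative):

        import numpy as np
        from scipy import odr

        def power_law(beta, x):
            # scipy.odr model functions take the parameter vector first
            return beta[0] * x**beta[1]

        # Synthetic data, for illustration only
        rng = np.random.default_rng(1)
        x = np.linspace(1.0, 10.0, 50)
        y = 2.0 * x**1.5 + rng.normal(scale=1.0, size=x.size)

        model = odr.Model(power_law)
        data = odr.RealData(x, y)          # sx/sy error estimates can be passed too
        fit = odr.ODR(data, model, beta0=[1.0, 1.0])
        output = fit.run()
        print(output.beta)                 # fitted [a, b] from the orthogonal fit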

  • 2021-02-19 17:01

    I've found the solution. Scipy Odrpack works normally, but it needs a good initial guess for correct results, so I divided the process into two steps.

    First step: find the initial guess by using the ordinary least squares method.

    Second step: substitute this initial guess into ODR as the beta0 parameter (see the sketch below).

    And it works very well with an acceptable speed.

    Thank you guys, your advice directed me to the right solution.
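
    A minimal sketch of this two-step approach, again assuming a power-law model (the model, the data, and the parameter names are illustrative, not from the original post):

        import numpy as np
        from scipy import odr
        from scipy.optimize import curve_fit

        def power_law_xy(x, a, b):
            # curve_fit signature: f(x, *params)
            return a * x**b

        def power_law_beta(beta, x):
            # scipy.odr signature: f(beta, x)
            return beta[0] * x**beta[1]

        # Synthetic data, for illustration only
        rng = np.random.default_rng(2)
        x = np.linspace(1.0, 10.0, 50)
        y = 2.0 * x**1.5 + rng.normal(scale=1.0, size=x.size)

        # Step 1: ordinary least squares gives a rough initial guess
        p0, _ = curve_fit(power_law_xy, x, y, p0=[1.0, 1.0])

        # Step 2: feed that guess to ODR as beta0
        fit = odr.ODR(odr.RealData(x, y), odr.Model(power_law_beta), beta0=p0)
        output = fit.run()
        print(p0, output.beta)             # initial guess vs. orthogonal fit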

  • 2021-02-19 17:05

    If/when you are able to invert the function described by p, you can simply include x - pinverted(y) in mFunc, combining the two offsets as sqrt(a^2 + b^2), so (pseudo code)

    return sqrt((y - p[0]*x**p[1])**2 + (x - pinverted(y))**2)
    

    for example, for

    y = k*x + m        p = [m, k]
    pinv = [-m/k, 1/k]

    return sqrt((y - (p[0] + x*p[1]))**2 + (x - (pinv[0] + y*pinv[1]))**2)
    

    But what you ask for is problematic in some cases. For example, if a polynomial (or your x**j) curve has a minimum value ym at some x = xm and you have a point (x, y) with y lower than ym, what kind of value do you want to return? There is not always a solution.
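
    For the linear case, where the inverse always exists, a minimal runnable sketch of this combined-distance residual with scipy.optimize.leastsq (the data are synthetic and the parameter names are illustrative):

        import numpy as np
        from scipy.optimize import leastsq

        # Synthetic linear data, for illustration only
        rng = np.random.default_rng(3)
        x = np.linspace(0.0, 10.0, 50)
        y = 3.0 * x + 1.0 + rng.normal(scale=0.5, size=x.size)

        def combined_residuals(p, x, y):
            m, k = p                       # y = k*x + m, so x = (y - m) / k
            dy = y - (m + k * x)           # vertical offset
            dx = x - (y - m) / k           # horizontal offset via the inverse
            return np.sqrt(dy**2 + dx**2)  # combined distance per point

        p_fit, ier = leastsq(combined_residuals, [0.0, 1.0], args=(x, y))
        print(p_fit)                       # fitted [m, k]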
