Python - scipy ODR going crazy
Question: I would like to use scipy's ODR to fit a curve to a set of variables with variances. In this case I am fitting a linear function with a fixed y-axis intercept (e.g. a*x + 100). Because I could not find an estimator (I asked about that here), I use scipy.optimize's curve_fit to estimate the initial value of a. The fit works perfectly without the standard deviations, but as soon as I add them, the output makes no sense at all (the fitted curve lies far above all of the points). What could be the cause?
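For reference, here is a minimal sketch of the setup described above (with made-up data and uncertainties, since the actual code is not included in the question). It fixes the intercept at 100, uses curve_fit only to get an initial slope, and passes the per-point standard deviations to odr.RealData via sx/sy:

```python
import numpy as np
from scipy import odr
from scipy.optimize import curve_fit

# Hypothetical data and per-point standard deviations (assumed, not from the question)
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 20)
y = 3.0 * x + 100.0 + rng.normal(0.0, 2.0, x.size)
sx = np.full(x.size, 0.1)   # standard deviation of each x value
sy = np.full(x.size, 2.0)   # standard deviation of each y value

# Linear model with the intercept fixed at 100, as in the question
def f_curvefit(x, a):
    return a * x + 100.0

def f_odr(beta, x):
    return beta[0] * x + 100.0

# Initial slope estimate from curve_fit (uses only the y uncertainties)
a0, _ = curve_fit(f_curvefit, x, y, sigma=sy, absolute_sigma=True)

# ODR fit; note that RealData expects standard deviations (sx, sy),
# whereas odr.Data takes weights (inverse variances) instead
model = odr.Model(f_odr)
data = odr.RealData(x, y, sx=sx, sy=sy)
result = odr.ODR(data, model, beta0=a0).run()
print("fitted slope:", result.beta[0])
```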