How to compute standard error from ODR results?


I use scipy.odr to fit data with uncertainties on both x and y, following this question: Correct fitting with scipy curve_fit including errors in x? The fit runs fine, but output.sd_beta does not match np.sqrt(np.diag(output.cov_beta)), and I'm not sure how to get the standard errors of the fitted parameters from the ODR output.
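
For reference, a minimal sketch of this kind of fit (the straight-line model, the data values, and the uncertainties below are placeholders, not the original code):

    import numpy as np
    from scipy import odr

    # placeholder straight-line model: y = beta[0]*x + beta[1]
    def linear(beta, x):
        return beta[0] * x + beta[1]

    # placeholder data with uncertainties on both x and y
    x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
    y = np.array([0.1, 0.9, 2.2, 2.8, 4.1])
    sx = np.full_like(x, 0.1)  # standard deviations of x
    sy = np.full_like(y, 0.2)  # standard deviations of y

    model = odr.Model(linear)
    data = odr.RealData(x, y, sx=sx, sy=sy)
    output = odr.ODR(data, model, beta0=[1.0, 0.0]).run()

    print(output.sd_beta)                     # reported standard errors
    print(np.sqrt(np.diag(output.cov_beta)))  # not the same values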

1 Answer

    The reason for the discrepancy is that sd_beta is scaled by the residual variance, whereas cov_beta isn't.

    scipy.odr is an interface for the ODRPACK FORTRAN library, which is thinly wrapped in __odrpack.c. sd_beta and cov_beta are recovered by indexing into the work vector that's used internally by the FORTRAN routine. The indices of their first elements in work are variables named sd and vcv (see here).

    From the ODRPACK documentation (p.85):

    WORK(SDI) is the first element of a p × 1 array SD containing the standard deviations σ̂_βK of the function parameters β, i.e., the square roots of the diagonal entries of the covariance matrix, where

        WORK(SDI-1+K) = SD(K) = V̂_β(K,K)^(1/2) = σ̂_βK

    for K = 1, ..., p.

    WORK(VCVI) is the first element of a p × p array VCV containing the values of the covariance matrix of the parameters β prior to scaling by the residual variance, where

        WORK(VCVI-1+I+(J-1)*(NP)) = VCV(I,J) = σ̂⁻² V̂_β(I,J)

    for I = 1, ..., p and J = 1, ..., p.

    In other words, np.sqrt(np.diag(output.cov_beta * output.res_var)) will give you the same result as output.sd_beta.
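
    As a quick sanity check (assuming output is the Output object returned by ODR.run(), as in the sketch in the question):

        import numpy as np

        # cov_beta is the covariance matrix *before* scaling by the residual
        # variance; multiplying by res_var and taking the square root of the
        # diagonal reproduces sd_beta.
        scaled_sd = np.sqrt(np.diag(output.cov_beta * output.res_var))
        print(np.allclose(scaled_sd, output.sd_beta))  # True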

    I've opened a bug report here.
