How to obtain RMSE out of lm result?

孤街浪徒 2021-02-01 22:06

I know there is a small difference between $sigma and the concept of root mean squared error, so I am wondering: what is the easiest way to obtain the RMSE from an lm result?

4 Answers
  • 2021-02-01 22:27

    To get the RMSE in one line, with just functions from base, I would use:

    sqrt(mean(res$residuals^2))
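
    For context, a minimal runnable sketch (res, the model formula, and the mtcars data are illustrative, not from the question):

    # hypothetical example: fit a simple linear model on built-in data
    res <- lm(mpg ~ wt, data = mtcars)
    # RMSE = square root of the mean of the squared residuals
    sqrt(mean(res$residuals^2))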
    
  • 2021-02-01 22:31

    Just do

    sigma(res) 
    

    And you've got it.
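
    Note that sigma() returns the residual standard error, which divides the residual sum of squares by the residual degrees of freedom rather than by n, so it differs slightly from the plain RMSE. A quick comparison, assuming res is a fitted lm object (the model and data here are illustrative):

    res <- lm(mpg ~ wt, data = mtcars)  # example fit on built-in data
    sigma(res)                          # residual standard error: sqrt(RSS / df.residual)
    sqrt(mean(res$residuals^2))         # plain RMSE: sqrt(RSS / n)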

  • 2021-02-01 22:36

    I think the other answers might be incorrect. The MSE of regression is the SSE divided by (n - k - 1), where n is the number of data points and k is the number of model parameters.

    Simply taking the mean of the residuals squared (as other answers have suggested) is the equivalent of dividing by n instead of (n - k - 1).

    I would calculate RMSE by sqrt(sum(res$residuals^2) / res$df.residual).

    The quantity in the denominator, res$df.residual, is the residual degrees of freedom, which is the same as (n - k - 1). Take a look at this for reference: https://www3.nd.edu/~rwilliam/stats2/l02.pdf
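
    A minimal sketch of this calculation (res, the formula, and the data are illustrative, not from the question):

    res <- lm(mpg ~ wt, data = mtcars)   # example fit: n = 32 observations, k = 1 predictor
    n <- length(res$residuals)
    k <- length(coef(res)) - 1           # predictors, excluding the intercept
    stopifnot(res$df.residual == n - k - 1)
    sqrt(sum(res$residuals^2) / res$df.residual)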

  • 2021-02-01 22:38

    Residual sum of squares:

    RSS <- c(crossprod(res$residuals))
    

    Mean squared error:

    MSE <- RSS / length(res$residuals)
    

    Root MSE:

    RMSE <- sqrt(MSE)
    

    Pearson estimated residual variance (as returned by summary.lm):

    sig2 <- RSS / res$df.residual
    

    Statistically, the MSE is the maximum likelihood estimator of the residual variance, but it is biased (downward). The Pearson estimator is the restricted maximum likelihood (REML) estimator of the residual variance, which is unbiased.


    Remark

    • Given two vectors x and y, c(crossprod(x, y)) is equivalent to sum(x * y) but much faster. c(crossprod(x)) is likewise faster than sum(x ^ 2).
    • sum(x) / length(x) is also faster than mean(x).
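
    Putting it together, a minimal end-to-end sketch (the model and data are illustrative, not from the question):

    res <- lm(mpg ~ wt, data = mtcars)    # example fit on built-in data
    RSS <- c(crossprod(res$residuals))    # residual sum of squares
    MSE <- RSS / length(res$residuals)    # biased (ML) estimate of residual variance
    RMSE <- sqrt(MSE)                     # root mean squared error
    sig2 <- RSS / res$df.residual         # unbiased (REML) estimate
    all.equal(sqrt(sig2), sigma(res))     # agrees with summary(res)$sigma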