How to obtain RMSE out of lm result?


Residual sum of squares:

RSS <- c(crossprod(res$residuals))

Mean squared error:

MSE <- RSS / length(res$residuals)

Root MSE:

RMSE <- sqrt(MSE)

Pearson estimated residual variance (as returned by summary.lm):

sig2 <- RSS / res$df.residual

Statistically, MSE is the maximum likelihood estimator of residual variance, but is biased (downward). The Pearson one is the restricted maximum likelihood estimator of residual variance, which is unbiased.
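
As a concrete check, here is a minimal sketch of my own (using the built-in mtcars data, not anything from the question) that reproduces all of the quantities above from a single lm fit:

res <- lm(mpg ~ wt + hp, data = mtcars)    # hypothetical example fit

RSS  <- c(crossprod(res$residuals))        # residual sum of squares
MSE  <- RSS / length(res$residuals)        # divide by n
RMSE <- sqrt(MSE)
sig2 <- RSS / res$df.residual              # divide by n - p, as summary.lm does

all.equal(sqrt(sig2), summary(res)$sigma)  # TRUE: matches the reported "Residual standard error"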


Remark

  • Given two vectors x and y, c(crossprod(x, y)) is equivalent to sum(x * y) but much faster. c(crossprod(x)) is likewise faster than sum(x ^ 2) (see the quick check after this list).
  • sum(x) / length(x) is also faster than mean(x).
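
A quick throwaway check of the equivalence claimed above (my own snippet, not part of the original answer):

x <- rnorm(1e6)
y <- rnorm(1e6)
all.equal(c(crossprod(x, y)), sum(x * y))   # TRUE
all.equal(c(crossprod(x)), sum(x ^ 2))      # TRUE
# timings can be compared with system.time() or the microbenchmark package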

To get the RMSE in one line, with just functions from base, I would use:

sqrt(mean(res$residuals^2))

I think the other answers might be incorrect. The MSE of regression is the SSE divided by (n - k - 1), where n is the number of data points and k is the number of model parameters.

Simply taking the mean of the residuals squared (as other answers have suggested) is the equivalent of dividing by n instead of (n - k - 1).

I would calculate RMSE by sqrt(sum(res$residuals^2) / res$df.residual).

The quantity in the denominator, res$df.residual, gives you the residual degrees of freedom, which is the same as (n - k - 1). Take a look at this for reference: https://www3.nd.edu/~rwilliam/stats2/l02.pdf
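
For example (again a hypothetical mtcars fit of my own, not data from the question), the two denominators give slightly different numbers:

res <- lm(mpg ~ wt + hp, data = mtcars)         # hypothetical example fit
sqrt(mean(res$residuals^2))                     # divides by n
sqrt(sum(res$residuals^2) / res$df.residual)    # divides by n - k - 1; slightly larger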

Just do

sigma(res) 

and you've got it.
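
For the record (my own note): sigma() lives in the stats package (added around R 3.3.0, if I recall correctly) and returns the residual standard error, i.e. the same quantity as the Pearson-style estimate from the first answer:

res <- lm(mpg ~ wt + hp, data = mtcars)   # hypothetical example fit
all.equal(sigma(res), sqrt(sum(res$residuals^2) / res$df.residual))   # TRUE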
