When I use the following R code,
model_glm <- glm(V1 ~ ., data = xx, family = "binomial")
save(model_glm, file = "modelfile")
the size of modelfile is very large. How can I reduce it?
I had this issue when running a GLM as part of an R process in production, and the size of the GLM object slowed me down considerably. I found I needed to remove more than just the $data element. Here is my post on it, with an example below.
> object.size(sg)
96499472 bytes
> sg$residuals <- NULL
> sg$weights <- NULL
> sg$fitted.values <- NULL
> sg$prior.weights <- NULL
> sg$na.action<- NULL
> sg$linear.predictors <- NULL
> sg$fitted.values <- NULL
> sg$effects <-NULL
> sg$data <- NULL
> object.size(sg)
3483976 bytes
> sg$qr$qr <- NULL
> object.size(sg)
79736 bytes
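If you do this regularly, the same deletions can be wrapped in a small helper function. This is just a sketch of the assignments above (strip_glm is a name I made up); the stripped object is intended for predict() with newdata, and some methods such as summary() will likely no longer work on it.

strip_glm <- function(fit) {
  # Drop the heavy components that are not needed for predict() on new data
  fit$residuals <- NULL
  fit$weights <- NULL
  fit$fitted.values <- NULL
  fit$prior.weights <- NULL
  fit$na.action <- NULL
  fit$linear.predictors <- NULL
  fit$effects <- NULL
  fit$data <- NULL
  fit$qr$qr <- NULL   # dropping the QR decomposition gives the biggest saving
  fit
}

sg <- strip_glm(sg)
object.size(sg)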
object.size() is misleading here because it does not count the environments referenced through the object's attributes.
If you want to assess the true serialized size, use:
length(serialize(model_glm, NULL))
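For example (a quick sketch; the numbers obviously depend on your model and data):

object.size(model_glm)              # apparent in-memory size, captured environments not counted
length(serialize(model_glm, NULL))  # roughly what save() has to write out, before compression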
Apart from the data stored in the object itself, if you want to significantly reduce the size of your glm, clear out the environment captured by its terms:
rm(list = ls(envir = attr(model_glm$terms, ".Environment")),
   envir = attr(model_glm$terms, ".Environment"))
This comes from a well-detailed article.
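A minimal illustration of why this helps (fit_in_function and big_junk below are made-up names, just to mimic fitting inside a function whose local objects get captured by the formula's environment):

fit_in_function <- function() {
  big_junk <- rnorm(1e6)   # large object living in the same environment as the formula
  xx <- data.frame(V1 = rbinom(100, 1, 0.5), V2 = rnorm(100))
  glm(V1 ~ ., data = xx, family = "binomial")
}
model_glm <- fit_in_function()
length(serialize(model_glm, NULL))   # large: big_junk is serialized along with the terms environment
rm(list = ls(envir = attr(model_glm$terms, ".Environment")),
   envir = attr(model_glm$terms, ".Environment"))
length(serialize(model_glm, NULL))   # much smaller after clearing that environment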
Setting model = FALSE in your call to glm should prevent the model.frame from being returned. Setting y = FALSE likewise prevents the response vector from being returned, and x = FALSE (the default) prevents the model.matrix from being returned. This combination should shrink the size of your glm object.
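For the code in the question, that would look something like this (a sketch; the actual savings depend on your data):

model_glm <- glm(V1 ~ ., data = xx, family = "binomial",
                 model = FALSE, y = FALSE)   # x = FALSE is already the default
save(model_glm, file = "modelfile")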
Of course, you can also extract the coefficients with coef(model_glm) or, with standard errors, summary(model_glm)$coef.
You can NULL out the data in the model object before saving it. I did a quick test and it still generated predictions.
model_glm$data <- NULL
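Something along these lines, where xx_new stands in for whatever new data you score later:

model_glm$data <- NULL
save(model_glm, file = "modelfile")

# later, e.g. in the production job
load("modelfile")
preds <- predict(model_glm, newdata = xx_new, type = "response")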