I am doing a simulation of a GARCH model. The model itself is not too relevant; what I would like to ask you about is optimizing the simulation in R. More than anything, if y…
Building on Vincent's response, all I changed was defining zt all at once and switching apply(ret, 1, sum) to rowSums(ret), and it sped up quite a bit. I also tried byte-compiling both versions, but saw no major difference (a sketch of how that could be done follows the timings below):
randhelp2 <- function(horizon = 5, N = 1e4, h0 = 2e-4,
                      mu = 0, omega = 0, alpha1 = 0.027,
                      beta1 = 0.963) {
  ## pre-allocate returns, innovations, and conditional variances
  ret <- et <- ht <- matrix(NA_real_, nrow = N, ncol = horizon)
  ## draw all standard-normal shocks in one call
  zt <- matrix(rnorm(N * horizon), ncol = horizon)
  ht[, 1] <- h0
  for (j in 1:horizon) {
    et[, j]  <- zt[, j] * sqrt(ht[, j])   # innovations
    ret[, j] <- mu + et[, j]              # per-period returns
    if (j < horizon)                      # GARCH(1,1) variance recursion
      ht[, j + 1] <- omega + alpha1 * et[, j]^2 + beta1 * ht[, j]
  }
  ## aggregate the per-period returns over the horizon for each path
  rowSums(ret)
}
system.time(replicate(10, randhelp(N = 1e5)))
   user  system elapsed
  7.413   0.044   7.468
system.time(replicate(10, randhelp2(N = 1e5)))
   user  system elapsed
  2.096   0.012   2.112
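The byte-compiled comparison mentioned above is not shown; a minimal sketch of how it might be done, assuming the base compiler package (the exact compilation I used is not reproduced here):

library(compiler)
randhelp2_cmp <- cmpfun(randhelp2)   # byte-compile the function (hypothetical name)
system.time(replicate(10, randhelp2_cmp(N = 1e5)))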
There is likely still room for improvement :-)
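To isolate the effect of the apply-to-rowSums switch on its own, here is a small standalone comparison (illustrative only; the matrix mirrors the N = 1e5, horizon = 5 case, and timings will vary by machine):

m <- matrix(rnorm(1e5 * 5), ncol = 5)
system.time(for (i in 1:10) apply(m, 1, sum))   # row sums via apply: R-level loop over rows
system.time(for (i in 1:10) rowSums(m))         # row sums done in compiled code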