I am running into issues trying to use large objects in R. For example:
> memory.limit(4000)
> a = matrix(NA, 1500000, 60)
> a = matrix(NA, 2500000, 60)
Error: cannot allocate vector of size 572.2 Mb
Consider whether you really need all this data explicitly, or whether the matrix can be sparse. R has good support for sparse matrices (see, e.g., the Matrix package).
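For instance, a minimal sketch with the Matrix package (the indices and values below are made up for illustration):

library(Matrix)
## sparseMatrix() stores only the non-zero entries, so memory use scales
## with the number of non-zeros instead of nrow * ncol
m <- sparseMatrix(i = c(1, 1500000), j = c(1, 60),
                  x = c(1.5, 2.5), dims = c(1500000, 60))
print(object.size(m), units = "Kb")  # a few Kb, versus hundreds of Mb dense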
Keep all other processes and objects in R to a minimum when you need to make objects of this size. Use gc() to clear memory that is no longer in use, or, better, create only the object you need in a single session.
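For example (big_intermediate is a hypothetical leftover object standing in for whatever else your session created):

rm(big_intermediate)          # drop objects you no longer need
gc()                          # ask R to release the freed memory
a <- matrix(NA, 2500000, 60)  # the allocation now has the best chance of fitting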
If the above cannot help, get a 64-bit machine with as much RAM as you can afford, and install 64-bit R.
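You can confirm which build you are running by checking the pointer size:

.Machine$sizeof.pointer  # 8 on 64-bit R, 4 on 32-bit R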
If you cannot do that, there are many online services for remote computing.
If you cannot do that, memory-mapping tools like the ff package (or bigmemory, as Sascha mentions) will help you build a new solution. In my limited experience, ff is the more advanced package, but you should read the High Performance Computing topic on CRAN Task Views.
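As a minimal sketch of the file-backed approach with ff (the filename here is illustrative):

library(ff)
## the matrix lives in a file on disk; only small chunks are held in RAM
a <- ff(NA, dim = c(2500000, 60), vmode = "double",
        filename = "big_matrix.ff")
a[1, 1] <- 1.5  # read and write with ordinary matrix indexing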