r-bigmemory

Example of bigmemory and friends with file backing

隐身守侯 submitted on 2019-12-01 06:15:44
I am interested in exploring how R can handle data out of memory. I've found the bigmemory package and its friends (bigtabulate and biganalytics), but was hoping that someone could point me to a worked-out example that uses file backing with these packages. Any other out-of-memory tips would also be appreciated.

Dirk Eddelbuettel: Charlie, just email Mike and Jay; they have a number of examples working around the ASA 'flights' database example from a year or two ago. Edit: in fact, the Documentation tab has what I had in mind; the scripts are also on the site. Take a look at "CRAN Task View:
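A minimal sketch of what file backing looks like in practice, assuming made-up file names ("flights.bin", "flights.desc") and dimensions: the values live on disk, only a small descriptor stays in RAM, and a later session can re-attach to the same file.

library(bigmemory)
library(biganalytics)

# Create a file-backed big.matrix; the data are stored in flights.bin on disk,
# and flights.desc records how to find them again.
x <- filebacked.big.matrix(nrow = 1e6, ncol = 3, type = "double",
                           backingfile = "flights.bin",
                           descriptorfile = "flights.desc")
x[, 1] <- rnorm(1e6)    # read and write with ordinary matrix syntax

# Later, or in a different R session: re-attach without reloading the data.
y <- attach.big.matrix("flights.desc")
colmean(y, 1)           # biganalytics summary computed on the file-backed data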

Shared memory in parallel foreach in R

醉酒当歌 submitted on 2019-11-28 16:52:58
Problem Description: I have a big matrix c, loaded in RAM. My goal is to have read-only access to it from parallel processes. However, when I create the connections, whether I use doSNOW, doMPI, big.matrix, etc., the amount of RAM used increases dramatically. Is there a way to properly create shared memory that all the processes can read from, without creating a local copy of all the data?

Example:

libs <- function(libraries) {  # Installs missing libraries and then loads them
  for (lib in libraries) {
    if (!is.element(lib, .packages(all.available = TRUE))) {
      install.packages(lib)
    }
    library(lib, character.only = TRUE)
  }
}
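One commonly suggested pattern (a sketch, not the asker's setup) is to put the matrix into shared memory once and export only its small descriptor to the workers, which attach to it instead of receiving a serialized copy. The matrix size, cluster size, and per-iteration work below are illustrative, and it assumes all workers run on the same machine.

library(bigmemory)
library(doParallel)
library(foreach)

# One shared-memory copy of the data; describe() returns a small descriptor
# object that is cheap to send to each worker.
c_big <- as.big.matrix(matrix(rnorm(1e6), nrow = 1e3), shared = TRUE)
desc  <- describe(c_big)

cl <- makeCluster(2)
registerDoParallel(cl)

res <- foreach(j = 1:10, .combine = c, .packages = "bigmemory") %dopar% {
  m <- attach.big.matrix(desc)   # attach to the shared segment, no data copy
  sum(m[, j])                    # read-only use of one column
}

stopCluster(cl)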

Calculate Euclidean distance matrix using a big.matrix object

Deadly submitted on 2019-11-27 03:38:05
I have an object of class big.matrix in R with dimension 778844 x 2. The values are all integers (kilometres). My objective is to calculate the Euclidean distance matrix using the big.matrix and to have as a result an object of class big.matrix. I would like to know if there is an optimal way of doing that. The reason for my choice of the big.matrix class is a memory limitation. I could transform my big.matrix to an object of class matrix and calculate the Euclidean distance matrix using
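A rough sketch of one way to do this (not taken from the question): fill a file-backed big.matrix block by block, pulling only a chunk of rows into RAM at a time. The sizes, block length, and file names below are made up for illustration; a full 778844 x 778844 matrix of doubles would need several terabytes of disk space.

library(bigmemory)

set.seed(1)
pts  <- as.big.matrix(matrix(rnorm(1000 * 2), ncol = 2), type = "double")
n    <- nrow(pts)
dmat <- filebacked.big.matrix(n, n, type = "double",
                              backingfile = "dist.bin",
                              descriptorfile = "dist.desc")

block <- 200                      # rows per chunk, bounds the RAM used
for (i in seq(1, n, by = block)) {
  rows <- i:min(i + block - 1, n)
  a <- pts[rows, ]                # one chunk of points in RAM
  b <- pts[, ]                    # only 2 columns, so the full set fits
  # squared distances via ||a||^2 + ||b||^2 - 2 * a %*% t(b)
  d2 <- outer(rowSums(a^2), rowSums(b^2), "+") - 2 * tcrossprod(a, b)
  dmat[rows, ] <- sqrt(pmax(d2, 0))   # write the block to the file-backed result
}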