I'm trying to work with a 1909x139352 dataset in R. My computer has only 2GB of RAM, so the dataset (roughly 500MB) is too big for the conventional methods. So I dec
I recently ran into this problem with a data frame that had ~3,000 columns. The easiest way around it is to raise the maximum number of open files allowed for your user account. Most systems default to ~1024, which is a very conservative limit; note that it exists to prevent resource exhaustion on the server.
Add the following to your /etc/security/limits.conf file.
youruserid hard nofile 200000 # you may enter whatever number you wish here
youruserid soft nofile 200000 # whatever you want the default to be for each shell or process you have running
Add or edit the following in your /etc/sysctl.conf file.
kern.maxfilesperproc=200000
kern.maxfiles=200000
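Note that the kern.* keys above are the macOS/BSD names; on Linux the rough equivalent is fs.file-max. As a sketch, you can verify what your shell currently allows and (with root) apply the new sysctl values without a reboot — the 200000 figure is just the illustrative value from above:

```shell
# Check the current per-shell open-file limits
soft=$(ulimit -Sn)   # soft limit: what processes get by default
hard=$(ulimit -Hn)   # hard limit: the ceiling a user can raise the soft limit to
echo "soft limit: $soft"
echo "hard limit: $hard"

# On macOS/BSD, apply the sysctl values immediately (requires root):
#   sudo sysctl -w kern.maxfilesperproc=200000
#   sudo sysctl -w kern.maxfiles=200000
# On Linux the analogous key is:
#   sudo sysctl -w fs.file-max=200000
```

If `ulimit -Sn` still shows the old value after editing limits.conf, you likely haven't started a fresh login session yet.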
You'll need to log out and log back in, but after that the original poster should be able to use an ffdf from the ff package to open his 139352-column data frame. The open-file limit matters here because ff stores each column of an ffdf in its own file on disk, so a very wide data frame opens a very large number of files at once.
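For completeness, a minimal sketch of the ffdf import (untested on the original data; the filename and chunk sizes are placeholders):

```r
# Read a very wide CSV into an on-disk ffdf instead of an in-RAM data frame.
library(ff)

dat <- read.csv.ffdf(file = "mydata.csv",   # placeholder path
                     header = TRUE,
                     first.rows = 100,      # small first chunk to infer column types
                     next.rows = 500)       # rows per subsequent chunk

dim(dat)  # rows and columns of the on-disk data frame
```

Because each column lives in its own backing file, this call is exactly where the "too many open files" error surfaces if the limits above are still at their defaults.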
I've posted more about my run-in with this limit here.