ff package write error

-上瘾入骨i 2021-01-23 01:24

I'm trying to work with a 1909x139352 dataset using R. Since my computer only has 2GB of RAM, the dataset turns out to be too big (500MB) for the conventional methods, so I decided to use the ff package.
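A minimal sketch of an ff-based load of this kind, assuming a CSV source (the file name data.csv and the chunk size are hypothetical):

library(ff)

# read.csv.ffdf streams the file in chunks and keeps the columns on disk
# rather than in RAM -- but it creates one backing file per column, so a
# ~139k-column table can hit the per-process open-file limit.
dat <- read.csv.ffdf(file = "data.csv", header = TRUE,
                     next.rows = 500)  # rows to read per chunk
dim(dat)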

3 Answers
  •  清酒与你
    2021-01-23 01:46

    I recently encountered this problem with a data frame that had ~3,000 columns. The easiest way around it is to raise the maximum number of open files allowed for your user account. Most systems default to ~1024, which is a very conservative limit. Do note that the limit exists to prevent resource exhaustion on the server.
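    For context, ff stores each column of an ffdf in its own backing file, which is why a very wide data frame exhausts the limit. A small sketch that makes the one-file-per-column behaviour visible (the column names are arbitrary):

    library(ff)

    # Each ffdf column gets its own backing file under fftempdir, so a data
    # frame with ~139k columns needs on the order of 139k files.
    small <- as.ffdf(data.frame(a = 1:10, b = runif(10)))
    list.files(getOption("fftempdir"))  # roughly one .ff file per column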

    On Linux:

    Add the following to your /etc/security/limits.conf file.

    youruserid hard nofile 200000  # you may enter whatever number you wish here
    youruserid soft nofile 200000  # whatever you want the default to be for each shell or process you have running

    On OS X:

    Add or edit the following in your /etc/sysctl.conf file:

    kern.maxfilesperproc=200000
    kern.maxfiles=200000

    You'll need to log out and log back in; after that, the original poster should be able to use ffdf to open his 139352-column data frame.
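    After logging back in, one way to confirm the new limit from inside R (a sketch; this relies on ulimit being a builtin of the shell that system() spawns, as it is on Linux and OS X):

    # The child shell inherits R's resource limits, so this reports the
    # open-file limits that R itself is running under.
    system("ulimit -Sn", intern = TRUE)  # soft limit
    system("ulimit -Hn", intern = TRUE)  # hard limit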

    I've posted more about my run-in with this limit here.
