Question
I'm on Windows Server 2012 (64-bit) with 30.5 GB of RAM, running R v3.1.2 in RStudio 0.98, and am still having trouble with R hitting a memory limit.
I reviewed the FAQ here: http://cran.r-project.org/bin/windows/base/rw-FAQ.html#There-seems-to-be-a-limit-on-the-memory-it-uses_0021
It states that the memory limit on 64-bit builds defaults to the total amount of RAM, and that the limit can be checked and set with memory.limit().
A call to memory.limit() returns 31249, confirming that R can see and use all ~30 GB.
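For reference, checking and raising the limit from R looks like this; a minimal sketch (memory.limit() is Windows-only, the value is in MB, and the limit can only be increased):

memory.limit()              # report the current limit in MB (31249 here, i.e. ~30.5 GB)
memory.limit(size = 31249)  # request a new, larger limit in MB (Windows only)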
However, when I make a modeling call on a large dataset (~10M rows):
ctree(as.formula(formula), data=d, control=ctree_control(mincriterion=0.9, minbucket=1000))
I get the following error:
'Calloc' could not allocate memory (18446744073673801728 of 8 bytes)
But looking at the system Task Manager, I can see that over 25 GB is still available and that R is only using 2.3 GB.
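The same can be verified from inside R itself; a small diagnostic sketch (memory.size() is Windows-only):

memory.size()            # MB currently allocated by this R session
memory.size(max = TRUE)  # maximum MB obtained from the OS so far
gc()                     # run garbage collection and report heap usage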
Running the model outside of RStudio, in R directly, yields the same result, so RStudio isn't the culprit.
I'm perplexed - why does R refuse to use all my memory?
Answer 1:
The problem was a bug in the C code underlying the ctree() function (as correctly suspected by @JoshuaUlrich). The cause was an integer overflow, which has now been fixed in version 1.0-2 of the libcoin package that the partykit package builds on.
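As an illustration of the failure mode in general (not the actual libcoin internals): a signed integer product that overflows can wrap around to a negative value, and when that value is later treated as an unsigned 64-bit allocation count it becomes astronomically large, which matches the shape of the number in the error message above.

.Machine$integer.max                          # 2147483647, the largest 32-bit signed integer
10000000L * 1000L                             # NA with an "integer overflow" warning in R;
                                              # the equivalent C arithmetic wraps around silently
sprintf("%.0f", 2^64 - 18446744073673801728)  # "35749888": the requested count sits just below 2^64,
                                              # i.e. a negative value reinterpreted as unsigned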
(Comments: We didn't learn about this bug earlier because the party tag was not used for the question here on Stack Overflow, and the problem was not reported to the package maintainer until today. Thanks to Kris Joanidis, who reported the problem and also provided a patch; very much appreciated.)
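In practical terms, updating the affected packages should be enough to pick up the fix; a minimal sketch (the version requirement is taken from the answer above):

install.packages(c("libcoin", "partykit"))  # pull the current releases from CRAN
packageVersion("libcoin")                   # should report 1.0-2 or later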
Source: https://stackoverflow.com/questions/27975385/calloc-could-not-allocate-memory-in-64-bit-r