Question
I use parSapply() from the parallel package in R. I need to perform calculations on a huge amount of data. Even in parallel it takes hours to execute, so I decided to write results to a file from the cluster workers regularly using write.table(), because the process crashes from time to time when it runs out of memory or for some other random reason, and I want to continue the calculations from the place where they stopped. I noticed that some lines of the CSV files I get are simply cut off in the middle, probably as a result of several processes writing to the file at the same time. Is there a way to place a lock on the file while write.table() executes, so other workers can't access it, or is the only way out to write to a separate file from each worker and then merge the results?
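
For concreteness, here is a stripped-down sketch of the second option I have in mind (the worker function, file names, and toy computation are all placeholders, not my real code):

```r
library(parallel)

cl <- makeCluster(4)

# Hypothetical worker: each call appends its result to a file named
# after the worker's PID, so no two processes ever share a file.
worker <- function(x) {
  res <- data.frame(input = x, output = x^2)   # placeholder computation
  out <- sprintf("results_%d.csv", Sys.getpid())
  write.table(res, out, sep = ",",
              append = file.exists(out),       # write the header only once
              col.names = !file.exists(out),
              row.names = FALSE)
  res$output
}

parSapply(cl, 1:100, worker)
stopCluster(cl)

# After the run (or a crash), merge the per-worker files.
files <- list.files(pattern = "^results_\\d+\\.csv$")
merged <- do.call(rbind, lapply(files, read.csv))
```

This would sidestep the corruption problem entirely, but I'd prefer a single output file if there is a safe way to lock it.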
Source: https://stackoverflow.com/questions/20425071/lock-file-when-writing-to-it-from-parallel-processes-in-r