I am working with the R "raster" package and have a large raster layer (62,460,098 cells; 12 Mb for the object). My cell values range from -1 to 1, and I need to replace every negative value with 0 (for example, a cell with a value of -1 has to become 0). I tried this:
raster[raster < 0] <- 0
But it keeps overloading my RAM because of the raster size.
OS: Windows 7 64-bits
RAM size: 8GB
Thanks!
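For reference, a minimal sketch of the setup (the 100 x 100 dimensions are illustrative, not the real 62-million-cell layer):

r <- raster::raster(nrow = 100, ncol = 100)
raster::values(r) <- runif(raster::ncell(r), min = -1, max = 1)

# the in-memory replacement that works on a small raster but
# exhausts RAM on the full-size layer
r[r < 0] <- 0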
You can do:
r <- reclassify(raster, c(-Inf, 0, 0))
This will work on a raster of any size; there is no memory limitation.
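As a usage example (a minimal sketch; the small test raster and the output filename are illustrative assumptions, not part of the original answer), reclassify also accepts a reclassification matrix and a filename, so the result can be written straight to disk while the layer is processed in chunks:

library(raster)

# small test raster with values in [-1, 1] (illustrative only)
r <- raster(nrow = 100, ncol = 100)
values(r) <- runif(ncell(r), min = -1, max = 1)

# reclassification table: values in (-Inf, 0] become 0
rcl <- matrix(c(-Inf, 0, 0), ncol = 3, byrow = TRUE)

# writing to a file keeps memory use low even when the raster
# does not fit in RAM
r0 <- reclassify(r, rcl, filename = "raster_nonneg.tif", overwrite = TRUE)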
There are several postings that discuss memory issues, and it's not clear whether you have attempted any of them, but you should. The physical constraints are not clear either, so you should edit your question to include the size of the machine and the name of the OS being tortured. I don't know how to construct a toy example that lets me do any testing, but one approach that might not blow up RAM use (as much) would be to first construct a set of indices marking the locations to be "zeroed":
idxs <- which(raster < 0, arr.ind = TRUE)
gc() # may not be necessary
Then incrementally replace some fraction of locations, say a quarter or a tenth at a time.
raster[idxs[1:(nrow(idxs)/10), ]] <- 0
The likely problem with any of this is that R's approach to replacement is not "in place": it creates a temporary copy of the object, which is then reassigned to the original. Good luck.
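A sketch of that incremental idea (an assumption-laden variant: it uses plain cell numbers via values() rather than arr.ind indices, and the chunk count of 10 and names like idxs and n_chunks are placeholders):

library(raster)

# illustrative small raster standing in for the large layer
r <- raster(nrow = 100, ncol = 100)
values(r) <- runif(ncell(r), min = -1, max = 1)

# cell numbers of the negative values (the index vector itself
# still has to fit in memory)
idxs <- which(values(r) < 0)

n_chunks <- 10
chunks <- split(idxs, cut(seq_along(idxs), n_chunks, labels = FALSE))

for (chunk in chunks) {
  r[chunk] <- 0   # replace one slice of cells at a time
  gc()            # encourage R to release the temporary copies
}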
Source: https://stackoverflow.com/questions/34617176/r-chaging-specific-cell-values-in-a-large-raster-layer