extract from {raster} package using excessive memory

Posted by 江枫思渺然 on 2019-12-12 08:15:50

Question


I have been using the extract function from the raster package to extract data from raster files within areas defined by shapefiles. However, I am now having problems with the amount of memory the process requires. I have a large number of shapefiles (~1000), and the raster files are large (~1.6 GB).

My process is:

library(raster)
library(rgeos)     ## gUnionCascaded
library(maptools)  ## readShapePoly
library(parallel)  ## mclapply

shp <- mclapply(list.files(pattern = "*.shp", full.names = TRUE),
                readShapePoly, mc.cores = 6)
ndvi <- raster("NDVI.dat")
mc <- function(y) {
  temp <- gUnionCascaded(y)
  extract <- extract(ndvi, temp)
  mean <- range(extract, na.rm = TRUE)[1:2]
  leng <- length(output)
}
output <- lapply(shp, mc)

Are there any changes I can make to reduce the memory load? I tried loading fewer shapefiles, which worked for about five minutes before the memory spiked again. The machine is a quad-core 2.4 GHz computer with 8 GB of RAM.


Answer 1:


I would do this (untested):

## Clearly we need these packages, and their dependencies
library(raster)
library(rgeos)
library(maptools)  ## for readShapePoly
shpfiles <- list.files(pattern = "*.shp", full.names = TRUE)
ndvi <- raster("NDVI.dat")
## initialize an object to store the results for each shpfile
res <- vector("list", length(shpfiles))
names(res) <- shpfiles
## loop over files
for (i in seq_along(shpfiles)) {
  ## read one shapefile at a time, rather than all ~1000 up front
  shp <- readShapePoly(shpfiles[i])
  ## do the union
  temp <- gUnionCascaded(shp)
  ## extract for this shape data (and don't call it "extract")
  extracted <- extract(ndvi, temp)
  ## further processing, save result (and don't call it "mean")
  rng <- range(extracted, na.rm = TRUE)
  res[[i]] <- rng  ## plus whatever else you need
}
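If memory still creeps up across the ~1000 iterations, one further tweak (a hedged sketch, not part of the answer as posted) is explicit cleanup at the bottom of the loop body, so nothing from one file lingers into the next; removeTmpFiles() is raster's own helper for deleting its on-disk temporary files:

  ## hypothetical cleanup, placed at the end of the for-loop body;
  ## the object names match the loop above
  rm(shp, temp, extracted)  ## drop this iteration's objects
  gc()                      ## prompt R to release the memory
  removeTmpFiles(h = 0)     ## delete raster temp files of any age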

It's not at all clear what the return value of mc() above is meant to be, so I ignore it. Because the loop reads and processes one shapefile at a time, only a single set of polygons is ever held in memory, so this will be far more memory-efficient and faster than what you tried originally. I doubt it's worth using anything parallel here at all.
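Beyond restructuring the loop, the raster package also exposes global memory settings through rasterOptions(); lowering the in-memory threshold makes raster process large files from disk in chunks instead of reading them into RAM. A minimal sketch, with illustrative values you would need to tune (and whose units depend on the raster version; see ?rasterOptions):

library(raster)
## lower the threshold at which raster switches to chunked,
## disk-based processing, and shrink the chunk size
rasterOptions(maxmemory = 1e7, chunksize = 1e6)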



Source: https://stackoverflow.com/questions/15694355/extract-from-raster-package-using-excessive-memory
