httr GET function running out of space when downloading a large file

Submitted by 放肆的年华 on 2019-12-10 13:07:48

Question


I'm trying to download a 1.1 GB file with httr, but I'm hitting the following error:

x <- GET( extract.path )
Error in curlPerform(curl = handle$handle, .opts = curl_opts$values) : 
  cannot allocate more space: 1728053248 bytes

My C drive has 400 GB free.

In the RCurl package, I see the maxfilesize and maxfilesize.large options listed by getCurlOptionsConstants(), but I don't understand if or how these might be passed to httr through config or set_config, or whether I need to switch over to RCurl for this. And even if I do need to switch, will increasing the maximum file size work?
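For reference, httr does let you pass named curl options through config(), so in principle the option could be supplied like this (a hedged sketch; note that curl's maxfilesize options are a cap that aborts transfers over the limit, so raising them would not by itself fix a memory-allocation failure):

```r
library(httr)

# Pass an RCurl/libcurl option by name through httr's config();
# the option name matches getCurlOptionsConstants().
x <- GET(extract.path, config(maxfilesize.large = 4e9))
```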

Here's my sessionInfo():

> sessionInfo()
R version 3.0.0 (2013-04-03)
Platform: i386-w64-mingw32/i386 (32-bit)

locale:
[1] LC_COLLATE=English_United States.1252  LC_CTYPE=English_United States.1252    LC_MONETARY=English_United States.1252 LC_NUMERIC=C                           LC_TIME=English_United States.1252    

attached base packages:
[1] stats     graphics  grDevices utils     datasets  methods   base     

other attached packages:
[1] XML_3.96-1.1 httr_0.2    

loaded via a namespace (and not attached):
[1] digest_0.6.0   RCurl_1.95-4.1 stringr_0.6.2  tools_3.0.0   

And (this is not recommended, just because it will take you a while) if you want to reproduce my error, you can go to https://usa.ipums.org/usa-action/samples, register for a new account, choose the 2011 5-year ACS extract, add about a hundred variables, and then wait for the extract to be ready. Then edit the first three lines and run the code below. (Again, not recommended.)

your.email <- "email@address.com"
your.password <- "password"
extract.path <- "https://usa.ipums.org/usa-action/downloads/extract_files/some_file.csv.gz"

require(httr)

values <- 
    list(
        "login[email]" = your.email , 
        "login[password]" = your.password , 
        "login[is_for_login]" = 1
    )

POST( "https://usa.ipums.org/usa-action/users/validate_login" , body = values )
GET( "https://usa.ipums.org/usa-action/extract_requests/download" , query = values )

# this line breaks
x <- GET( extract.path )

Answer 1:


FYI - this has since been added via the write_disk() control in httr: https://github.com/hadley/httr/blob/master/man/write_disk.Rd
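With a version of httr that has write_disk(), the response body is streamed straight to a file instead of being buffered in memory, which sidesteps the allocation failure entirely. A minimal sketch (the output file name is an assumption):

```r
library(httr)

# write_disk() streams the body to disk as it arrives,
# so the 1.1 GB file never has to fit in R's memory.
x <- GET(extract.path,
         write_disk("acs_extract.csv.gz", overwrite = TRUE),
         progress())
```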




Answer 2:


GET calls httr:::make_request, which sets the curl options defined in config = list(). However, it appears the writefunction option is hard-coded in httr:

opts$writefunction <- getNativeSymbolInfo("R_curl_write_binary_data")$address

You will probably need to use RCurl and define an appropriate writefunction. The solution in Create a C-level file handle in RCurl for writing downloaded files from @Martin Morgan appears to be the way to go.
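Following that answer, the idea is to open a C-level file handle with RCurl::CFILE() and hand it to curlPerform() as writedata, so libcurl writes the response bytes directly to disk rather than into R memory. A sketch, assuming the login cookies from the earlier session are still valid and the output file name is illustrative:

```r
library(RCurl)

# Open a C-level FILE* handle; passing its reference as writedata
# makes libcurl stream the download straight to the file.
f <- CFILE("acs_extract.csv.gz", mode = "wb")
curlPerform(url = extract.path, writedata = f@ref)
close(f)
```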



Source: https://stackoverflow.com/questions/17306695/httr-get-function-running-out-of-space-when-downloading-a-large-file
