RCurl

415 code using httr and RCurl, but not just curl

笑着哭i · submitted 2019-12-06 05:15:30
I'm trying to write a function that handles some of the authentication for Spotify's API. I can get it to work with a fairly simple curl command, but when I try to use httr or RCurl, I get 415 Unsupported Media Type responses. I'm somewhat at a loss at this point. I've gotten POST() and GET() to work with this API already, but this endpoint is not working. Using httr:

response <- POST('https://accounts.spotify.com/api/token',
  accept_json(),
  add_headers('Authorization' = paste('Basic',
    base64(paste(client_id, ':', client_secret)), sep = ' ')),
  body = list(grant_type = 'client_credentials'),
  encode = 'json
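A 415 from a token endpoint usually means the request body was sent with the wrong Content-Type: OAuth client-credentials endpoints generally expect application/x-www-form-urlencoded, not JSON. A minimal sketch of one likely fix, assuming RCurl's base64() for the credential (note paste0(), so no spaces get inserted around the colon before encoding):

```r
library(httr)
library(RCurl)

# client_id / client_secret are placeholders for your app's credentials.
# encode = 'form' sends the body as application/x-www-form-urlencoded.
token <- POST(
  'https://accounts.spotify.com/api/token',
  accept_json(),
  add_headers(Authorization = paste('Basic',
    base64(paste0(client_id, ':', client_secret)))),
  body = list(grant_type = 'client_credentials'),
  encode = 'form'
)
content(token)  # should contain access_token on success
```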

Send expression to website return dynamic result (picture)

泪湿孤枕 · submitted 2019-12-06 02:09:31
I use http://www.regexper.com a lot to view pictorial representations of regular expressions. I would like a way to, ideally: send a regular expression to the site, and open the site with that expression displayed. For example, let's use the regex: "\\s*foo[A-Z]\\d{2,3}". I'd go to the site and paste \s*foo[A-Z]\d{2,3} (note the removal of the double slashes). And it returns: I'd like to do this process from within R, creating a wrapper function like view_regex("\\s*foo[A-Z]\\d{2,3}") and the page (http:/
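A sketch of such a wrapper. Note the "double slash removal" happens for free: "\\s" in R source is already the single-backslash string once parsed. The assumption here is that regexper accepts the pattern as a URL-encoded fragment after "#", which may need adjusting to the site's actual URL scheme; view_regex() is a hypothetical helper name:

```r
# Open regexper.com with the given pattern pre-loaded (assumed URL scheme).
view_regex <- function(pattern) {
  url <- paste0("http://www.regexper.com/#",
                utils::URLencode(pattern, reserved = TRUE))
  utils::browseURL(url)  # launches the default browser
}

view_regex("\\s*foo[A-Z]\\d{2,3}")
```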

Downloading a file after login using a https URL

|▌冷眼眸甩不掉的悲伤 · submitted 2019-12-05 18:51:31
I am trying to download an Excel file, which I have the link to, but I am required to log in to the page before I can download the file. I have successfully passed the login page with rvest, RCurl and httr, but I am having an extremely difficult time downloading the file after I have logged in.

url <- "https://website.com/console/login.do"
download_url <- "https://website.com/file.xls"
session <- html_session(url)
form <- html_form(session)[[1]]
filled_form <- set_values(form, userid = user, password = pass)
## Save main page url
main_page <- submit_form(session, filled_form)
download.file
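The likely snag is that download.file() opens a fresh, unauthenticated connection, so the login cookies never travel with it. A sketch that stays inside the rvest session instead (jump_to() reuses the session's cookies; variable names follow the snippet above):

```r
library(rvest)

# Log in, then fetch the file through the same authenticated session.
logged_in <- submit_form(session, filled_form)
resp <- jump_to(logged_in, download_url)

# The underlying httr response body is raw bytes; write them to disk.
writeBin(resp$response$content, "file.xls")
```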

R packages: RCurl and curl packages install failure on Linux

孤街浪徒 · submitted 2019-12-05 10:33:47
I hope you can help with this issue I have come across whilst installing the RCurl and curl packages for R.

Rd warning: /tmp/RtmpOBkvFC/R.INSTALLd07e6c06faf4/RCurl/man/url.exists.Rd:5: missing file link ‘file.exists’
** building package indices
** testing if installed package can be loaded
Error in dyn.load(file, DLLpath = DLLpath, ...) :
  unable to load shared object '/home/majaidi/R/x86_64-redhat-linux-gnu-library/3.1/RCurl/libs/RCurl.so':
  /lib64/libgssapi_krb5.so.2: symbol krb5int_buf_len, version krb5support_0_MIT not defined in file libkrb5support.so.0 with link time reference
Error: loading
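The symbol-version error points at the system Kerberos libraries that libcurl was linked against, not at R itself. A sketch of one common remedy, assuming the RedHat paths in the error are accurate: update those system libraries from a shell first, then reinstall the packages in a fresh R session.

```r
# From a shell (outside R), bring the Kerberos/curl system libraries
# into a consistent state first, e.g. on RedHat/CentOS:
#   sudo yum update krb5-libs krb5-devel libcurl libcurl-devel
#
# Then, in a *fresh* R session (so no stale RCurl.so is loaded):
remove.packages("RCurl")              # clear the broken install, if present
install.packages(c("curl", "RCurl"))  # rebuild against the updated libs
```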

Get response header

本小妞迷上赌 · submitted 2019-12-05 08:41:06
I would like to get the response headers from a GET or POST. My example is:

library(httr)
library(RCurl)
url <- 'http://www.omegahat.org/RCurl/philosophy.html'
doc <- GET(url)
names(doc)
[1] "url" "handle" "status_code" "headers" "cookies" "content" "times" "config"

but there are no response headers, only request headers. The result should be something like this:

Connection:Keep-Alive
Date:Mon, 11 Feb 2013 20:21:56 GMT
ETag:"126a001-e33d-4c12cf2702440"
Keep-Alive:timeout=15, max=100
Server:Apache/2.2.14 (Ubuntu)
Vary:Accept-Encoding

Can I do this with R and the httr/RCurl packages, or is R not enough for this kind
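In fact the "headers" element of an httr response does hold the response headers, and httr exposes them through the headers() accessor; RCurl can collect them with a header-gathering callback. A sketch of both routes:

```r
library(httr)
library(RCurl)

url <- 'http://www.omegahat.org/RCurl/philosophy.html'

# httr: headers() returns the *response* headers as a named list.
doc <- GET(url)
headers(doc)

# RCurl: attach a basicHeaderGatherer and read its value afterwards.
h <- basicHeaderGatherer()
getURL(url, headerfunction = h$update)
h$value()
```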

Login to .NET site using R

柔情痞子 · submitted 2019-12-05 05:44:59
I am trying to log in with my credentials to a .NET site but am unable to get it working. My code is inspired by this thread: How to login and then download a file from aspx web pages with R

library(RCurl)
curl = getCurlHandle()
curlSetOpt(cookiejar = 'cookies.txt', followlocation = TRUE, autoreferer = TRUE, curl = curl)
html <- getURL('http://www.aceanalyser.com/Login.aspx', curl = curl)
viewstate <- as.character(sub('.*id="__VIEWSTATE" value="([0-9a-zA-Z+/=]*).*', '\\1', html))
viewstategenerator <- as.character(sub('.*id="__VIEWSTATEGENERATOR" value="([0-9a-zA-Z+/=]*).*', '\\1', html))
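A sketch of the POST step that usually follows: ASP.NET pages validate __VIEWSTATE and __VIEWSTATEGENERATOR, so both scraped values must be echoed back alongside the credential fields. The credential field names below are placeholders; check the login form's actual input names in the page source.

```r
library(RCurl)

# Reuses the 'curl' handle, 'viewstate' and 'viewstategenerator'
# from the snippet above, so the session cookies are preserved.
params <- list(
  '__VIEWSTATE'          = viewstate,
  '__VIEWSTATEGENERATOR' = viewstategenerator,
  'txtUsername'          = 'me@example.com',  # hypothetical field name
  'txtPassword'          = 'secret'           # hypothetical field name
)
html <- postForm('http://www.aceanalyser.com/Login.aspx',
                 .params = params, curl = curl, style = 'POST')
```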

RCurl and self-signed certificate issues

為{幸葍}努か · submitted 2019-12-05 05:37:08
I am having problems getting the RCurl function getURL to access an HTTPS URL on a server that is using a self-signed certificate. I'm running R 3.0.2 on Mac OS X 10.9.2. I have read the FAQ and the curl page on the subject. So this is where I stand:

- I have saved a copy of the certificate to disk (~/cert.pem).
- I have been able to use this very same file to connect to the server using python-requests and the 'verify' option, and succeeded.
- curl on the command line seems to be ignoring the --cacert option. I succeeded in accessing the website with it after I flagged the certificate as trusted using
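A sketch of pointing RCurl at that saved certificate directly, via curl's CA-bundle option; the URL is a placeholder for the server in question:

```r
library(RCurl)

# cainfo tells libcurl which CA bundle to trust; here, the saved
# self-signed certificate itself.
getURL('https://example.internal/',            # placeholder URL
       cainfo = path.expand('~/cert.pem'))

# For diagnosis only (never in production): skip verification entirely
# to confirm the certificate check is indeed what is failing.
getURL('https://example.internal/', ssl.verifypeer = FALSE)
```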

log into a website to grab the data using RCurl

和自甴很熟 · submitted 2019-12-05 04:43:29
I wanted to log in to a website using RCurl and grab data from it (the data cannot be seen without logging in). After logging in with RCurl, I wanted to export (for example) "http://www.appannie.com/app/ios/instagram/ranking/history/chart_data/?s=2010-10-06&e=2012-06-04&c=143441&f=ranks&d=iphone" into R. The issue is that I cannot log in using RCurl. I haven't tried this before, so mostly I referred to http://www.omegahat.org/RCurl/philosophy.html. So here's what I tried (here, 'me@gmail.com' is my user ID and '9999' is my password; I just made it up).

library(RJSONIO)
library
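A sketch of a cookie-preserving login with RCurl, assuming the site takes a plain form POST; the login URL and the 'username'/'password' field names are assumptions to be checked against the site's actual login form:

```r
library(RCurl)

# One handle for the whole session, so cookies persist across requests.
curl <- getCurlHandle(cookiejar = 'cookies.txt', followlocation = TRUE)

postForm('https://www.appannie.com/account/login/',  # assumed login URL
         username = 'me@gmail.com', password = '9999',
         curl = curl, style = 'POST')

# Same handle, therefore same session cookies, for the data request:
data <- getURL(paste0('http://www.appannie.com/app/ios/instagram/ranking/',
                      'history/chart_data/?s=2010-10-06&e=2012-06-04',
                      '&c=143441&f=ranks&d=iphone'),
               curl = curl)
```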

How to stop execution of RCurl::getURL() if it is taking too long?

拈花ヽ惹草 · submitted 2019-12-05 02:03:45
Is there a way to tell R or the RCurl package to give up on trying to download a webpage if it exceeds a specified period of time, and move on to the next line of code? For example:

> library(RCurl)
> u = "http://photos.prnewswire.com/prnh/20110713/NY34814-b"
> getURL(u, followLocation = TRUE)
> print("next line") # programme does not get this far

This will just hang on my system and not proceed to the final line. EDIT: Based on @Richie Cotton's answer below, while I can 'sort of' achieve what
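libcurl itself supports a timeout, and RCurl passes it through via .opts. A sketch that gives up after a fixed number of seconds and continues, with tryCatch() absorbing the timeout error:

```r
library(RCurl)

u <- "http://photos.prnewswire.com/prnh/20110713/NY34814-b"

# timeout is the total time limit in seconds for the whole transfer;
# on expiry getURL() raises an error rather than hanging forever.
result <- tryCatch(
  getURL(u, followlocation = TRUE, .opts = list(timeout = 10)),
  error = function(e) NA  # swallow the timeout and carry on
)

print("next line")  # now reached even when the download stalls
```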

Upload a file over 2.15 GB in R

人盡茶涼 · submitted 2019-12-04 22:13:41
I've got a manual process where I'm uploading a 5-6 GB file to a web server via curl:

curl -X POST --data-binary @myfile.csv http://myserver::port/path/to/api

This process works fine, but I'd love to automate it using R. The problem is, either I don't know what I'm doing, or the R libraries for curl don't know how to handle files bigger than ~2 GB:

library(RCurl)
postForm("http://myserver::port/path/to/api",
         file = fileUpload(filename = path.expand("myfile.csv"),
                           contentType = "text/csv"),
         .encoding = "utf-8")

yields Error: Internal Server Error. httr doesn't work either:

library(httr)
POST(url =
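The ~2 GB ceiling is R's limit on the length of a single raw vector, which is hit when the whole file is read into memory as the request body. One likely workaround is httr's upload_file(), which streams the body from disk instead; a sketch, keeping the placeholder URL from above:

```r
library(httr)

# upload_file() hands libcurl a file handle rather than an in-memory
# blob, so the body is streamed and never materialised as an R vector.
resp <- POST("http://myserver::port/path/to/api",
             body = upload_file(path.expand("myfile.csv"),
                                type = "text/csv"))
status_code(resp)
```

Note this sends the file as the raw request body, matching the curl --data-binary call above, rather than as a multipart form field.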