I am thinking about building a RESTful API in R, mainly to expose my machine learning model to users in an API format. I know there are options such as exporting the model to PMML or PFA and using another language to take care of the API part. However, I want to stick to the same programming language, and was wondering whether there is anything like the Flask/Django/Spring Boot frameworks in R?
I took a look at servr/shiny, but I really don't think RESTful APIs are what they are designed for. Is there a better solution within R that is easier to use?
I have two options for you:
plumber
plumber allows you to create a REST API by decorating your existing R source code with special comments.
A small example file:
# myfile.R

#* @get /mean
normalMean <- function(samples = 10){
  # query parameters arrive as strings, so coerce before use
  data <- rnorm(as.numeric(samples))
  mean(data)
}

#* @post /sum
addTwo <- function(a, b){
  as.numeric(a) + as.numeric(b)
}
From the R command line:
> library(plumber)
> r <- plumb("myfile.R") # Where 'myfile.R' is the location of the file shown above
> r$run(port=8000)
With this you would get results like this:
$ curl "http://localhost:8000/mean"
[-0.254]
$ curl "http://localhost:8000/mean?samples=10000"
[-0.0038]
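Since the question is about exposing a machine learning model, here is a minimal sketch of a prediction endpoint in the same style (the file name, the model.rds path and the /predict route are hypothetical; the model is assumed to have been trained and saved beforehand with saveRDS()):

# model-api.R -- a sketch, not from the plumber docs
library(jsonlite)

model <- readRDS("model.rds")  # assumed path to a previously saved model

#* Predict from a JSON request body, e.g. {"x1": 1.5, "x2": 0.3}
#* @post /predict
function(req) {
  newdata <- as.data.frame(fromJSON(req$postBody))
  predict(model, newdata = newdata)
}

You would run it the same way as above: plumb("model-api.R")$run(port = 8000).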
Jug
Jug is a small web development framework for R which relies heavily on the httpuv package. Its main focus is to make building APIs for your R code as easy as possible. It is not meant to be an especially performant or rock-solid web framework; other tools (and languages) might be better suited for that. However, jug is flexible enough that, in theory, you could build an extensive web application with it.
It is very easy to learn and has a nice vignette.
A Hello World example:
library(jug)

jug() %>%
  get("/", function(req, res, err){
    "Hello World!"
  }) %>%
  simple_error_handler_json() %>%
  serve_it()
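Once serve_it() is running (jug serves on http://127.0.0.1:8080 by default), you can test it the same way as the plumber example:

$ curl "http://localhost:8080/"
Hello World!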
This is for those who want a comparison of API development in R with plumber, Rserve and rApache.
Basically, concurrent requests are queued by httpuv in plumber, so plumber is not performant by itself. The author recommends running multiple Docker containers, but that can be complicated as well as resource-demanding. There are other technologies such as Rserve and rApache. Rserve forks processes, and rApache can be configured to pre-fork, so both can handle concurrent requests.
See the following posts for a comparison:
https://www.linkedin.com/pulse/api-development-r-part-i-jaehyeon-kim/
https://www.linkedin.com/pulse/api-development-r-part-ii-jaehyeon-kim/
Adding OpenCPU to this list of answers:
Do check out OpenCPU by Jeroen Ooms.
Benefits:
Simple and straightforward: any R package installed on the OpenCPU server is callable via HTTP.
Just focus on creating the R package, and OpenCPU will take care of the rest.
You can return a relational table of results, a plot, a single value, or even a pointer (i.e. a temporary session key) to an R object (imagine a huge object/dataset that you can process and manipulate from another, more limited platform).
CI/CD with your package hosted on Github.
If you are using the server version, OpenCPU is concurrent and async by design, leveraging Nginx for caching and load balancing.
It uses AppArmor to enforce security on Ubuntu. If you use Fedora, you can set up public/private certificate authentication, thanks to the Apache server at the backend. Thanks to rApache!
If the above is too complicated: you can also start a single-user session on your local machine using
opencpu::ocpu_start_app()
and serve your functions (the downside is security). Need a user interface? Simply create a UI using JavaScript, store it in the www folder of the R package, and users can open it in their web browser and use your functions.
This post doesn't do OpenCPU justice. I would really recommend reading the links on the OpenCPU website.
Have a play around with https://cloud.opencpu.org/ocpu/test or https://www.opencpu.org/apps.html
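As a concrete taste of the HTTP API: the URL pattern is /ocpu/library/{package}/R/{function}, per the OpenCPU docs. For example, calling stats::rnorm on the public test server:

# POST to a function endpoint; arguments are passed as form fields
curl https://cloud.opencpu.org/ocpu/library/stats/R/rnorm -d "n=5"

# Appending /json returns the function's value directly as JSON
curl https://cloud.opencpu.org/ocpu/library/stats/R/rnorm/json -d "n=5"

The first call returns a list of session resources (the "pointer" mentioned above); the /json form short-circuits that and returns the result itself.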
Source: https://stackoverflow.com/questions/38079580/building-restful-api-using-r