Question
Is it possible for processes spawned by Rserve to share common libraries that are loaded into memory only once? Imagine that I need to execute the code below on 100 different RConnections concurrently.
library(libraryOfSize40MB)
fun()
That means I need about 3.9 GB of memory just to load the library. I would prefer to load the library once and then execute fun()
one hundred times, so that I can run this on a cheap host.
Maybe this is helpful? https://github.com/s-u/Rserve/blob/master/NEWS#L40-L48
Answer 1:
It is possible. You have to start Rserve from an R session using run.Rserve(),
after loading the libraries first:
library(Rserve)
#load libraries so all connections will share them
library("yaml")
library("reshape")
library("rjson")
library("zoo")
(...)
library("stringr")
run.Rserve(debug = TRUE, port = 6311, remote=TRUE, auth=FALSE, args="--no-save", config.file = "/etc/Rserve.conf")
Every new connection will be able to see these libraries; because each connection is served from this process (forked per connection on unix), the loaded packages are shared rather than loaded again:
library(RSclient)
con = RS.connect(host='10.1.2.3')
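# search() is evaluated on the server and shows the packages attached before run.Rserve()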
RS.eval(con, quote(search()))
#lots of libraries available
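To confirm the shared packages are usable without loading them again, you can call a function from one of them over the same connection. In the sketch below, stringr::str_length() is just a stand-in for the question's fun(), and the expression is passed directly so that RS.eval() quotes it lazily (its default):
# stringr was attached before run.Rserve(), so no library() call is needed in this connection
RS.eval(con, stringr::str_length("Rserve"))
# close the connection when done
RS.close(con)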
Source: https://stackoverflow.com/questions/31433840/rserve-share-library-code