I receive JSON files with data to be analyzed in R, for which I use the RJSONIO package:
library(RJSONIO)
filename <- "Indata.json"
jFile <- fromJSON(filename)
Even though the question is very old, this might be of use for someone with a similar problem.
The function jsonlite::stream_in() lets you set pagesize to control how many lines are read at a time, and accepts a custom function as handler that is applied to each chunk as it is read. This makes it possible to work with very large JSON files without reading everything into memory at once.
library(jsonlite)
con <- file("Indata.json")  # stream_in() expects newline-delimited JSON (one record per line)
stream_in(con, pagesize = 5000, handler = function(df){
  # Do something with the current chunk here; df is a data frame of up to 5000 records
})
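As a concrete (hypothetical) example of such a handler, each chunk can be reduced to a small summary so only the summaries are kept in memory. The file name Indata.json comes from the question; the column name value and the statistics computed here are placeholders for whatever analysis you actually need:
library(jsonlite)

# Collect one summary row per chunk instead of the raw records
chunk_summaries <- list()

con <- file("Indata.json")  # newline-delimited JSON assumed
stream_in(con, pagesize = 5000, handler = function(df){
  # 'value' is a placeholder column name; replace it with a field from your data
  chunk_summaries[[length(chunk_summaries) + 1]] <<- data.frame(
    n    = nrow(df),
    mean = mean(df$value, na.rm = TRUE)
  )
})

# Combine the per-chunk summaries into one data frame
result <- do.call(rbind, chunk_summaries)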