JSONStream

Dealing with a JSON object too big to fit into memory

Submitted by 一世执手 on 2019-12-08 00:57:28
Question: I have a dump of a Firebase database representing our Users table, stored as JSON. I want to run some data analysis on it, but it's too big to load into memory completely and manipulate with pure JavaScript (or `_` and similar libraries). Up until now I've been using the JSONStream package to deal with my data in bite-sized chunks (it calls a callback once for each user in the JSON dump). I've now hit a roadblock, though, because I want to filter my user ids based on their value: the questions I'm trying to answer are of the form "Which users x".
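
Whatever produces the per-user chunks (a JSONStream `'data'` handler or any other streaming source), the filtering step itself can be kept out of the streaming plumbing as a pure predicate over `[id, user]` pairs. A minimal sketch, where the `purchases` field and the sample ids are hypothetical stand-ins for whatever the real records contain:

```javascript
// Collect the ids of users whose record satisfies a predicate.
// `entries` can be any iterable of [id, user] pairs -- e.g. pairs
// pushed out one at a time by a streaming parser's callback.
function filterUserIds(entries, predicate) {
  const ids = [];
  for (const [id, user] of entries) {
    if (predicate(user)) ids.push(id);
  }
  return ids;
}

// Hypothetical example: users with at least one purchase.
const sample = [
  ['uid-1', { purchases: 3 }],
  ['uid-2', { purchases: 0 }],
  ['uid-3', { purchases: 1 }],
];
const active = filterUserIds(sample, (u) => u.purchases > 0);
console.log(active); // ['uid-1', 'uid-3']
```

Keeping the predicate separate like this means the same filter can be unit-tested on small fixtures while the streaming layer only ever holds one user in memory.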

How to use streams to JSON stringify large nested objects in Node.js?

Submitted by 狂风中的少年 on 2019-12-07 02:22:17
Question: I have a large JavaScript object that I want to convert to JSON and write to a file. I thought I could do this using streams, like so:

```javascript
var fs = require('fs');
var JSONStream = require('JSONStream');

var st = JSONStream.stringifyObject()
    .pipe(fs.createWriteStream('./output_file.js'));

st.write(large_object);
```

When I try this I get an error:

```
stream.js:94
  throw er; // Unhandled stream error in pipe.
  ^
TypeError: Invalid non-string/buffer chunk
    at validChunk (_stream_writable.js:153:14)
    at WriteStream.Writable.write (_stream_writable.js:182:12)
```

So apparently I can't just write an object to this

jq: How to merge multiple objects into one

Submitted by 独自空忆成欢 on 2019-12-06 07:23:50
Given the following input (a toned-down version of an output with 100K+ objects from another, more complex query):

```sh
echo '{ "a": { "b":"c", "d":"e" } }{ "a": { "b":"f", "d":"g" } }' | jq '.'
```

```
{
  "a": {
    "b": "c",
    "d": "e"
  }
}
{
  "a": {
    "b": "f",
    "d": "g"
  }
}
```

Desired output:

```
{ "c": "e", "f": "g" }
```

or (which suits the follow-up usage better):

```
{ x: { "c": "e", "f": "g" } }
```

I can't for the life of me figure out how to do it. My real problem, of course, is the multiple-object input data, for which I really don't know whether it's valid JSON. jq produces and accepts it; jshon does not. I tried various
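
Once the concatenated values have been split into individual parsed objects (jq itself reads such a stream of values happily), the merge is a plain reduction: build a `{ [b]: d }` pair from each object and fold the pairs together. A JavaScript sketch of that reduction, using the shapes from the example above:

```javascript
// Merge a sequence of { a: { b, d } } objects into one flat object,
// mirroring the desired { "c": "e", "f": "g" } output above.
function mergePairs(objects) {
  return objects.reduce((acc, obj) => {
    acc[obj.a.b] = obj.a.d;
    return acc;
  }, {});
}

const input = [
  { a: { b: 'c', d: 'e' } },
  { a: { b: 'f', d: 'g' } },
];
console.log(mergePairs(input)); // { c: 'e', f: 'g' }
```

In jq the same fold can be expressed with object construction plus `add`, along the lines of `jq -n '[inputs | {(.a.b): .a.d}] | add'`, though the exact filter depends on the real data's shape.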
