jsonstream

JSON object creation PushStreamContent

Submitted by 馋奶兔 on 2021-02-07 09:45:46

Question: I have an ASP.NET Web API with an HttpResponseMessage and an API method named GetPersonDataStream , which streams each person object as JSON. When I inspect the result, the data has been built as two separate objects with no comma between them, so it is not the JSON I need. Actual streamed data: {"Name":"Ram","Age":30}{"Name":"Sam","Age":32} . But I want this streamed as proper JSON: {"response": [ {"Name":"Ram","Age":30}, {"Name":"Sam",
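The fix is to write the `{"response":[` envelope once, then prefix a comma before every element after the first, and close with `]}` at the end. The same pattern works inside a PushStreamContent callback; here it is sketched language-neutrally in JavaScript (the `streamAsJsonArray` helper and `write` sink are hypothetical names, not part of any API):

```javascript
// Sketch: stream a sequence of objects as one valid JSON document by
// emitting the envelope once and a comma before every element after the
// first. `write` stands in for whatever incremental writer the host
// framework provides (e.g. the stream in a PushStreamContent callback).
function streamAsJsonArray(people, write) {
  write('{"response":[');
  people.forEach((person, i) => {
    if (i > 0) write(','); // comma only *between* elements
    write(JSON.stringify(person));
  });
  write(']}');
}

let out = '';
streamAsJsonArray(
  [{ Name: 'Ram', Age: 30 }, { Name: 'Sam', Age: 32 }],
  chunk => { out += chunk; }
);
console.log(out);
// {"response":[{"Name":"Ram","Age":30},{"Name":"Sam","Age":32}]}
```

The key design point is that the separator is decided per element before writing it, so the stream is valid JSON at every prefix boundary and nothing needs to be buffered.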

How to read stream of JSON objects per object

Submitted by 非 Y 不嫁゛ on 2020-01-24 15:48:08

Question: I have a binary application that generates a continuous stream of JSON objects (not an array of JSON objects). A JSON object can sometimes span multiple lines (still a valid JSON object, just pretty-printed). I can connect to this stream and read it without problems like: var child = require('child_process').spawn('binary', ['arg','arg']); child.stdout.on('data', data => { console.log(data); }); Streams are buffers and emit data events whenever they please, therefore I played with readline
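Because the objects are concatenated with no delimiter and may be pretty-printed, line-based splitting (readline) is not reliable. One dependency-free approach is a small scanner that tracks brace depth, skipping braces inside strings, and emits each object as soon as its outermost `}` closes. This is a sketch (the `makeSplitter` name is made up for illustration):

```javascript
// Sketch: split a stream of concatenated, possibly multi-line JSON objects.
// Tracks brace depth while ignoring braces inside string literals, and
// parses each buffered object once its outermost '}' arrives.
function makeSplitter(onObject) {
  let buf = '', depth = 0, inString = false, escaped = false;
  return chunk => {
    for (const ch of chunk.toString()) {
      buf += ch;
      if (escaped) { escaped = false; continue; }       // char after a backslash
      if (ch === '\\' && inString) { escaped = true; continue; }
      if (ch === '"') { inString = !inString; continue; }
      if (inString) continue;                            // braces in strings don't count
      if (ch === '{') depth++;
      if (ch === '}' && depth > 0 && --depth === 0) {
        onObject(JSON.parse(buf));                       // one complete object
        buf = '';                                        // reset for the next one
      }
    }
  };
}

// Demo with pretty-printed and split-across-chunks input:
const feed = makeSplitter(obj => console.log(obj));
feed('{"Name":"Ram","Age":30}{\n  "Name": "Sam",\n  "Age": 32\n}');
```

In the original setup this would be hooked up as `child.stdout.on('data', makeSplitter(handle))`, which also copes with an object arriving split across several `data` events.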

How can I parse a large JSON file with repeating values in JavaScript?

Submitted by 我的梦境 on 2020-01-05 04:37:17

Question: I am parsing a large JSON file using JSONStream. This works, but it returns the file line by line, so when I try to restructure the data I can only get the data that is not repeated. For example, this is the structure: { "Test": { "id": 3454534344334554345434, "details": { "text": "78679786787" }, "content": { "text": 567566767656776 }, "content": { "text": 567566767656776 }, "content": { "text": 567566767656776 } } } I'm able to get Test.id or Test.details.id but I can only get the First
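Part of the problem is the input itself: the `"content"` key is repeated inside one object. Duplicate keys pass the JSON grammar, but a plain JavaScript object cannot hold them, so any parser that materializes an object keeps only one occurrence (for `JSON.parse`, the last). A minimal demonstration:

```javascript
// Duplicate keys survive parsing only as the last occurrence:
// earlier "content" entries are silently overwritten.
const text =
  '{"Test":{"content":{"text":1},"content":{"text":2},"content":{"text":3}}}';
const parsed = JSON.parse(text);
console.log(parsed.Test.content.text); // 3
console.log(Object.keys(parsed.Test)); // [ 'content' ] — only one key remains
```

To keep every repetition you need either an evented/streaming parser that reports each key-value pair as it is seen, or (better) input that models repetition as an array, e.g. `"content": [ {...}, {...}, {...} ]`. Note also that an id like `3454534344334554345434` exceeds `Number.MAX_SAFE_INTEGER` and will lose precision as a JavaScript number.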

JQ How to merge multiple objects into one

Submitted by 血红的双手。 on 2020-01-02 19:06:19

Question: Given the following input (which is a toned-down version of the output, with 100K+ objects, of another complex query): echo '{ "a": { "b":"c", "d":"e" } }{ "a": { "b":"f", "d":"g" } }' | jq '.' { "a": { "b": "c", "d": "e" } } { "a": { "b": "f", "d": "g" } } desired output: { "c": "e", "f": "g" } or (suits better for follow-up usage): { x: { "c": "e", "f": "g" } } I can't for the life of me figure out how to do it. My real problem, of course, is the multiple-object input data, for which I really
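One way to handle the multi-object input is jq's slurp mode (`-s`), which collects the whole stream into an array; each element can then be turned into a single-entry object `{(.a.b): .a.d}` and the results merged with `add`. A sketch (assumes jq is installed):

```shell
# Merge a stream of {"a":{"b":KEY,"d":VALUE}} objects into one object.
# -s (slurp) wraps the whole input stream in an array first.
echo '{ "a": { "b":"c", "d":"e" } }{ "a": { "b":"f", "d":"g" } }' |
  jq -s 'map({(.a.b): .a.d}) | add'
# => { "c": "e", "f": "g" }
```

For the wrapped variant, `jq -s '{x: (map({(.a.b): .a.d}) | add)}'` produces `{ "x": { "c": "e", "f": "g" } }`. With 100K+ objects, `jq -n '[inputs | {(.a.b): .a.d}] | add'` avoids slurping the raw text and reads one input object at a time.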

How to parse an infinite json array from stdin in go?

Submitted by 烂漫一生 on 2019-12-24 03:21:19

Question: I'm trying to write a small replacement for i3status, a small program that communicates with i3bar conforming to this protocol. They exchange messages via stdin and stdout. The stream in both directions is an infinite array of JSON objects. The start of the stream from i3bar to i3status (which I want to replace) looks like this: [ {"name": "some_name_1","instance": "some_inst_1","button": 1,"x": 213,"y": 35} ,{"name": "some_name_1","instance": "some_inst_2","button": 2,"x": 687,"y": 354} ,{

node heap exhausted when piping JSONStream.parsed() data through es.map() and JSONStream.stringify() to file stream

Submitted by |▌冷眼眸甩不掉的悲伤 on 2019-12-14 02:05:49

Question: I'm trying to pipe an input stream (created from a huge GeoJSON file) through JSONStream.parse() to break the stream into objects, then through event-stream.map() to let me transform each object, then through JSONStream.stringify() to turn it back into a string, and finally into a writable output stream. As the process runs, I can see node's memory footprint grow until it eventually exhausts the heap. Here's the simplest script (test.js) that recreates the problem: const fs = require