yajl

YAJL memory leak problem array

Submitted by 余生颓废 on 2020-02-07 06:46:31
Question: I am trying to fetch JSON data every 2 seconds and pass it to another class for processing. Everything works fine, but the code below seems to have memory leaks (according to Instruments) and I cannot figure out what is wrong or how to fix it. Can someone please advise? * Updated with the full logic: it looks like the array that is passed on to the main method is leaking, and Instruments is falsely reporting that as a YAJL leak (not entirely sure, though) * @property (nonatomic,retain,readwrite)

error when importing ijson module python

Submitted by 柔情痞子 on 2019-12-24 06:39:49
Question: I need to parse some large (2 GB+) files in Python. I have tried the json module, but I get a memory error since its methods all load the file at once. I then moved on to installing ijson, which supposedly implements an iterator-based way of parsing the file. However, when I run: import ijson I get the exception: YAJL shared object not found. Has anyone hit a similar issue? Any help would be greatly appreciated. Regards Answer 1: That's an easy one: that is because you haven't installed the YAJL

what is the hash structure to produce json?

Submitted by ↘锁芯ラ on 2019-12-13 18:08:09
Question: Below is a sample of the final JSON I pass to JavaScript. I will use the (yajl) Ruby library to create this JSON from a hash. The question is: what should a Ruby hash that produces the JSON below look like? var data = [{ data: "basics", attr: {}, children: [ {data: "login", attr: {run: "run"}, children: [ {data: "login", attr: {}} ] } , {data: "Academic Year", attr: {run: "run"}, children: [ {data: "login", attr: {}}, {data: "Academic Year", attr: {filter: "mini", SOF: "yes"}} ] } ] }]; Answer 1: Your

Parse complex JSON sub objects in C with YAJL

Submitted by 試著忘記壹切 on 2019-12-10 21:11:51
Question: I have YAJL parsing simple elements (strings, integers, arrays, ...) like those in the included example without a problem. The example code can be found here: http://lloyd.github.io/yajl/yajl-2.0.1/example_2parse_config_8c-example.html But now I have this type of JSON object: { "cmd":2, "properties": [ { "idx":40, "val":8813.602692 }, { "idx":41, "val":960 }, { "idx":42, "val":2 }, { "idx":48, "val":9 } ] } I can retrieve the command with (see the definitions of the used variables in the

How can I process huge JSON files as streams in Ruby, without consuming all memory?

Submitted by a 夏天 on 2019-12-05 03:49:44
I'm having trouble processing a huge JSON file in Ruby. What I'm looking for is a way to process it entry by entry without keeping too much data in memory. I thought the yajl-ruby gem would do the job, but it consumes all my memory. I've also looked at the Yajl::FFI and JSON::Stream gems, but there it is clearly stated: For larger documents we can use an IO object to stream it into the parser. We still need room for the parsed object, but the document itself is never fully read into memory. Here's what I've done with Yajl: file_stream = File.open(file, "r") json = Yajl::Parser.parse(file_stream)

Encoding custom classes using yajl-objc

Submitted by 不问归期 on 2019-12-04 18:12:11
Summary. Based on some benchmarks, I chose yajl-objc as my iPhone JSON parser. I was testing it with an arbitrary custom class (one NSNumber and two NSString properties). If I created an NSDictionary with key-value pairs matching the class properties, I could encode the dictionary with [dictionary yajl_JSON]. When I tried directly encoding an instance of the custom class with [custom yajl_JSON], I got this runtime exception: Terminating app due to uncaught exception 'YAJLParsingUnsupportedException', reason: 'Object of type (Custom) must implement dataUsingEncoding: to be parsed'. I got the

Parse large JSON hash with ruby-yajl?

Submitted by 邮差的信 on 2019-11-30 09:41:14
I have a large file (>50 MB) which contains a JSON hash. Something like: { "obj1": { "key1": "val1", "key2": "val2" }, "obj2": { "key1": "val1", "key2": "val2" } ... } Rather than parsing the entire file and taking, say, the first ten elements, I'd like to parse each item in the hash. I actually don't care about the key, i.e. obj1. If I convert the above to this: { "key1": "val1", "key2": "val2" } "obj2": { "key1": "val1", "key2": "val2" } I can easily achieve what I want using Yajl streaming: io = File.open(path_to_file) count = 10 Yajl::Parser.parse(io) do |obj| puts "Parsed: #{obj}" count -=