I am working on a project where I need to parse JSON from a URL through HttpClient. My code works fine for JSON object responses with a small amount of data, but it breaks down when the response contains a large amount of data.
Large JSON payloads should be cut into smaller pieces. For example, suppose you have 50,000 products in your database. In that case it is wise to paginate the API requests: fetch that huge amount of products 100-500 records per query (there is a sketch of this below). That will solve your problem.
This approach solves one more problem as well: errors caused by losing the internet or GPRS connection partway through a transfer.
If the API is yours, you can change it yourself. If not, this is a serious flaw in the API design, and you can submit a change request.
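As a rough sketch, a paginated fetch loop could look like the following. It uses Apache HttpClient; the endpoint URL, the offset/limit parameter names, and the processPage helper are all hypothetical, so adapt them to whatever the real API supports:

```java
import org.apache.http.HttpResponse;
import org.apache.http.client.HttpClient;
import org.apache.http.client.methods.HttpGet;
import org.apache.http.impl.client.DefaultHttpClient;
import org.apache.http.util.EntityUtils;

public class PagedFetch {
    public static void main(String[] args) throws Exception {
        HttpClient client = new DefaultHttpClient();
        int pageSize = 200; // somewhere in the 100-500 range suggested above
        int offset = 0;
        while (true) {
            // Hypothetical endpoint and parameter names.
            HttpGet get = new HttpGet("http://example.com/api/products?offset="
                    + offset + "&limit=" + pageSize);
            HttpResponse response = client.execute(get);
            String json = EntityUtils.toString(response.getEntity(), "UTF-8");

            int received = processPage(json); // parse one small chunk as you already do
            if (received < pageSize) {
                break; // a short page means we have reached the end
            }
            offset += pageSize;
        }
    }

    // Placeholder: parse the page with your existing JSON code
    // and return how many records it contained.
    private static int processPage(String json) {
        return 0;
    }
}
```

Each request now carries a bounded payload, and if the connection drops you only have to retry the page that failed instead of the whole download.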
EDIT:
Did a little reading and found that http://jackson.codehaus.org/ (the Jackson processor) is highly recommended for parsing large amounts of JSON data. I haven't tried it, so I cannot vouch for the library myself. I also recommend saving the JSON stream to a file (instead of loading it all into memory) and then parsing it in chunks.
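For what it's worth, here is a minimal sketch of that idea: stream the response to disk, then walk the file token by token with Jackson's streaming JsonParser so only one token sits in memory at a time. The imports use the newer com.fasterxml package names (the codehaus 1.x API linked above is nearly identical), and the URL, file name, and "name" field are hypothetical:

```java
import java.io.File;
import java.io.FileOutputStream;
import java.io.InputStream;

import org.apache.http.HttpResponse;
import org.apache.http.client.HttpClient;
import org.apache.http.client.methods.HttpGet;
import org.apache.http.impl.client.DefaultHttpClient;

import com.fasterxml.jackson.core.JsonFactory;
import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.core.JsonToken;

public class StreamedParse {
    public static void main(String[] args) throws Exception {
        // 1. Copy the HTTP response straight to a file instead of a huge in-memory String.
        HttpClient client = new DefaultHttpClient();
        HttpResponse response = client.execute(new HttpGet("http://example.com/api/products"));
        File file = new File("products.json");
        InputStream in = response.getEntity().getContent();
        FileOutputStream out = new FileOutputStream(file);
        byte[] buffer = new byte[8192];
        int n;
        while ((n = in.read(buffer)) != -1) {
            out.write(buffer, 0, n);
        }
        out.close();
        in.close();

        // 2. Parse the file incrementally, assuming a top-level array of objects.
        JsonParser parser = new JsonFactory().createParser(file);
        if (parser.nextToken() == JsonToken.START_ARRAY) {
            while (parser.nextToken() == JsonToken.START_OBJECT) {
                while (parser.nextToken() != JsonToken.END_OBJECT) {
                    String field = parser.getCurrentName();
                    parser.nextToken(); // advance to the field's value
                    if ("name".equals(field)) { // hypothetical field
                        System.out.println(parser.getText());
                    } else {
                        parser.skipChildren(); // no-op for scalars, skips nested structures
                    }
                }
            }
        }
        parser.close();
    }
}
```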