Split a huge (95 MB) JSON array into smaller chunks?

Backend · Unresolved · 4 answers · 1122 views
不知归路 2021-01-18 06:04

I exported some data from my database as JSON; it is essentially one [list] with a bunch (900K) of {objects} inside it.

Trying to import it on my

4 Answers
  •  不思量自难忘°
    2021-01-18 06:32

    In Python (the original answer used Python 2's xrange; range works here in both Python 2 and 3):

    import json

    with open('file.json') as infile:
        o = json.load(infile)  # loads the whole 95 MB array into memory

    chunk_size = 1000
    for i in range(0, len(o), chunk_size):
        # one output file per slice of chunk_size objects
        with open('file_' + str(i // chunk_size) + '.json', 'w') as outfile:
            json.dump(o[i:i + chunk_size], outfile)
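
    A self-contained sketch of the same approach, run end to end against a generated sample file so the chunk count can be checked. The record fields and the 2500-item size are made up for illustration; the real export had ~900K objects.

    ```python
    import json
    import os
    import tempfile

    # Generate a sample array standing in for the exported list
    # (field names here are hypothetical, not from the question).
    records = [{"id": i, "value": i * 2} for i in range(2500)]

    tmpdir = tempfile.mkdtemp()
    src = os.path.join(tmpdir, "file.json")
    with open(src, "w") as f:
        json.dump(records, f)

    # Same chunking logic as the answer above
    chunk_size = 1000
    with open(src) as infile:
        data = json.load(infile)  # whole array in memory

    for i in range(0, len(data), chunk_size):
        out_path = os.path.join(tmpdir, "file_%d.json" % (i // chunk_size))
        with open(out_path, "w") as outfile:
            json.dump(data[i:i + chunk_size], outfile)

    chunk_files = sorted(p for p in os.listdir(tmpdir) if p.startswith("file_"))
    print(len(chunk_files))  # 2500 records / 1000 per chunk -> 3 output files
    ```

    Note that json.load still needs the full array in memory at once; if the file were too large for that, a streaming parser such as the third-party ijson library can iterate over array items one at a time instead.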
    
