I am trying to serialize a large (~10**6 rows, each with ~20 values) list, to be used later by myself (so pickle's lack of safety isn't a concern).
Each row of the list holds ~20 values.
For hundreds of thousands of simple Python objects (up to JSON-compatible complexity), I've found the best combination of simplicity, speed, and size by combining gzip compression with UBJSON serialization:
It beats the pickle and cPickle options by orders of magnitude.
import gzip
import ubjson

def save_items(items, filename):
    with gzip.open(filename, 'wb') as f:
        ubjson.dump(items, f)

def load_items(filename):
    with gzip.open(filename, 'rb') as f:
        return ubjson.load(f)
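For reference, the same gzip-wrapping pattern works with the stdlib json module if you'd rather avoid the third-party ubjson dependency (at some cost in size and speed). A minimal roundtrip sketch; the function and variable names here are illustrative, not part of any library:

```python
import gzip
import json
import os
import tempfile

def save_rows(rows, filename):
    # Serialize to JSON text and gzip-compress in one pass
    with gzip.open(filename, 'wt', encoding='utf-8') as f:
        json.dump(rows, f)

def load_rows(filename):
    # Decompress and parse back into Python lists
    with gzip.open(filename, 'rt', encoding='utf-8') as f:
        return json.load(f)

# Roundtrip a small sample of rows
rows = [[i, i * 2, 'value'] for i in range(3)]
path = os.path.join(tempfile.mkdtemp(), 'rows.json.gz')
save_rows(rows, path)
restored = load_rows(path)
```

Swapping json for ubjson in the two dump/load calls gives the binary variant above; everything else stays the same.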