How can I serialize a numpy array while preserving matrix dimensions?

Asked 2020-12-04 13:14

numpy.array.tostring doesn't seem to preserve information about matrix dimensions (see this question), requiring the user to issue a call to numpy.array.reshape.
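
To make the problem concrete, a minimal sketch (the array here is just an example; tobytes is the modern alias of tostring): the raw buffer carries neither shape nor dtype, so the reader must supply both:

    import numpy as np

    a = np.arange(6).reshape(2, 3)
    buf = a.tobytes()                      # raw bytes: shape and dtype are lost

    b = np.frombuffer(buf, dtype=a.dtype)  # comes back flat, with shape (6,)
    b = b.reshape(2, 3)                    # the caller must remember the shape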

7 Answers
  • 2020-12-04 13:41

    Msgpack has the best serialization performance in this comparison: http://www.benfrederickson.com/dont-pickle-your-data/

    Use msgpack-numpy. See https://github.com/lebedov/msgpack-numpy

    Install it:

    pip install msgpack-numpy
    

    Then:

    import msgpack
    import msgpack_numpy as m
    import numpy as np
    
    x = np.random.rand(5)
    x_enc = msgpack.packb(x, default=m.encode)
    x_rec = msgpack.unpackb(x_enc, object_hook=m.decode)
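
    A quick check that the round trip preserves both the data and the shape; msgpack-numpy also ships an m.patch() helper that registers these hooks with msgpack globally, so packb/unpackb handle numpy types without extra arguments (a small sketch building on the snippet above):

    # verify the round trip preserved data, dtype and shape
    assert np.array_equal(x, x_rec)

    # alternatively, patch msgpack once and drop the default/object_hook arguments
    m.patch()
    assert np.array_equal(x, msgpack.unpackb(msgpack.packb(x)))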
    
  • 2020-12-04 13:43

    Try using numpy.array_repr or numpy.array_str.
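
    Both produce human-readable text rather than a compact encoding, and getting an array back requires eval, which is only safe for trusted input. A minimal sketch of the round trip (works for simple dtypes; the repr of e.g. float32 arrays also includes a dtype argument, which this namespace does not cover):

    import numpy as np

    a = np.arange(6).reshape(2, 3)
    s = np.array_repr(a)   # "array([[0, 1, 2],\n       [3, 4, 5]])"

    # eval the repr back into an array -- only do this with trusted input
    b = eval(s, {'array': np.array})
    assert np.array_equal(a, b)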

  • 2020-12-04 13:44

    EDIT: As one can read in the comments of the question, this solution deals with "normal" numpy arrays (floats, ints, bools, ...), not with multi-type structured arrays.

    Solution for serializing a numpy array of any dimensions and data types

    As far as I know, you cannot simply serialize a numpy array of arbitrary data type and dimension... but you can store its data type, shape, and contents in a list representation and then serialize that using JSON.

    Imports needed:

    import json
    import base64
    import numpy
    

    For encoding you could use (nparray is some numpy array of any data type and any dimensionality):

    json.dumps([str(nparray.dtype), base64.b64encode(nparray.tobytes()).decode('ascii'), nparray.shape])
    

    After this you get a JSON dump (string) of your data, containing a list representation of its data type and shape as well as the array's contents base64-encoded.

    And for decoding this does the work (encStr is the encoded JSON string, loaded from somewhere):

    # get the encoded json dump
    enc = json.loads(encStr)

    # build the numpy data type
    dataType = numpy.dtype(enc[0])

    # decode the base64-encoded contents and wrap them in a numpy array of this type
    dataArray = numpy.frombuffer(base64.b64decode(enc[1]), dataType)

    # restore the original dimensions; reshape returns a new view,
    # so the result has to be assigned back
    if len(enc) > 2:
        dataArray = dataArray.reshape(enc[2])
    

    JSON dumps are efficient and cross-compatible, but using JSON naively leads to unexpected results when you want to store and load numpy arrays of arbitrary type and dimension.

    This solution stores and loads numpy arrays regardless of type or dimension and restores them correctly (data type, shape, ...).

    I tried several solutions myself months ago and this was the only efficient, versatile solution I came across.
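
    Putting the encode and decode snippets above together, a quick round-trip check (the array's shape and dtype here are arbitrary example values):

    import json
    import base64
    import numpy

    original = numpy.random.rand(3, 4).astype(numpy.float32)
    encStr = json.dumps([str(original.dtype),
                         base64.b64encode(original.tobytes()).decode('ascii'),
                         original.shape])

    enc = json.loads(encStr)
    restored = numpy.frombuffer(base64.b64decode(enc[1]),
                                numpy.dtype(enc[0])).reshape(enc[2])

    assert numpy.array_equal(original, restored)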

  • 2020-12-04 13:50

    Try traitschema https://traitschema.readthedocs.io/en/latest/

    "Create serializable, type-checked schema using traits and Numpy. A typical use case involves saving several Numpy arrays of varying shape and type."

  • 2020-12-04 13:51

    I found the code in Msgpack-numpy helpful. https://github.com/lebedov/msgpack-numpy/blob/master/msgpack_numpy.py

    I modified the serialised dict slightly and added base64 encoding to reduce the serialised size.

    Because it exposes the same interface as json (providing load(s)/dump(s)), it can serve as a drop-in replacement for json serialisation.

    This same logic can be extended to add automatic serialisation of other non-trivial types, such as datetime objects (a sketch of such an extension follows the usage example below).


    EDIT: I've written a generic, modular parser that does this and more: https://github.com/someones/jaweson


    My code is as follows:

    np_json.py

    # re-export json's names so this module can stand in for json,
    # and import json itself for the wrapped load(s)/dump(s) below
    from json import *
    import json
    import numpy as np
    import base64
    
    def to_json(obj):
        if isinstance(obj, (np.ndarray, np.generic)):
            if isinstance(obj, np.ndarray):
                return {
                    '__ndarray__': base64.b64encode(obj.tobytes()).decode('ascii'),
                    'dtype': obj.dtype.str,
                    'shape': obj.shape,
                }
            elif isinstance(obj, (np.bool_, np.number)):
                return {
                    '__npgeneric__': base64.b64encode(obj.tobytes()).decode('ascii'),
                    'dtype': obj.dtype.str,
                }
        if isinstance(obj, set):
            return {'__set__': list(obj)}
        if isinstance(obj, tuple):
            return {'__tuple__': list(obj)}
        if isinstance(obj, complex):
            return {'__complex__': obj.__repr__()}
    
        # Let the base class default method raise the TypeError
        raise TypeError('Unable to serialise object of type {}'.format(type(obj)))
    
    
    def from_json(obj):
        # check for numpy
        if isinstance(obj, dict):
            if '__ndarray__' in obj:
                return np.frombuffer(
                    base64.b64decode(obj['__ndarray__']),
                    dtype=np.dtype(obj['dtype'])
                ).reshape(obj['shape'])
            if '__npgeneric__' in obj:
                return np.frombuffer(
                    base64.b64decode(obj['__npgeneric__']),
                    dtype=np.dtype(obj['dtype'])
                )[0]
            if '__set__' in obj:
                return set(obj['__set__'])
            if '__tuple__' in obj:
                return tuple(obj['__tuple__'])
            if '__complex__' in obj:
                return complex(obj['__complex__'])
    
        return obj
    
    # over-write the load(s)/dump(s) functions
    def load(*args, **kwargs):
        kwargs['object_hook'] = from_json
        return json.load(*args, **kwargs)
    
    
    def loads(*args, **kwargs):
        kwargs['object_hook'] = from_json
        return json.loads(*args, **kwargs)
    
    
    def dump(*args, **kwargs):
        kwargs['default'] = to_json
        return json.dump(*args, **kwargs)
    
    
    def dumps(*args, **kwargs):
        kwargs['default'] = to_json
        return json.dumps(*args, **kwargs)
    

    You should then be able to do the following:

    import numpy as np
    import np_json as json
    np_data = np.zeros((10,10), dtype=np.float32)
    new_data = json.loads(json.dumps(np_data))
    assert (np_data == new_data).all()
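
    As mentioned above, the same pattern extends to other non-trivial types. A hypothetical sketch for datetime objects (these function names are illustrative, not part of the module above; fromisoformat needs Python 3.7+):

    import datetime

    def to_json_ext(obj):
        # serialise datetimes as tagged ISO-8601 strings, defer to to_json for the rest
        if isinstance(obj, datetime.datetime):
            return {'__datetime__': obj.isoformat()}
        return to_json(obj)

    def from_json_ext(obj):
        # restore tagged datetimes, defer to from_json for the rest
        if isinstance(obj, dict) and '__datetime__' in obj:
            return datetime.datetime.fromisoformat(obj['__datetime__'])
        return from_json(obj)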
    
  • 2020-12-04 13:56

    If it needs to be human readable and you know that this is a numpy array:

    import numpy as np
    import json

    a = np.random.normal(size=(50, 120, 150))
    a_reconstructed = np.asarray(json.loads(json.dumps(a.tolist())))
    print(np.allclose(a, a_reconstructed))
    print((a == a_reconstructed).all())
    

    Maybe not the most efficient as array sizes grow larger, but it works well for smaller arrays. Note that the round trip goes through Python lists, so the exact dtype is not preserved (a float32 array, for example, comes back as float64).
