Is there a common format for neural networks?

生来不讨喜 2021-01-17 10:38

Different teams use different libraries to train and run neural networks (Caffe, Torch, Theano...). This makes sharing difficult: each library has its own format to store networks.

Is there a preferred (shared?) format to store neural networks? Is there a service or library that can help handle different types of networks, or transform one type into another?

2 Answers
  •  不思量自难忘°
    2021-01-17 11:29

    Is there a preferred (shared?) format to store neural networks?

    Each library / framework has its own serialization, e.g. Caffe uses Protocol Buffers, Torch has a built-in serialization scheme and Theano objects can be serialized with pickle.
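
    For illustration, a minimal sketch of the pickle route (the helper names and the `params` object are placeholders, not part of any framework's API):

        import pickle

        # Serialize a model's parameters to disk with pickle, as is commonly
        # done for Theano objects. `params` stands in for whatever structure
        # holds the weights (e.g. a list of numpy arrays).
        def save_params(params, path):
            with open(path, "wb") as f:
                pickle.dump(params, f, protocol=pickle.HIGHEST_PROTOCOL)

        def load_params(path):
            with open(path, "rb") as f:
                return pickle.load(f)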

    In some cases, like OverFeat or darknet, the weights and biases are stored on disk in binary format via plain fwrite calls on the corresponding contiguous float (or double) arrays (see this answer for more details). Note that this does not cover the architecture of the network / model, which has to be known or represented separately (e.g. declared explicitly at load time).
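
    A minimal sketch of reading such a raw dump with numpy (the file name and layer shapes below are hypothetical; the real ones come from the model definition, which the file itself does not contain):

        import numpy as np

        # Hypothetical shapes for one conv layer's kernels and biases; the
        # architecture must be known separately, the file only holds floats.
        layer_shapes = [(64, 3, 3, 3), (64,)]

        raw = np.fromfile("weights.bin", dtype=np.float32)  # plain fwrite output

        weights, offset = [], 0
        for shape in layer_shapes:
            n = int(np.prod(shape))
            weights.append(raw[offset:offset + n].reshape(shape))
            offset += n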

    Also: a library like libccv stores the structure and the weights in a SQLite database.
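
    Since it is just SQLite, such a file can be inspected with standard tools; a minimal sketch (the file name is hypothetical and the table schema is specific to libccv, so this only lists the tables):

        import sqlite3

        # Open the model file as an ordinary SQLite database and list its tables.
        conn = sqlite3.connect("model.sqlite3")
        tables = conn.execute(
            "SELECT name FROM sqlite_master WHERE type='table'"
        ).fetchall()
        print(tables)
        conn.close()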

    Is there a service or library that can help handle different types of networks / or transform one type into another?

    I don't think there is a single (meta) library that claims to do so, but there are distinct projects that provide convenient converters.

    Some examples (non-exhaustive):

    • Caffe -> Torch: https://github.com/szagoruyko/loadcaffe
    • Torch -> Caffe: https://github.com/facebook/fb-caffe-exts
    • Caffe -> TensorFlow: https://github.com/ethereon/caffe-tensorflow

    --

    UPDATE (2017-09): two notable initiatives are:

    (1) the ONNX format (a.k.a. Open Neural Network Exchange):

    [...] a standard for representing deep learning models that enables models to be transferred between frameworks

    See these blog posts.
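
    A minimal sketch of producing and checking an ONNX file (assumes PyTorch's built-in ONNX exporter and the onnx package are installed; the toy model is made up for illustration):

        import torch
        import torch.nn as nn
        import onnx

        # Export a toy model to the ONNX interchange format.
        model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))
        dummy_input = torch.randn(1, 4)
        torch.onnx.export(model, dummy_input, "model.onnx")

        # Load the serialized graph back and validate that it is well-formed.
        onnx_model = onnx.load("model.onnx")
        onnx.checker.check_model(onnx_model)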

    (2) the CoreML format introduced by Apple:

    [...] a public file format (.mlmodel) for a broad set of ML methods [...] Models in this format can be directly integrated into apps through Xcode.
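
    A minimal sketch of working with such a file from Python (assumes the coremltools package and an existing "model.mlmodel" file, both placeholders here):

        import coremltools

        # Load a Core ML model file and print its specification's description
        # (inputs, outputs, metadata).
        mlmodel = coremltools.models.MLModel("model.mlmodel")
        print(mlmodel.get_spec().description)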
