Is there a standard mapping between JSON and Protocol Buffers?

后悔当初 2020-12-13 19:02

From a comment on the announcement blog post:

Regarding JSON: JSON is structured similarly to Protocol Buffers, but protocol buffer binary format

6 Answers
  •  醉梦人生
    2020-12-13 19:43

    First of all, I think one should weigh carefully whether converting a dataset to protobufs is worth the effort. Here are my reasons for converting a dataset to protobufs:

    1. Type safety: a guarantee on the format of the data being considered.
    2. A smaller uncompressed memory footprint of the data. I say uncompressed because after compression there isn't much difference between the size of compressed JSON and compressed proto, and compression has a cost of its own. Serialization/deserialization speed is also very similar; in fact, Jackson JSON is faster than protobufs. See the following link for more information: http://technicalrex.com/2014/06/23/performance-playground-jackson-vs-protocol-buffers/
    3. The protobufs need to be transferred over the network frequently.
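    The compression point above can be checked with a small stdlib-only sketch (the record shape and counts here are illustrative, not from a real benchmark): JSON repeats every field name in every record, which is exactly the redundancy gzip squeezes out, so the compressed sizes of JSON and proto end up much closer than the raw sizes.

    ```java
    import java.io.ByteArrayOutputStream;
    import java.io.IOException;
    import java.nio.charset.StandardCharsets;
    import java.util.zip.GZIPOutputStream;

    public class CompressionDemo {
        // Gzip a byte array and return the compressed bytes.
        static byte[] gzip(byte[] data) throws IOException {
            ByteArrayOutputStream bos = new ByteArrayOutputStream();
            try (GZIPOutputStream gz = new GZIPOutputStream(bos)) {
                gz.write(data);
            }
            return bos.toByteArray();
        }

        public static void main(String[] args) throws IOException {
            // Repetitive JSON: the field names "userId" and "active" repeat in
            // every record. Protobuf replaces them with numeric tags; gzip
            // removes most of the same redundancy after the fact.
            StringBuilder sb = new StringBuilder("[");
            for (int i = 0; i < 1000; i++) {
                if (i > 0) sb.append(',');
                sb.append("{\"userId\":").append(i).append(",\"active\":true}");
            }
            sb.append(']');
            byte[] raw = sb.toString().getBytes(StandardCharsets.UTF_8);
            byte[] compressed = gzip(raw);
            System.out.println("raw bytes:        " + raw.length);
            System.out.println("compressed bytes: " + compressed.length);
        }
    }
    ```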

    That said, once you convert your dataset to Jackson JSON in a form matching the ProtoBuf definition, it can be mapped directly to ProtoBuf format using Protostuff's JsonIOUtil.mergeFrom function. Signature of the function:

    public static <T> void mergeFrom(JsonParser parser, T message, Schema<T> schema, boolean numeric)
    

    Reference to protostuff
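    A minimal sketch of the call, assuming protostuff-core and protostuff-runtime are on the classpath; the `User` POJO and its fields are hypothetical placeholders whose names are assumed to match the JSON keys:

    ```java
    // Sketch only: requires the protostuff-core and protostuff-runtime artifacts.
    import io.protostuff.JsonIOUtil;
    import io.protostuff.Schema;
    import io.protostuff.runtime.RuntimeSchema;

    public class JsonToProtoDemo {
        // Hypothetical POJO; field names must match the JSON keys.
        public static class User {
            public int userId;
            public boolean active;
        }

        public static void main(String[] args) throws Exception {
            byte[] json = "{\"userId\":42,\"active\":true}"
                    .getBytes(java.nio.charset.StandardCharsets.UTF_8);

            // RuntimeSchema derives a protostuff Schema from the POJO at runtime.
            Schema<User> schema = RuntimeSchema.getSchema(User.class);
            User user = schema.newMessage();

            // numeric = false: the JSON uses field names rather than tag numbers.
            JsonIOUtil.mergeFrom(json, user, schema, false);
        }
    }
    ```

    This uses the byte-array overload of `mergeFrom`; the `JsonParser` overload shown above is equivalent when you already have a Jackson parser open on the input.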
