Is there a standard mapping between JSON and Protocol Buffers?

Asked by 后悔当初 on 2020-12-13 19:02

From a comment on the announcement blog post:

Regarding JSON: JSON is structured similarly to Protocol Buffers, but protocol buffer binary format […]

6 Answers
  • 2020-12-13 19:19

    One further thought: if protobuf objects have getters/setters, or appropriately named fields, one could simply use the Jackson JSON processor's data binding. By default it handles public getters, any setters, and public fields, but these are just default visibility levels and can be changed. If so, Jackson can serialize/deserialize protobuf-generated POJOs without problems.

    I have actually used this approach with Thrift-generated objects; the only thing I had to configure there was to disable serialization of various "isXXX()" methods that Thrift adds for checking if a field has been explicitly assigned or not.
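    The data-binding idea above can be sketched as follows. Note this is a minimal illustration, not the answerer's code: `PersonPojo` is a hypothetical stand-in for a generated class with bean-style accessors, and only Jackson's default visibility rules are relied on.

    ```java
    import com.fasterxml.jackson.databind.ObjectMapper;

    public class JacksonBindingDemo {
        // Hypothetical bean standing in for a generated class: a public getter
        // and setter are all Jackson's default data binding needs.
        public static class PersonPojo {
            private String name;
            public PersonPojo() { }
            public PersonPojo(String name) { this.name = name; }
            public String getName() { return name; }
            public void setName(String name) { this.name = name; }
        }

        public static void main(String[] args) throws Exception {
            ObjectMapper mapper = new ObjectMapper();
            // Serialize: Jackson discovers the "name" property via getName()
            String json = mapper.writeValueAsString(new PersonPojo("Bill Monroe"));
            System.out.println(json); // {"name":"Bill Monroe"}
            // Deserialize back through the no-arg constructor and setter
            PersonPojo back = mapper.readValue(json, PersonPojo.class);
            System.out.println(back.getName());
        }
    }
    ```

    With real protobuf- or Thrift-generated classes you would, as noted above, typically also tune visibility or ignore the generated helper methods (e.g. the `isXXX()` methods) via Jackson's configuration.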

  • 2020-12-13 19:21

    I needed to marshal from GeneratedMessageLite to a JSON object but did not need to unmarshal. I couldn't use the protobuf library in Pangea's answer because it doesn't work with the LITE_RUNTIME option. I also didn't want to burden our already large legacy system with generating more compiled code for the existing protocol buffers. For marshalling to JSON, I went with this simple Gson-based solution:

        // Gson (com.google.gson.Gson) serializes the message via field reflection
        final Person gpb = Person.newBuilder().setName("Bill Monroe").build();
        final Gson gson = new Gson();
        final String jsonString = gson.toJson(gpb);
    
  • 2020-12-13 19:23

    Yes. Since Protocol Buffers version 3.0.0 (released July 28, 2016) there is "A well-defined encoding in JSON as an alternative to binary proto encoding", as mentioned in the release notes:

    https://github.com/google/protobuf/releases/tag/v3.0.0
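    In protobuf-java, this canonical JSON mapping is exposed through `JsonFormat` in the `protobuf-java-util` artifact. A minimal sketch (using the bundled `Struct` well-known type purely so the example needs no `.proto` of its own; any generated message works the same way):

    ```java
    import com.google.protobuf.Struct;
    import com.google.protobuf.Value;
    import com.google.protobuf.util.JsonFormat;

    public class ProtoJsonDemo {
        public static void main(String[] args) throws Exception {
            // Build a message; Struct ships with protobuf-java, so no codegen needed
            Struct msg = Struct.newBuilder()
                    .putFields("name", Value.newBuilder().setStringValue("Bill Monroe").build())
                    .build();

            // Message -> JSON using the well-defined proto3 JSON encoding
            String json = JsonFormat.printer().print(msg);
            System.out.println(json);

            // JSON -> Message
            Struct.Builder parsed = Struct.newBuilder();
            JsonFormat.parser().merge(json, parsed);
            System.out.println(parsed.getFieldsOrThrow("name").getStringValue());
        }
    }
    ```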

  • 2020-12-13 19:30

    From what I have seen, Protostuff is the project to use for any PB work in Java, including serializing it as JSON based on the protocol definition. I have not used it myself, I've just heard good things.

  • 2020-12-13 19:31

    Maybe this is helpful: http://code.google.com/p/protobuf-java-format/

  • 2020-12-13 19:43

    First of all, I think one should reason very carefully before putting effort into converting a dataset to protobufs. Here are my reasons to convert a dataset to protobufs:

    1. Type safety: a guarantee on the format of the data being considered.
    2. The uncompressed memory footprint of the data. I mention uncompressed because, after compression, there isn't much difference between the sizes of compressed JSON and compressed proto, while compression has a cost of its own. Also, serialization/deserialization speed is almost similar; in fact, Jackson JSON is faster than protobufs. Please check out the following link for more information: http://technicalrex.com/2014/06/23/performance-playground-jackson-vs-protocol-buffers/
    3. The protobufs need to be transferred over the network a lot.

    That said, once you convert your dataset to Jackson JSON in the shape that the ProtoBuf definition specifies, it can very easily be mapped directly to ProtoBuf format using Protostuff's JsonIOUtil.mergeFrom function. Signature of the function:

    public static <T> void mergeFrom(JsonParser parser, T message, Schema<T> schema,  boolean numeric) 
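    A hedged sketch of how that merge might be used, assuming the `protostuff-core`, `protostuff-json`, and `protostuff-runtime` artifacts. The `Person` POJO is a made-up example, and `RuntimeSchema` (which derives a `Schema<T>` from a plain class at runtime) is one of several ways to obtain the schema; the `byte[]` overload of `mergeFrom` is used here instead of the `JsonParser` one for brevity:

    ```java
    import io.protostuff.JsonIOUtil;
    import io.protostuff.Schema;
    import io.protostuff.runtime.RuntimeSchema;

    public class ProtostuffJsonDemo {
        // Hypothetical POJO standing in for your ProtoBuf-shaped data
        public static class Person {
            public String name;
        }

        public static void main(String[] args) throws Exception {
            // Derive a Schema<Person> from the POJO at runtime
            Schema<Person> schema = RuntimeSchema.getSchema(Person.class);
            Person person = schema.newMessage();

            byte[] json = "{\"name\":\"Bill Monroe\"}".getBytes("UTF-8");
            // numeric=false: the JSON uses field names rather than field numbers
            JsonIOUtil.mergeFrom(json, person, schema, false);
            System.out.println(person.name);
        }
    }
    ```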
    

    Reference: the Protostuff project.
