Question
As far as I know, in Kafka we can define an Avro schema for a topic, and that topic will then only accept data matching the schema. This is really useful for validating the structure of data before it is accepted into the queue.
Is there anything similar in Google Pub/Sub?
Answer 1:
Kafka itself does not validate schemas, so topics have no inherent schema; a topic only stores pairs of byte arrays plus some metadata. It's the serializer in the producing client that performs the validation before the data reaches the topic. Similarly, at the end of the day, Pub/Sub is only storing and sending byte[] data.
Therefore, in theory, it's entirely feasible to use something similar to the Confluent Avro Schema Registry on either end of the data that moves through Pub/Sub. As far as I know, Google does not offer such a feature, so you would need to recreate a service that performs your Avro compatibility checks, and then wrap a Pub/Sub serialization + producer client around that service's client.
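The client-side pattern described above (validate a record against a schema, then serialize it to the bytes Pub/Sub actually stores) can be sketched as follows. This is an illustrative stand-in using JSON and a hand-rolled type check; a real implementation would use an Avro library (e.g. fastavro) for the schema and the google-cloud-pubsub client for publishing, neither of which is prescribed by the answer:

```python
import json

# Minimal illustrative "schema": field name -> expected Python type.
# A real setup would use an Avro schema and an Avro library instead.
USER_SCHEMA = {"name": str, "age": int}

def validate(record: dict, schema: dict) -> None:
    """Raise ValueError if the record does not match the schema."""
    for field, ftype in schema.items():
        if field not in record:
            raise ValueError(f"missing field: {field}")
        if not isinstance(record[field], ftype):
            raise ValueError(f"field {field!r} is not {ftype.__name__}")

def serialize(record: dict, schema: dict) -> bytes:
    """Validate, then encode to the byte[] payload Pub/Sub stores."""
    validate(record, schema)
    return json.dumps(record).encode("utf-8")

# A publisher wrapper would then call something like:
#   publisher.publish(topic_path, data=serialize(record, USER_SCHEMA))
# where publisher/topic_path come from the google-cloud-pubsub client.

payload = serialize({"name": "alice", "age": 30}, USER_SCHEMA)
print(payload)

try:
    serialize({"name": "bob"}, USER_SCHEMA)
except ValueError as e:
    print("rejected:", e)  # rejected: missing field: age
```

The key point is that the validation lives entirely in the producing client, exactly as it does with Kafka serializers; the broker (or Pub/Sub) never sees anything but bytes.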
You might also want to check out Avro message for Google Cloud Pub-Sub?
Source: https://stackoverflow.com/questions/53200289/is-it-possible-to-define-a-schema-for-google-pub-sub-topics-like-in-kafka-with-a