shutdown and update job in Google Dataflow with PubSubIO + message guarantees

Asked by 不知归路 on 2021-01-14 23:15

I have been looking through the source and documentation for Google Dataflow, and I didn't see any mention of the message delivery semantics around PubSubIO.Read.

1 Answer
  • Answered 2021-01-15 00:03

    Dataflow doesn't ack messages to Pub/Sub until they have been persisted in intermediate storage within the pipeline (or emitted to the sink, if there is no GroupByKey in the pipeline). We also deduplicate messages read from Pub/Sub for a short period to prevent duplicate delivery from missed acks. So Dataflow guarantees exactly-once delivery, modulo any duplicates that publishers insert at widely separated times (outside that dedup window).
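
    For reference, here is a minimal sketch of such a streaming read using the Apache Beam Java SDK (the project, subscription, and id attribute names are placeholder assumptions, not from the original question):

        import org.apache.beam.sdk.Pipeline;
        import org.apache.beam.sdk.io.gcp.pubsub.PubsubIO;
        import org.apache.beam.sdk.options.PipelineOptionsFactory;
        import org.apache.beam.sdk.options.StreamingOptions;

        public class PubsubReadSketch {
          public static void main(String[] args) {
            StreamingOptions options =
                PipelineOptionsFactory.fromArgs(args).withValidation().as(StreamingOptions.class);
            options.setStreaming(true);

            Pipeline p = Pipeline.create(options);

            // Dataflow acks each message only after it has been durably
            // persisted, so a crash before that point means redelivery,
            // not data loss.
            p.apply("ReadFromPubsub",
                PubsubIO.readStrings()
                    .fromSubscription("projects/my-project/subscriptions/my-subscription")
                    // Optional: if publishers stamp each message with a unique
                    // attribute, Dataflow can also dedup publisher-side retries.
                    .withIdAttribute("uniqueMessageId"));

            p.run();
          }
        }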

    Any intermediate state buffered within a running pipeline is preserved when the pipeline is updated. Streaming pipelines do not fail; instead, they continue to retry elements that hit errors. Either the error is transient and the element will eventually be processed successfully, or, in the case of a consistent exception (a NullPointerException in your code, etc.), you can update the job with corrected code, which will then be used to process the failing element.
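
    As a sketch of the update path (again Beam Java SDK on the Dataflow runner; the job name is a placeholder): relaunching the corrected pipeline with the same job name and the update option set replaces the code in place while the buffered state is carried over.

        import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
        import org.apache.beam.sdk.Pipeline;
        import org.apache.beam.sdk.options.PipelineOptionsFactory;

        public class UpdateJobSketch {
          public static void main(String[] args) {
            // Launch with --runner=DataflowRunner and the same --project/--region
            // as the job being replaced.
            DataflowPipelineOptions options =
                PipelineOptionsFactory.fromArgs(args).as(DataflowPipelineOptions.class);
            options.setJobName("my-streaming-job"); // must match the running job
            options.setUpdate(true); // request an in-place update, not a new job

            Pipeline p = Pipeline.create(options);
            // ... rebuild the same pipeline shape here, with the fixed DoFn ...
            p.run();
          }
        }

    Dataflow checks that the old and new job graphs are compatible before transferring the buffered state; if they are not, the update is rejected and the original job keeps running.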

    (Note that the implementation is different for the DirectRunner, which may be causing confusion if you are looking at that part of the code.)
