Question:
I'm building a Dataflow pipeline in Python.
I want to share global variables across pipeline transforms and across worker nodes (i.e. something that behaves like a global variable visible to multiple workers).
Is there any way to do this?
Thanks in advance.
Answer 1:
Stateful processing may be of use for sharing state between the workers processing a given key (it would not be able to share state between transforms, though): https://beam.apache.org/blog/2017/02/13/stateful-processing.html
Source: https://stackoverflow.com/questions/44432556/is-there-anyway-to-share-stateful-variables-in-dataflow-pipeline