Calling beam.io.WriteToBigQuery in a beam.DoFn
Question: I've created a Dataflow template with some parameters. When I write the data to BigQuery, I would like to use these parameters to determine which table it should write to. I've tried calling WriteToBigQuery in a ParDo, as suggested in the following link: How can I write to Big Query using a runtime value provider in Apache Beam? The pipeline ran successfully, but it is not creating or loading any data into BigQuery. Any idea what the issue might be?

def run(): pipeline_options =