Question
Spark Streaming provides an API for termination, awaitTermination(). Is there a similar API in Flink to gracefully shut down a streaming job after some t seconds?
Answer 1:
Your driver program (i.e. the main method) in Flink doesn't stay running while the streaming job executes. Your program should define a dataflow, call execute(), and then terminate. In Spark, the driver program stays running (AFAIK), and awaitTermination() relates to that.
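For illustration, a minimal Flink streaming program in Java might look like the sketch below: define the dataflow, then call execute(). The socket source on localhost:9999 and the job name are assumptions for the example, not part of the original answer.

```java
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class WordPrintJob {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Define the dataflow: read lines from a socket and print them.
        // (localhost:9999 is just an illustrative, unbounded source.)
        env.socketTextStream("localhost", 9999)
           .print();

        // Submit the dataflow; with an unbounded source like this one,
        // the job runs until it is cancelled or stopped.
        env.execute("word-print-job");
    }
}
```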
Note that a Flink streaming dataflow continues to execute indefinitely unless you use a 'bounded' data source with a finite number of elements. You can also cancel or stop a running job, and stopping can take a savepoint from which the job can later be resumed.
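If the goal is specifically "shut down after t seconds", one possible sketch (not from the original answer; it assumes Flink 1.10+ and reuses the illustrative socket source above) is to submit the job with executeAsync() and cancel it through the returned JobClient once a timer elapses:

```java
import java.util.concurrent.TimeUnit;

import org.apache.flink.core.execution.JobClient;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class TimedShutdownJob {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Unbounded source, used only for illustration.
        env.socketTextStream("localhost", 9999)
           .print();

        // executeAsync() (available since Flink 1.10) submits the job and
        // returns a JobClient instead of blocking the client program.
        JobClient client = env.executeAsync("timed-shutdown-job");

        // Let the job run for t seconds, then cancel it from the client side.
        long t = 60;  // assumed value for "t seconds"
        TimeUnit.SECONDS.sleep(t);
        client.cancel().get();  // cancel() returns a CompletableFuture; get() waits for it
    }
}
```

Alternatively, a running job can be shut down from outside the program with the CLI, e.g. ./bin/flink cancel <jobID>, or ./bin/flink stop <jobID> to take a savepoint before stopping.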
Source: https://stackoverflow.com/questions/45330102/flink-streaming-how-to-control-the-execution-time