Airflow - run task regardless of upstream success/fail

眼角桃花 2021-01-03 21:53

I have a DAG which fans out to multiple independent units in parallel. This runs in AWS, so we have tasks which scale our AutoScalingGroup up to the maximum number of workers when the DAG starts, and scale it back down when the DAG completes. The scale-down task needs to run even if some of the upstream tasks fail.

1 Answer
  • 2021-01-03 22:27

    All operators accept a trigger_rule argument. Setting it to 'all_done' makes the task run once all upstream tasks have finished, regardless of whether they succeeded or failed.

    The default is 'all_success', so set the trigger rule of the task you want to always run to 'all_done' instead.

    A simple BashOperator task with that argument would look like:

    from airflow.operators.bash import BashOperator

    task = BashOperator(
        task_id="hello_world",
        bash_command="echo Hello World!",
        trigger_rule="all_done",  # run even if upstream tasks failed
        dag=dag,
    )
    