Question:
Is there a way to pass a parameter to:
airflow trigger_dag dag_name {param}
?
I have a script that monitors a directory for files - when a file gets moved into the target directory, I want to trigger the DAG, passing the file path as a parameter.
Answer 1:
You can pass it like this (note the quotes around the JSON):
airflow trigger_dag --conf '{"file_variable": "/path/to/file"}' dag_id
Then, in your DAG, you can access this variable using templating as follows:
{{ dag_run.conf.file_variable }}
If this doesn't work, sharing a simple version of your DAG might help in getting better answers.
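If you prefer to read the value in Python code instead of through a template, dag_run.conf is also available from the task context. A minimal sketch, assuming Airflow 1.x-style imports; the dag_id, task_id, and conf key are only illustrative:

from datetime import datetime

from airflow import DAG
from airflow.operators.python_operator import PythonOperator

dag = DAG(
    dag_id='file_triggered_dag',
    start_date=datetime(2017, 1, 1),
    schedule_interval=None)

def process_file(**context):
    # dag_run is only set when the DAG run was created by trigger_dag
    dag_run = context.get('dag_run')
    file_path = dag_run.conf.get('file_variable') if dag_run else None
    print('Processing file: %s' % file_path)

process_task = PythonOperator(
    task_id='process_file',
    python_callable=process_file,
    provide_context=True,
    dag=dag)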
Answer 2:
Yes, you can. Your DAG should define a DAG object and a Bash task like this:
from datetime import datetime

from airflow import DAG
from airflow.operators.bash_operator import BashOperator

args = {
    'start_date': datetime.now(),
    'owner': 'airflow',
}

dag = DAG(
    dag_id='param_dag',
    default_args=args,
    schedule_interval=None)

# bashscript.sh is the script you want to run; dag_run.conf holds the parameter you pass at trigger time
bash_task = BashOperator(
    task_id='bash_task',
    bash_command='bash ~/path/bashscript.sh {{ dag_run.conf["parameter"] if dag_run else "" }} ',
    dag=dag)
Now, on your command line, just type the command:
airflow trigger_dag param_dag --conf '{"parameter": "~/path"}'
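For the original use case (a script watching a directory and triggering the DAG when a file appears), the same CLI call can be issued from the monitoring script. A minimal sketch, assuming a simple polling loop; the watch directory is a placeholder and the DAG/conf key match the example above:

import json
import os
import subprocess
import time

WATCH_DIR = '/path/to/watch'  # placeholder: directory being monitored
seen = set()

while True:
    for name in os.listdir(WATCH_DIR):
        path = os.path.join(WATCH_DIR, name)
        if path not in seen:
            seen.add(path)
            conf = json.dumps({'parameter': path})
            # same CLI call as above, run from the watcher script
            subprocess.check_call(['airflow', 'trigger_dag', 'param_dag', '--conf', conf])
    time.sleep(10)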
Source: https://stackoverflow.com/questions/44363243/airflow-pass-parameter-from-cli