Airflow using template files for PythonOperator

遥遥无期 · 2021-01-31 04:49

The method of getting a BashOperator or SqlOperator to pick up an external file for its template is fairly clearly documented, but looking at the PythonOperator, it is not clear how to do the same.

4 Answers
  •  既然无缘
    2021-01-31 05:40

I was unable to get a templated script file working with PythonOperator (I'm new to Python), but the following BashOperator example works and may give you some hints.

    from datetime import datetime
    from airflow import DAG
    from airflow.operators.bash_operator import BashOperator
    
    default_args = {
        'owner': 'airflow',
        'depends_on_past': False,
        #'start_date': airflow.utils.dates.days_ago(2),
        'email': ['airflow@airflow.com']}
    
    dag = DAG('sr5', description='Simple tutorial DAG',
              schedule_interval='0 12 * * *',
              start_date=datetime(2017, 3, 20),
              catchup=False,  # so that on scheduler restart, it doesn't try to catch up on all the missed runs
              template_searchpath=['/Users/my_name/Desktop/utils/airflow/resources'])
    
    t1 = BashOperator(
        task_id='t1',
        depends_on_past=False,
        params={
            'ds1': 'hie'},
        bash_command="01.sh",
        dag=dag)
    

The 01.sh script looks as follows:

    #!/bin/sh
    
    echo {{ ds }}
    echo {{ params.ds1 }}
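Under the hood, Airflow resolves `01.sh` via `template_searchpath` and renders it with Jinja2 before executing it. A minimal standalone sketch of that rendering step (using `jinja2` directly, outside Airflow; the values for `ds` and `params` are taken from the example run below):

```python
# Standalone sketch of the rendering Airflow performs on 01.sh before
# executing it. jinja2 is the templating engine Airflow itself uses;
# the substituted values mimic the test execution shown below.
from jinja2 import Template

script = "#!/bin/sh\necho {{ ds }}\necho {{ params.ds1 }}\n"
rendered = Template(script).render(ds="2017-05-05", params={"ds1": "hie"})
print(rendered)
```

The rendered text is what the BashOperator actually hands to the shell, which is why the `echo` lines in the output show the substituted values rather than the raw `{{ ... }}` placeholders.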
    

This gives the following output on a test execution:

    [2017-05-12 08:31:52,981] {bash_operator.py:91} INFO - Output:
    [2017-05-12 08:31:52,984] {bash_operator.py:95} INFO - 2017-05-05
    [2017-05-12 08:31:52,984] {bash_operator.py:95} INFO - hie
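For the original question about PythonOperator specifically: one approach that PythonOperator supports is to list the external file in `templates_dict` and register its extension in `templates_exts`; Airflow then renders the file's contents and passes the result to the callable via the task context. The sketch below assumes an Airflow 1.x setup (hence `provide_context=True` and the `python_operator` import path); the task id and `my_query.sql` file name are made-up examples, and the search path reuses the one from the BashOperator answer above.

```python
# Hedged sketch: rendering an external template file with PythonOperator
# via templates_dict + templates_exts (file name and task id are examples).
from datetime import datetime
from airflow import DAG
from airflow.operators.python_operator import PythonOperator

dag = DAG(
    'templated_python',
    start_date=datetime(2017, 3, 20),
    schedule_interval='0 12 * * *',
    catchup=False,
    template_searchpath=['/Users/my_name/Desktop/utils/airflow/resources'],
)

def print_rendered(**context):
    # After rendering, this is the *contents* of my_query.sql with
    # {{ ds }} etc. already substituted, not the file name.
    print(context['templates_dict']['query'])

t1 = PythonOperator(
    task_id='print_query',
    python_callable=print_rendered,
    provide_context=True,                    # needed on Airflow 1.x
    templates_dict={'query': 'my_query.sql'},  # value is templated
    templates_exts=['.sql'],                 # render files with this extension
    dag=dag,
)
```

Because `.sql` appears in `templates_exts`, Airflow treats the `my_query.sql` value in `templates_dict` as a file to load from `template_searchpath` and render, rather than as a literal string.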
