Airflow - Python file NOT in the same DAG folder

Backend · Unresolved · 4 answers · 725 views
抹茶落季 · 2021-02-05 06:14

I am trying to use Airflow to execute a simple Python task.

from __future__ import print_function
from airflow.operators.python_operator import PythonOperator
# ... (rest of the snippet truncated in the original post)
4 Answers
  •  挽巷 · 2021-02-05 06:55

    You can package dependencies of your DAG as per:

    https://airflow.apache.org/concepts.html#packaged-dags

    To allow this, you can create a zip file that contains the DAG(s) in the root of the zip file and has the extra modules unpacked in directories. For instance, you can create a zip file that looks like this:

    my_dag1.py
    my_dag2.py
    package1/__init__.py
    package1/functions.py
    

    Airflow will scan the zip file and try to load my_dag1.py and my_dag2.py. It will not go into subdirectories as these are considered to be potential packages.
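    As a hedged illustration of why this layout works (file names taken from the layout above, file contents invented): Python can import modules and packages directly from a zip file placed on `sys.path` via its built-in zipimport support, which is the mechanism a packaged DAG relies on:

```python
# Sketch: build a zip with the packaged-DAG layout from the answer,
# then import a "DAG file" from it the way Python's zipimport allows.
# The helper function and VALUE are invented for illustration.
import os
import sys
import tempfile
import zipfile

tmp = tempfile.mkdtemp()
zpath = os.path.join(tmp, "my_dags.zip")

with zipfile.ZipFile(zpath, "w") as z:
    # DAG files sit at the root of the zip...
    z.writestr("my_dag1.py", "from package1.functions import helper\nVALUE = helper()\n")
    # ...extra modules live in package directories.
    z.writestr("package1/__init__.py", "")
    z.writestr("package1/functions.py", "def helper():\n    return 'ok'\n")

# Putting the zip itself on sys.path makes its root importable.
sys.path.insert(0, zpath)
import my_dag1

print(my_dag1.VALUE)  # -> ok
```

    Note that only the root of the zip is importable as top-level modules, which matches the answer's point that Airflow loads `my_dag1.py` and `my_dag2.py` but treats subdirectories as packages rather than scanning them for DAGs.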

    When using CeleryExecutor, you need to sync the DAG directories manually; Airflow doesn't take care of that for you:

    https://airflow.apache.org/configuration.html?highlight=scaling%20out%20celery#scaling-out-with-celery

    The worker needs to have access to its DAGS_FOLDER, and you need to synchronize the filesystems by your own means
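    In practice the synchronization is usually done with git pull, a shared filesystem, or rsync on each worker host. As a minimal local stand-in for that step (directory names invented), mirroring a scheduler's DAGs folder into a worker's `DAGS_FOLDER` looks like:

```python
# Sketch: mirror a "scheduler" DAGs folder into a "worker" DAGs folder.
# shutil.copytree here stands in for e.g. `rsync -a --delete src/ dst/`
# run on each Celery worker; paths are temporary and illustrative.
import os
import shutil
import tempfile

src = tempfile.mkdtemp(prefix="scheduler_dags_")
worker_root = tempfile.mkdtemp(prefix="worker_")
dst = os.path.join(worker_root, "dags")  # the worker's DAGS_FOLDER

# A placeholder DAG file on the scheduler side.
with open(os.path.join(src, "my_dag1.py"), "w") as f:
    f.write("# dag file\n")

# Copy the whole tree so the worker sees the same DAGS_FOLDER contents.
shutil.copytree(src, dst)

print(sorted(os.listdir(dst)))  # -> ['my_dag1.py']
```

    However you implement it, the requirement is the one quoted above: every worker must see the same `DAGS_FOLDER` contents as the scheduler, and keeping them in sync is your responsibility.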
