I am trying to use Airflow to execute a simple Python task.
from __future__ import print_function
from airflow.operators.python_operator import PythonOperator
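
For the first question, a minimal DAG along those lines should work. This is only a sketch: the DAG id simple_python_dag, the callable print_hello, and the start date are placeholders, not anything Airflow requires.

from __future__ import print_function
from datetime import datetime

from airflow import DAG
from airflow.operators.python_operator import PythonOperator

def print_hello():
    # Placeholder callable; put your real task logic here.
    print('hello from Airflow')

dag = DAG('simple_python_dag',             # placeholder DAG id
          start_date=datetime(2017, 1, 1),
          schedule_interval=None)          # trigger manually while testing

task = PythonOperator(task_id='print_hello',
                      python_callable=print_hello,
                      dag=dag)

Drop the file into your DAGS_FOLDER and it should appear in the web UI once the scheduler rescans the folder.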
For your second question: how is Airflow+Celery going to distribute all the necessary Python source files across the worker nodes?
From the documentation: The worker needs to have access to its DAGS_FOLDER, and you need to synchronize the filesystems by your own means. A common setup would be to store your DAGS_FOLDER in a Git repository and sync it across machines using Chef, Puppet, Ansible, or whatever you use to configure machines in your environment. If all your boxes have a common mount point, having your pipelines files shared there should work as well.
http://pythonhosted.org/airflow/installation.html?highlight=chef
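
In practice, one simple way to do the synchronization is a small pull script run from cron on every worker. The sketch below assumes a hypothetical dags_sync.py, a shared Git repository already cloned into the DAGS_FOLDER, and the path /home/airflow/dags matching dags_folder in airflow.cfg:

#!/usr/bin/env python
# dags_sync.py -- run from cron on each worker (and the scheduler)
# so that every machine sees the same pipeline files.
from __future__ import print_function
import subprocess

DAGS_FOLDER = '/home/airflow/dags'  # assumption: matches dags_folder in airflow.cfg

# Fast-forward the local clone of the shared DAG repository.
subprocess.check_call(['git', '-C', DAGS_FOLDER, 'pull', '--ff-only'])

If all your boxes do share a common mount point (NFS, for example), you can skip the script entirely and point dags_folder at the shared path instead.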