Airflow tasks get stuck at “queued” status and never start running

Tasks getting stuck is, most likely, a bug. At the moment (<= 1.9.0alpha1) it can happen when a task cannot even start up on the (remote) worker. This happens, for example, in the case of an overloaded worker or missing dependencies.

This patch should resolve that issue.

It is worth investigating why your tasks do not reach the RUNNING state. Setting itself to this state is the first thing a task does. Normally the worker logs before it starts executing, and it also reports any errors, so you should be able to find entries for this in the task log.

Edit: As was mentioned in the comments on the original question, one example of Airflow being unable to run a task is when it cannot write to a required location. This makes it unable to proceed, and tasks get stuck. The patch fixes this by having the scheduler fail the task.

Rohan Sawant

I have been working with the same Docker image by puckel. My issue was resolved by:

Replacing

 result_backend = db+postgresql://airflow:airflow@postgres/airflow

with

celery_result_backend = db+postgresql://airflow:airflow@postgres/airflow

which I think was updated in the latest pull by puckel. The change was reverted around February 2018, and your comment was made in January.
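
For context, here is a minimal sketch of where that setting lives in airflow.cfg under the pre-1.10 naming, assuming the CeleryExecutor setup used by puckel's image; the broker_url value is an illustrative placeholder, not a required value:

    [celery]
    # example broker URL only; use whatever broker your deployment has
    broker_url = redis://redis:6379/1
    celery_result_backend = db+postgresql://airflow:airflow@postgres/airflow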

We have a workaround and want to share it here before 1.9 becomes official. Thanks to Bolke de Bruin for the updates on 1.9. In our situation before 1.9 (we are currently using 1.8.1), the solution is to run another DAG that clears any task stuck in the queued state for over 30 minutes, as sketched below.
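
Here is a minimal sketch of such a cleanup DAG, assuming Airflow 1.8.x with access to the metadata database through Airflow's own ORM models; the DAG id, the schedule, and the choice to fail (rather than reset) stuck tasks are illustrative assumptions, not the poster's actual code:

    # Sketch of the "cleanup DAG" workaround described above, for Airflow 1.8.x.
    # Names, the schedule, and the stuck-task handling are illustrative choices.
    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.models import TaskInstance
    from airflow.operators.python_operator import PythonOperator
    from airflow.utils.db import provide_session
    from airflow.utils.state import State


    @provide_session
    def clear_stuck_queued_tasks(session=None):
        # Airflow stamps queued_dttm when it queues a task; pre-1.10 releases
        # store naive local timestamps, hence datetime.now() here.
        cutoff = datetime.now() - timedelta(minutes=30)
        stuck = (
            session.query(TaskInstance)
            .filter(TaskInstance.state == State.QUEUED)
            .filter(TaskInstance.queued_dttm < cutoff)
            .all()
        )
        for ti in stuck:
            # Failing the task lets retries/alerting kick in; setting
            # State.NONE instead would hand it back to the scheduler.
            ti.state = State.FAILED
        session.commit()


    dag = DAG(
        dag_id="clear_stuck_queued_tasks",  # hypothetical DAG id
        start_date=datetime(2018, 1, 1),
        schedule_interval=timedelta(minutes=30),
    )

    PythonOperator(
        task_id="clear_stuck_queued",
        python_callable=clear_stuck_queued_tasks,
        dag=dag,
    )

Failing the stuck task from a separate DAG mirrors what the 1.9 patch does from inside the scheduler itself.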

Sheng Li

Please try running the airflow scheduler and airflow worker commands.

I think airflow worker runs each task, while airflow scheduler handles the transitions between tasks.
