Airflow Worker Daemon exits for no visible reason

Submitted by 故事扮演 on 2019-12-20 01:43:09

Question


I have Airflow 1.9 running inside a virtual environment, set up with Celery and Redis, and it works well. However, I wanted to daemonize the setup and used the instructions here. It works well for the Webserver, Scheduler, and Flower, but fails for the Worker, which is, of course, the core of it all. My airflow-worker.service file looks like this:

[Unit]
Description=Airflow celery worker daemon
After=network.target postgresql.service mysql.service redis.service rabbitmq-server.service
Wants=postgresql.service mysql.service redis.service rabbitmq-server.service

[Service]
EnvironmentFile=/etc/default/airflow
User=root
Group=root
Type=simple
ExecStart=/bin/bash -c 'source /home/codingincircles/airflow-master/bin/activate ; airflow worker'
Restart=on-failure
RestartSec=10s

[Install]
WantedBy=multi-user.target

Curiously, if I run the ExecStart command as-is on the CLI, it runs perfectly: tasks execute and everything is glorious. However, when I do sudo service airflow-worker start, it takes a while to return to the prompt and nothing shows up in the Flower UI. When I run journalctl -u airflow-worker.service -e, this is what I see:

systemd[1]: Started Airflow celery worker daemon.
bash[12392]: [2018-04-09 21:52:41,202] {driver.py:120} INFO - Generating grammar tables from /usr/lib/python3.5/lib2to3/Grammar.txt
bash[12392]: [2018-04-09 21:52:41,252] {driver.py:120} INFO - Generating grammar tables from /usr/lib/python3.5/lib2to3/PatternGrammar.txt
bash[12392]: [2018-04-09 21:52:41,578] {configuration.py:206} WARNING - section/key [celery/celery_ssl_active] not found in config
bash[12392]: [2018-04-09 21:52:41,578] {default_celery.py:41} WARNING - Celery Executor will run without SSL
bash[12392]: [2018-04-09 21:52:41,579] {__init__.py:45} INFO - Using executor CeleryExecutor
systemd[1]: airflow-worker.service: Main process exited, code=exited, status=1/FAILURE
systemd[1]: airflow-worker.service: Unit entered failed state.
systemd[1]: airflow-worker.service: Failed with result 'exit-code'.

What am I doing wrong? Every other method of using Airflow works, except when I try to daemonize it. Even using the -D flag after the airflow commands works (like airflow worker -D), but I'm not sure that is the right/safe/recommended way of running it in production and would rather make it a service. Please help.


Answer 1:


Your airflow-worker.service is trying to run the airflow worker as the root user. In order to run airflow worker as root, you must set C_FORCE_ROOT="true" in your airflow environment file (/etc/default/airflow). However, this is not recommended and I suspect it is not the best fix for you.
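If you really do need to run the worker as root (again, discouraged), the flag can live in the file that your unit already loads via EnvironmentFile=. A minimal sketch of /etc/default/airflow — the AIRFLOW_HOME path here is a placeholder, not taken from your setup:

# /etc/default/airflow
# Placeholder path -- adjust to your actual installation
AIRFLOW_HOME=/home/codingincircles/airflow
# Allow the Celery worker to run as root. Discouraged: prefer a dedicated user.
C_FORCE_ROOT="true"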

When running airflow worker manually as root, you should see a warning about this. Since you did not, I suspect your manual run succeeds because you are launching it as a properly configured airflow user, not as root. The recommended fix is therefore to change the following lines in your airflow-worker.service file:

User=airflow
Group=airflow
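Put together, the [Service] section would look like the sketch below. This assumes a dedicated airflow user and group already exist on the machine and have read access to the virtualenv and to whatever AIRFLOW_HOME points at:

[Service]
EnvironmentFile=/etc/default/airflow
User=airflow
Group=airflow
Type=simple
ExecStart=/bin/bash -c 'source /home/codingincircles/airflow-master/bin/activate ; airflow worker'
Restart=on-failure
RestartSec=10s

After editing the unit file, run sudo systemctl daemon-reload before restarting the service so systemd picks up the change.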


Source: https://stackoverflow.com/questions/49742296/airflow-worker-daemon-exits-for-no-visible-reason
