How to do logging with multiple Django WSGI processes + celery on the same webserver

Submitted by 三世轮回 on 2020-01-05 08:43:00

Question


I've got a mod_wsgi setup with 5 processes and a celery worker queue (2 workers), all on the same VM. I'm running into problems where the loggers are stepping on each other. While there appear to be some solutions if you are using Python multiprocessing, I don't see how those apply to mod_wsgi processes combined with celery processes.

What is everyone else doing to solve this problem? The celery tasks use code that logs to the same files as the webserver code.

Do I somehow have to add a pid to the logfile name? That seems like it could get messy fast: lots of uniquely named logfiles and no coherent way to pull them all back together.

Do I have to write a log daemon that allows all the processes to log to it? If so, where do I start it up so that it is ready for all of the processes that might want to log?

Surely there is some kind of sane pattern out there for this; I just don't know what it is yet.


Answer 1:


As mentioned in the docs, you could use a separate server process which listens on a socket and logs to different destinations, and has whatever logging configuration you want (in terms of files, console and so on). The other processes just configure a SocketHandler to send their events to the server process. This is generally better than separate log files with pids in their filenames.
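For the client side, here is a minimal sketch of a Django LOGGING dict in settings.py; the host, port, and levels are assumptions to adapt. Because the celery tasks import the same Django settings (or apply the same dict with logging.config.dictConfig), one configuration covers both the mod_wsgi processes and the workers:

import logging.handlers

LOGGING = {
    "version": 1,
    "disable_existing_loggers": False,
    "handlers": {
        # Every process ships pickled LogRecords to the log server
        # instead of opening any log file itself.
        "socket": {
            "class": "logging.handlers.SocketHandler",
            "host": "localhost",  # assumed: the log server runs on the same VM
            "port": logging.handlers.DEFAULT_TCP_LOGGING_PORT,  # 9020
        },
    },
    "root": {
        "handlers": ["socket"],
        "level": "INFO",
    },
}

Note that celery can reconfigure the root logger on worker startup; depending on your version you may need to disable that (e.g. via the worker_hijack_root_logger setting) so this configuration survives.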

The logging docs contain an example socket server implementation which you can adapt to your needs.
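As a sketch of what that server might look like, here is a single-file adaptation of the cookbook example; the log filename, format, and port are assumptions. It follows the documented wire format: each record arrives as a 4-byte big-endian length prefix followed by a pickled LogRecord. Start it (e.g. from an init script or supervisor) before Apache and the celery workers, so the socket is listening when they begin to log:

import logging
import logging.handlers
import pickle
import socketserver
import struct

class LogRecordStreamHandler(socketserver.StreamRequestHandler):
    """Unpickle LogRecords sent by SocketHandler and re-log them locally."""

    def handle(self):
        while True:
            # Each record is preceded by a 4-byte big-endian length.
            chunk = self.connection.recv(4)
            if len(chunk) < 4:
                break
            slen = struct.unpack(">L", chunk)[0]
            data = self.connection.recv(slen)
            while len(data) < slen:
                data += self.connection.recv(slen - len(data))
            record = logging.makeLogRecord(pickle.loads(data))
            # This server is the single process that touches the file,
            # so records from different clients cannot interleave mid-line.
            logging.getLogger(record.name).handle(record)

if __name__ == "__main__":
    logging.basicConfig(
        filename="combined.log",  # assumed destination; adjust to taste
        format="%(asctime)s %(name)s %(levelname)s %(message)s",
        level=logging.DEBUG,
    )
    server = socketserver.ThreadingTCPServer(
        ("localhost", logging.handlers.DEFAULT_TCP_LOGGING_PORT),
        LogRecordStreamHandler,
    )
    server.serve_forever()

Because this one process is the only writer, output from all five WSGI processes and both celery workers lands cleanly interleaved in a single file.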



Source: https://stackoverflow.com/questions/19235402/how-to-do-logging-with-multiple-django-wsgi-processes-celery-on-the-same-webse
