Logging doesn't work in production with delayed_job

Backend · unresolved · 3 answers · 790 views
Asked by 夕颜 on 2021-01-01 07:01

I'm running into a weird issue where my delayed_jobs are failing in production. I finally narrowed it down to the logger: if I comment out my log calls, everything works.

3 Answers
  • 2021-01-01 07:29

    I've had a similar issue. The logging in the collectiveidea fork is being rewritten - see this issue for more info. Perhaps try the newest version to see if that fixes it for you.

  • 2021-01-01 07:40

    It looks like you're trying to log to a closed file. Have you perhaps considered trying the SyslogLogger gem?

    Here's an article on how to use it with Rails, which should help get you started.
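
    For reference, the classic setup was only a couple of lines in config/environment.rb. Here is a rough sketch, assuming the syslog_logger gem and an application name of 'my_app' (the file placement and the name are my assumptions, not taken from the article):

    # config/environment.rb -- set this before Rails::Initializer.run
    require 'syslog_logger'

    # Route all Rails logging to the system syslog instead of a log file,
    # so a background worker never ends up writing to a closed file handle.
    RAILS_DEFAULT_LOGGER = SyslogLogger.new 'my_app'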

  • 2021-01-01 07:46

    Woody Peterson found the problem here: http://groups.google.com/group/delayed_job/browse_thread/thread/f7d0534bb6c7c83f/37b4e8ed7bfaba42

    The problem is:

    DJ uses Rails' buffered logger in production, and for some reason the buffer flush is never triggered (I don't know whether it's normally flushed by buffer size or explicitly after each request).

    The temporary fix (credit to Nathan Phelps) is:

    In production, the buffered log's auto_flushing value is set to 1000, which means flush is not called until 1000 messages have been logged. Assuming you're using collectiveidea's fork of delayed_job, you can address this by setting auto_flushing to a more reasonable value in command.rb, right after the logger is initialized on line 64, i.e.:

    Delayed::Worker.logger = Rails.logger
    Delayed::Worker.logger.auto_flushing = 1 # or whatever

    Works for me perfectly!
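
    Depending on your delayed_job version, the same two lines may also work from your own app instead of a patched command.rb. A sketch, assuming a Rails initializer at config/initializers/delayed_job.rb (the filename is my choice, not from the thread):

    # config/initializers/delayed_job.rb (filename is an assumption; any initializer works)
    # Reuse the Rails logger for the Delayed Job workers and flush after every
    # message, so worker output appears immediately instead of waiting for 1000 entries.
    Delayed::Worker.logger = Rails.logger
    if Delayed::Worker.logger.respond_to?(:auto_flushing)
      Delayed::Worker.logger.auto_flushing = 1
    end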
