JOB_TOO_BIG Pheanstalk - what can be done?

Submitted by 帅比萌擦擦* on 2019-12-22 05:13:35

Question


On Laravel 4.2 & Laravel Forge

I made a mistake and accidentally pushed some code onto the production server. There was a bug, and it pushed a job to the queue without deleting it once done. Now I can't push anything to the queue anymore; I get:

Pheanstalk_Exception JOB_TOO_BIG: job data exceeds server-enforced limit

What can I do?


Answer 1:


This happens because you're trying to store too much data in the job itself. Cut down the amount of data you push to the queue.

For example, if your queue job involves models, pass only the model's ID to the queue and fetch the model from the database as part of the job, rather than passing the entire model instance to the queue.

If you're using Eloquent models, they're automatically handled in this way.
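
For illustration, here's a minimal sketch of this approach on Laravel 4.2, assuming a hypothetical OrderProcessor job class and an Order Eloquent model (the names are illustrative, not from the original post):

    // Push only the model's ID instead of the whole serialized instance,
    // which keeps the Beanstalkd payload far below the job-size limit.
    Queue::push('OrderProcessor', array('order_id' => $order->id));

    class OrderProcessor {

        public function fire($job, $data)
        {
            // Re-fetch the model inside the job so only the ID travels
            // through the queue.
            $order = Order::find($data['order_id']);

            if ($order) {
                // ... do the actual work with $order ...
            }

            // Delete the job once it's done, otherwise Beanstalkd will
            // re-serve it after the TTR expires.
            $job->delete();
        }
    }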




Answer 2:


You can increase the max job size with the -z option for Beanstalkd: http://linux.die.net/man/1/beanstalkd

To do this on Forge you need to SSH into the server and edit the /etc/default/beanstalkd file.

Add the following line (or uncomment the existing BEANSTALKD_EXTRA line and edit it), specifying the size in bytes:

    BEANSTALKD_EXTRA="-z 524280"

Restart beanstalkd after making the change:

    sudo service beanstalkd restart
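
If you want to confirm the new limit is actually in effect, here's a minimal sketch that queries the server stats through Pheanstalk directly (assuming the pda/pheanstalk 2.x package commonly used with Laravel 4.2, and Beanstalkd listening on the default 127.0.0.1:11300; adjust to your setup):

    // Read the Beanstalkd server stats and print the enforced maximum job size.
    $pheanstalk = new Pheanstalk_Pheanstalk('127.0.0.1', 11300);
    $stats = $pheanstalk->stats();
    echo $stats['max-job-size']; // should report 524280 after the restart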

I am not sure whether this has any serious performance effects; so far, so good for me. I would appreciate any comments on performance.



Source: https://stackoverflow.com/questions/29199302/job-too-big-pheanstalk-what-can-be-done
