Allowed memory size of 134217728 bytes exhausted (tried to allocate 31989760 bytes) In a do…while loop


Question


I have a do…while loop that calls an API to fetch data from AWS Elasticsearch. When I increase the page size, I get a memory error. I have now reduced the size to 20, but I still need to loop through all the results as a cron job, and it still fails even after increasing the memory limit to 4096.

I have increased the memory limit in my .ini file, yet it still fails. Since it's a cron job, I have no idea how to use scroll.

do {
  $transporter  = new TransporterController;
  // Fetch the next page of matches for this paper's title from the API
  $response = $transporter->dispatchSearch($paper->title, '55%', $size, $from, $rounds);
  $json_response = json_decode((string) $response->getBody(), true);

  // Pagination state returned by the API: remaining rounds and next offset
  $rounds = $json_response['rounds'];
  $from = $json_response['from'] + $json_response['size'];

  $response = $transporter->purify($response, $object);
  $impact_user_id = $paper->impact_user_id;
  $impact_type_id = 2;
  $response = $transporter->saveResult($response, $impact_user_id, $impact_type_id);
  $transporter->notifier($response, $user);
} while ($rounds > 1);

The idea is for the loop to run to completion, until the last page. This is a Laravel cron job.
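
The posted snippet doesn't show how $size, $from and $rounds start out. A minimal sketch of the assumed initialization before the loop (the concrete values are hypothetical, apart from the page size of 20 mentioned above):

$size   = 20;   // page size mentioned in the question
$from   = 0;    // offset of the first result (assumption)
$rounds = 1;    // placeholder passed to the first dispatchSearch() call (assumption)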


Answer 1:


Here are three methods to increase the memory limit on shared hosting:

  1. Add the following line just before the line in your file where the error occurs:

     ini_set('memory_limit', '-1');
    
  2. If you have access to your php.ini file, change the memory_limit line there. If it currently reads 32M, try 64M:

     memory_limit = 64M ; Maximum amount of memory a script may consume (64MB)

  3. If you don't have access to php.ini, try adding this to an .htaccess file:

     php_value memory_limit 64M

Also make sure your file pointer is valid, and pass "r" (= reading) for mode.
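
Note that the PHP CLI often reads a different php.ini than the web server, so a cron job may ignore a change made for the site. A minimal sketch (the 256M value is only an example) that raises the limit for this script alone and logs what PHP actually applied when the job runs:

ini_set('memory_limit', '256M');  // example value; '-1' removes the cap entirely
error_log('memory_limit: ' . ini_get('memory_limit'));
error_log('peak usage: ' . round(memory_get_peak_usage(true) / 1048576, 2) . ' MB');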



Source: https://stackoverflow.com/questions/54071276/allowed-memory-size-of-134217728-bytes-exhausted-tried-to-allocate-31989760-byt
