Question
I have a do/while loop that calls an API to fetch data from AWS Elasticsearch. When I increase the page size, I get a memory error. I have now reduced the size to 20 and want to loop through all the results from a cron job, but it still fails. I have increased the memory to 4096 and also raised the memory limit in my .ini file, yet it still fails. Since this runs as a cron job, I have no idea how to use the scroll API.
do {
    $transporter = new TransporterController;
    $response = $transporter->dispatchSearch($paper->title, '55%', $size, $from, $rounds);
    $json_response = json_decode((string) $response->getBody(), true);
    $rounds = $json_response['rounds'];
    $from = $json_response['from'] + $json_response['size'];
    $response = $transporter->purify($response, $object);
    $impact_user_id = $paper->impact_user_id;
    $impact_type_id = 2;
    $response = $transporter->saveResult($response, $impact_user_id, $impact_type_id);
    $transporter->notifier($response, $user);
} while ($rounds > 1);
The idea is for the loop to run to completion, until the last page. This is a Laravel cron job.
Answer 1:
Here are three ways to increase the memory limit on shared hosting:
Add the following line just before the point in your script where the error occurs:
ini_set('memory_limit', '-1');
If you have access to your php.ini file, change the memory_limit line there. If it currently shows 32M, try 64M:
memory_limit = 64M ; Maximum amount of memory a script may consume (64MB)
If you don't have access to php.ini, try adding this to an .htaccess file:
php_value memory_limit 64M
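As a quick sanity check for the first method, you can read the effective limit at runtime with ini_get() before and after calling ini_set(). This is a minimal sketch; '-1' removes the limit entirely, but a bounded value such as '256M' is usually safer for a cron job:

```php
<?php
// Read the limit the script started with (from php.ini / .htaccess / CLI default).
$before = ini_get('memory_limit');

// Raise (or set) the limit for this script only; does not touch php.ini.
ini_set('memory_limit', '256M');

// Confirm the change actually took effect.
$after = ini_get('memory_limit');
echo "before: {$before}, after: {$after}\n";
```

If the "after" value does not change, the host has locked the setting and you will need the php.ini or .htaccess route instead.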
Also make sure your file pointer is valid, and pass "r" (read mode) when opening the file.
Source: https://stackoverflow.com/questions/54071276/allowed-memory-size-of-134217728-bytes-exhausted-tried-to-allocate-31989760-byt