Question
This error message is being presented; any suggestions?
Allowed memory size of 33554432 bytes exhausted (tried to allocate 43148176 bytes) in php
Answer 1:
At last I found the answer: just add the following line before the line in your file where the error occurs:
ini_set('memory_limit', '-1');
This removes the memory limit, so the script can use as much of the server's memory as it needs; it works fine for me. Thanks for the suggestions, friends.
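If you prefer not to remove the limit entirely, the same call also accepts a finite value; the 512M figure below is just an illustrative choice, not something from this answer:
ini_set('memory_limit', '512M'); // raise the limit to 512 MB for this request only
echo ini_get('memory_limit');    // confirm the value now in effect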
Answer 2:
Here are two simple methods to increase the limit on shared hosting:
If you have access to your php.ini file, change this line in php.ini. If your line shows 32M, try 64M:
memory_limit = 64M ; Maximum amount of memory a script may consume (64MB)
If you don't have access to php.ini, try adding this to an .htaccess file:
php_value memory_limit 64M
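To confirm that the new limit actually took effect (some hosts ignore .htaccess or php.ini overrides), a quick check from any PHP script is enough; this is a minimal sketch using only built-in functions:
echo ini_get('memory_limit'), PHP_EOL;               // the limit PHP is actually enforcing
echo memory_get_peak_usage(true), ' bytes', PHP_EOL; // peak memory used by the current request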
Answer 3:
Your script is using too much memory. This can often happen in PHP if you have a loop that has run out of control and you are creating objects or adding to arrays on each pass of the loop.
Check for infinite loops.
If that isn't the problem, try to help PHP by destroying objects that you are finished with by setting them to null, e.g. $OldVar = null;
Check the code where the error actually happens as well. Would you expect that line to be allocating a massive amount of memory? If not, try and figure out what has gone wrong...
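A minimal, self-contained sketch of the idea: the loop below creates roughly 1 MB per pass and releases it before the next pass, and memory_get_usage() confirms that nothing accumulates:
$before = memory_get_usage(true);
foreach (range(1, 5) as $i) {
    $chunk = str_repeat('x', 1024 * 1024); // ~1 MB of data created on this pass
    // ... work with $chunk here ...
    $chunk = null;                         // release it before the next iteration
    unset($chunk);
}
echo (memory_get_usage(true) - $before) . " bytes still held after the loop\n";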
Answer 4:
Doing:
ini_set('memory_limit', '-1');
is never good. If you want to read a very large file, it is best practice to read it bit by bit. Try the following code:
$path = 'path_to_file_.txt';
$file = fopen($path, 'r');
$len  = 1024; // read 1 KB at a time; pick any chunk size, but do not make it too big

echo 'Output is: ';
while (!feof($file)) {
    // Process each chunk as soon as it is read instead of concatenating
    // everything into one variable, which would hold the whole file in memory.
    echo fread($file, $len);
}
fclose($file);
Answer 5:
It is unfortunately easy to program in PHP in a way that consumes memory faster than you realise. Copying strings, arrays and objects instead of using references will do it, though PHP 5 is supposed to handle this more automatically than PHP 4. But dealing with your data set in its entirety over several steps is also wasteful compared to processing the smallest logical unit at a time. The classic example is working with large result sets from a database: most programmers fetch the entire result set into an array and then loop over it one or more times with foreach(). It is much more memory-efficient to use a while() loop to fetch and process one row at a time. The same thing applies to processing a file.
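A minimal sketch of that pattern with PDO; the DSN, credentials, and table name below are placeholders, not anything from the original answer:
$pdo = new PDO('mysql:host=localhost;dbname=example_db', 'db_user', 'db_pass'); // placeholder connection details
$pdo->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false); // stream rows from MySQL instead of buffering the whole result set

// Memory-hungry: fetchAll() materialises every row in one PHP array.
// $rows = $pdo->query('SELECT * FROM big_table')->fetchAll(PDO::FETCH_ASSOC);

// Leaner: fetch and process one row at a time.
$stmt = $pdo->query('SELECT * FROM big_table');
while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    // process $row here; it is overwritten on the next iteration
}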
Answer 6:
If you want to read large files, you should read them bit by bit instead of reading them at once.
It's simple math: if you read a 1 MB file at once, then at least 1 MB of memory is needed at the same time just to hold the data.
So you should read them bit by bit using fopen and fread.
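For text files there is also the line-by-line variant with fgets, shown here as an illustration rather than as part of the original answer (the file name is a placeholder); it keeps only the current line in memory:
$handle = fopen('large_log.txt', 'r'); // placeholder file name
while (($line = fgets($handle)) !== false) {
    // only the current line is held in memory at any moment
    // ... process $line here ...
}
fclose($handle);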
Answer 7:
I faced the same problem in PHP 7.2 with Laravel 5.6. I just increased the value of memory_limit = 128M in php.ini to whatever my application demands. It might be 256M/512M/1048M... Now it works fine.
Answer 8:
I was also having the same problem and looked for phpinfo.ini, php.ini or .htaccess files to no avail. Finally I looked at some PHP files, opened them and checked the code inside for memory settings. This is the solution I came up with, and it worked for me. I was using WordPress, so this solution might only apply to a WordPress memory size limit problem.
My solution: open the default-constants.php file in the /public_html/wp-includes folder with a code editor and find the memory settings under the wp_initial_constants scope, or just Ctrl+F for the word "memory". There you will come across WP_MEMORY_LIMIT and WP_MAX_MEMORY_LIMIT. Just increase them; it was 64 MB in my case, and I increased it to 128 MB and then to 200 MB.
// Define memory limits.
if ( ! defined( 'WP_MEMORY_LIMIT' ) ) {
    if ( false === wp_is_ini_value_changeable( 'memory_limit' ) ) {
        define( 'WP_MEMORY_LIMIT', $current_limit );
    } elseif ( is_multisite() ) {
        define( 'WP_MEMORY_LIMIT', '200M' );
    } else {
        define( 'WP_MEMORY_LIMIT', '128M' );
    }
}

if ( ! defined( 'WP_MAX_MEMORY_LIMIT' ) ) {
    if ( false === wp_is_ini_value_changeable( 'memory_limit' ) ) {
        define( 'WP_MAX_MEMORY_LIMIT', $current_limit );
    } elseif ( -1 === $current_limit_int || $current_limit_int > 268435456 /* = 256M */ ) {
        define( 'WP_MAX_MEMORY_LIMIT', $current_limit );
    } else {
        define( 'WP_MAX_MEMORY_LIMIT', '256M' );
    }
}
Btw, please don't do the following, because it's bad practice:
ini_set('memory_limit', '-1');
Answer 9:
You can increase the memory allowed to a PHP script by putting the following line above all other code in the script:
ini_set('memory_limit', '-1'); // enables use of all available memory
Also deallocate unwanted variables in the script.
Check this PHP library: Freeing memory with PHP
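A minimal illustration of deallocating a variable once it is no longer needed; the array here just stands in for whatever large data your script builds, and gc_collect_cycles() only matters when objects reference each other in cycles:
$rows = range(1, 1000000); // stand-in for some large structure built earlier
// ... use $rows ...
unset($rows);              // hand the memory back as soon as the data is no longer needed
gc_collect_cycles();       // optionally reclaim memory held by cyclic object references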
Answer 10:
I notice many answers just try to increase the amount of memory given to a script, which has its place, but more often than not it means that something is being too liberal with memory due to an unforeseen volume or size of data. Obviously, if you're not the author of a script you're at the mercy of the author, unless you're feeling ambitious :) The PHP docs even say memory issues are due to "poorly written scripts".
It should be mentioned that ini_set('memory_limit', '-1'); (no limit) can cause server instability, as 0 bytes free = bad things. Instead, find a reasonable balance between what your script is trying to do and the amount of available memory on the machine.
A better approach: If you are the author of the script (or ambitious) you can debug such memory issues with xdebug. The latest version (2.6.0 - released 2018-01-29) brought back memory profiling that shows you what function calls are consuming large amounts of memory. It exposes issues in the script that are otherwise hard to find. Usually, the inefficiencies are in a loop that isn't expecting the volume it's receiving, but each case will be left as an exercise to the reader :)
The xdebug documentation is helpful, but it boils down to 3 steps:
- Install it - available through apt-get, yum, etc.
- Configure it - in xdebug.ini: xdebug.profiler_enable = 1, xdebug.profiler_output_dir = /where/ever/
- View the profiles in a tool like QCacheGrind or KCacheGrind
Answer 11:
If you are trying to read a file, that will take up memory in PHP. For instance, if you are trying to open up and read an MP3 file (like, say, $data = file("http://mydomain.com/path/sample.mp3")), it is going to pull the whole thing into memory.
As Nelson suggests, you can work to increase your maximum memory limit if you actually need to be using this much memory.
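If the goal is only to send the file to the client rather than to inspect its contents, a hedged alternative is to stream it instead of loading it with file(); readfile() copies the file to the output buffer without building one huge string in PHP (the path and header below are illustrative):
header('Content-Type: audio/mpeg');  // assuming an MP3, as in the example above
readfile('/path/to/sample.mp3');     // placeholder path; the file is streamed rather than held in memory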
Answer 12:
We had a similar situation and we tried ini_set('memory_limit', '-1'); as given at the top of the answers, and everything worked fine: image files greater than 1 MB were compressed down to KBs.
Answer 13:
ini_set('memory_limit', '-1');
Answer 14:
I was receiving the same error message after switching to a new theme in WordPress. PHP was running version 5.3, so I switched to 7.2. That fixed the issue.
Answer 15:
If you are using shared hosting, you will not be able to increase the PHP memory limit yourself.
Just go to your cPanel and upgrade your PHP version to 7.1 or above, and then you are good to go.
Answer 16:
Write
ini_set('memory_limit', '-1');
at the top of your index.php, right after the opening PHP tag.
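A minimal sketch of the placement being described (the rest of index.php is only hinted at by the comment):
<?php
ini_set('memory_limit', '-1'); // first statement after the opening tag

// ... the rest of index.php continues here ...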
Answer 17:
WordPress users: add the line
@ini_set('memory_limit', '-1');
in wp-settings.php, which you can find in the WordPress installation's root folder.
Answer 18:
I hadn't renewed my hosting and the database was read-only. Joomla needed to write the session and couldn't do it.
Answer 19:
I had the same issue when running PHP on the command line. Recently, I had changed the php.ini file and made a mistake while editing it.
This is for PHP 7.0.
Path to the php.ini where I made the mistake:
/etc/php/7.0/cli/php.ini
I had set memory_limit = 256 (which means 256 bytes) instead of memory_limit = 256M (which means 256 megabytes):
; Maximum amount of memory a script may consume (128MB)
; http://php.net/memory-limit
memory_limit = 128M
Once I corrected it, my process started running fine.
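If you are not sure which php.ini a given PHP binary actually reads (the CLI and the web server often use different files), this quick check prints the loaded file and the limit parsed from it:
echo php_ini_loaded_file(), PHP_EOL;   // the php.ini this PHP binary actually loaded
echo ini_get('memory_limit'), PHP_EOL; // the value parsed from it, e.g. 256M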
Answer 20:
Run this command in your Magento root directory:
php -d memory_limit=4G bin/magento
Answer 21:
If you are using Laravel, then use this:
public function getClientsListApi(Request $request){
    print_r($request->all()); // for all request data
    print_r($request->name);  // for the name field only
}
instead of
public function getClientsListApi(Request $request){
    print_r($request); // this shows the error mentioned above
}
Source: https://stackoverflow.com/questions/415801/allowed-memory-size-of-33554432-bytes-exhausted-tried-to-allocate-43148176-byte