[php] Allowed memory size of 33554432 bytes exhausted (tried to allocate 43148176 bytes) in php

This error message is being presented. Any suggestions?

Allowed memory size of 33554432 bytes exhausted (tried to allocate 43148176 bytes) in php

Tags: php, memory-management, memory-limit

The answers:


If you are trying to read a file, that will take up memory in PHP. For instance, if you open and read an MP3 file (say, $data = file("http://mydomain.com/path/sample.mp3");), PHP will pull the entire file into memory. (For reference: 33554432 bytes is a 32 MB memory_limit, and the failed allocation alone was about 41 MB.)

As Nelson suggests, you can work to increase your maximum memory limit if you actually need to be using this much memory.
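
A minimal sketch of the streaming alternative (the URL and target path are placeholders; reading http:// streams requires allow_url_fopen):

$source = fopen('http://mydomain.com/path/sample.mp3', 'rb'); // placeholder URL
$target = fopen('/tmp/sample.mp3', 'wb');                     // placeholder path

while (!feof($source)) {
    // only 8 KB is held in memory at any one time
    fwrite($target, fread($source, 8192));
}

fclose($source);
fclose($target);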


I was also having the same problem. I looked for phpinfo.ini, php.ini, and .htaccess files to no avail. Finally I looked through some PHP files and checked the code inside for memory settings, and this solution is what I came up with; it worked for me. I was using WordPress, so it may only apply to the WordPress memory-limit problem. Open the default-constants.php file in the /public_html/wp-includes folder with a code editor and find the memory settings under the wp_initial_constants scope, or just Ctrl+F for the word "memory". There you will come across WP_MEMORY_LIMIT and WP_MAX_MEMORY_LIMIT. Just increase them; in my case the value was 64 MB, and I increased it to 128 MB and then to 200 MB.

// Define memory limits.
if ( ! defined( 'WP_MEMORY_LIMIT' ) ) {
    if ( false === wp_is_ini_value_changeable( 'memory_limit' ) ) {
        define( 'WP_MEMORY_LIMIT', $current_limit );
    } elseif ( is_multisite() ) {
        define( 'WP_MEMORY_LIMIT', '200M' );
    } else {
        define( 'WP_MEMORY_LIMIT', '128M' );
    }
}

if ( ! defined( 'WP_MAX_MEMORY_LIMIT' ) ) {
    if ( false === wp_is_ini_value_changeable( 'memory_limit' ) ) {
        define( 'WP_MAX_MEMORY_LIMIT', $current_limit );
    } elseif ( -1 === $current_limit_int || $current_limit_int > 268435456 /* = 256M */ ) {
        define( 'WP_MAX_MEMORY_LIMIT', $current_limit );
    } else {
        define( 'WP_MAX_MEMORY_LIMIT', '256M' );
    }
}

By the way, please don't use the following, because it's bad practice:

ini_set('memory_limit', '-1');
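
Note that edits to default-constants.php will be overwritten by every WordPress core update. A safer variant of the same idea (a sketch, assuming a standard WordPress install) is to define the constants in wp-config.php, since WordPress only sets them when they are not already defined:

// In wp-config.php, above the "That's all, stop editing!" line.
define( 'WP_MEMORY_LIMIT', '128M' );     // limit for front-end requests
define( 'WP_MAX_MEMORY_LIMIT', '256M' ); // limit for admin/cron requests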

I had the same issue while running PHP on the command line. I had recently changed the php.ini file and made a mistake while editing it.

This is for PHP 7.0.

Path to the php.ini where I made the mistake: /etc/php/7.0/cli/php.ini

I had set memory_limit = 256 (which means 256 bytes)
instead of memory_limit = 256M (which means 256 megabytes).

; Maximum amount of memory a script may consume (128MB)
; http://php.net/memory-limit
memory_limit = 128M

Once I corrected it, my process started running fine.
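
You can verify the effective limit from the command line before and after editing (a quick sanity check):

php -r 'echo ini_get("memory_limit"), PHP_EOL;'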


Here are two simple methods to increase the limit on shared hosting:

  1. If you have access to your php.ini file, change the memory_limit line there. If it currently shows 32M, try 64M: memory_limit = 64M ; Maximum amount of memory a script may consume (64MB)

  2. If you don't have access to php.ini, try adding this to an .htaccess file: php_value memory_limit 64M (see the note after this list)
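
On hosts that run PHP as CGI/FastCGI, php_value lines in .htaccess have no effect. There, a .user.ini file in the document root often works instead; a minimal sketch, assuming your host allows it:

; .user.ini in the document root (read by CGI/FastCGI PHP)
memory_limit = 64M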


If you are using Laravel, then use this approach:

public function getClientsListApi(Request $request)
{
    print_r($request->all()); // all request input
    print_r($request->name);  // a single input field
}

instead of

public function getClientsListApi(Request $request)
{
    print_r($request); // dumps the entire Request object and can trigger the error above
}


You can increase the memory available to a PHP script by executing the following line above all other code in the script:

ini_set('memory_limit', '-1'); // no limit; use all available memory

Also deallocate variables you no longer need in the script.

Check this PHP library: Freeing memory with PHP
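
A minimal sketch of freeing memory mid-script ($bigArray and the numbers are placeholders):

$bigArray = range(1, 1000000); // something large

// ... work with $bigArray ...

unset($bigArray);    // drop the reference so the memory can be reclaimed
gc_collect_cycles(); // force collection of any cyclic garbage right now

echo memory_get_usage(true), PHP_EOL; // observe the effect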


Your script is using too much memory. This can often happen in PHP if you have a loop that has run out of control and you are creating objects or adding to arrays on each pass of the loop.

Check for infinite loops.

If that isn't the problem, try to help PHP out by destroying objects that you are finished with by setting them to null, e.g. $OldVar = null;

Check the code where the error actually happens as well. Would you expect that line to be allocating a massive amount of memory? If not, try and figure out what has gone wrong...
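
One way to find the runaway allocation (a sketch; $items and process() are placeholders for your real data and work) is to log memory usage as the loop runs:

foreach ($items as $i => $item) {
    process($item); // placeholder for the real work

    if ($i % 1000 === 0) {
        // steadily growing numbers point at the leak
        error_log($i . ': ' . memory_get_usage(true) . ' bytes');
    }
}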


If you want to read large files, you should read them bit by bit instead of reading them all at once.
It's simple math: if you read a 1 MB file at once, at least 1 MB of memory is needed at that moment to hold the data.

So read them bit by bit using fopen and fread.
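
For line-oriented files, a sketch of the same idea using fgets ('data.txt' is a placeholder path):

$fh = fopen('data.txt', 'r'); // placeholder path
while (($line = fgets($fh)) !== false) {
    // handle one line at a time; memory use stays flat
    // no matter how large the file is
}
fclose($fh);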


Write

ini_set('memory_limit', '-1');

at the top of your index.php, right after the opening PHP tag.


I hadn't renewed my hosting and the database was read-only. Joomla needed to write the session and couldn't do it.


It is unfortunately easy to program in PHP in a way that consumes memory faster than you realise. Copying strings, arrays and objects instead of using references will do it, though PHP 5 is supposed to do this more automatically than in PHP 4. But dealing with your data set in entirety over several steps is also wasteful compared to processing the smallest logical unit at a time. The classic example is working with large resultsets from a database: most programmers fetch the entire resultset into an array and then loop over it one or more times with foreach(). It is much more memory efficient to use a while() loop to fetch and process one row at a time. The same thing applies to processing a file.
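
As an illustration (a sketch with placeholder DSN, credentials, table name, and a hypothetical process() helper), compare the two patterns with PDO:

$pdo = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass'); // placeholders

// Wasteful: fetchAll() copies the entire result set into a PHP array first.
$rows = $pdo->query('SELECT * FROM big_table')->fetchAll(PDO::FETCH_ASSOC);
foreach ($rows as $row) {
    process($row);
}

// Memory-efficient: only one row is held in PHP at a time.
$stmt = $pdo->query('SELECT * FROM big_table');
while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    process($row);
}

(With the MySQL driver you may also need to disable client-side buffering, e.g. setting PDO::MYSQL_ATTR_USE_BUFFERED_QUERY to false, for the row-by-row version to actually save memory.)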


ini_set('memory_limit', '-1');

Run this command in your Magento root directory:

php -d memory_limit=4G bin/magento


I was receiving the same error message after switching to a new theme in WordPress. PHP was running version 5.3, so I switched to 7.2, and that fixed the issue.


Doing :

ini_set('memory_limit', '-1');

is never a good idea. If you want to read a very large file, it is better practice to read it bit by bit. Try the following code:

$path = 'path_to_file_.txt';

$file = fopen($path, 'r');
$len = 1024; // 1 KB per read; choose any chunk size, but don't make it too big

$output = '';
while (!feof($file)) {
    $output .= fread($file, $len);
}

fclose($file);

// Note: appending every chunk to $output still ends up holding the whole
// file in memory. For a real saving, process or echo each chunk inside
// the loop instead of accumulating it.
echo 'Output is: ' . $output;

We had a similar situation and tried the ini_set('memory_limit', '-1'); suggestion given at the top of the answers. Everything worked fine, and image files greater than 1 MB were compressed down to KBs.


WordPress users, add the line:

@ini_set('memory_limit', '-1');

in wp-settings.php, which you can find in the WordPress installation's root folder.


I faced the same problem in PHP 7.2 with Laravel 5.6. I just increased memory_limit = 128M in php.ini, as my application demanded. It could be 256M, 512M, 1024M, and so on. Now it works fine.


I notice many answers just try to increase the amount of memory given to a script, which has its place, but more often than not it means that something is being too liberal with memory due to an unforeseen amount of volume or size. Obviously, if you're not the author of a script, you're at the mercy of the author, unless you're feeling ambitious :) The PHP docs even say memory issues are due to "poorly written scripts".

It should be mentioned that ini_set('memory_limit', '-1'); (no limit) can cause server instability, since 0 bytes free = bad things. Instead, find a reasonable balance based on what your script is trying to do and the amount of available memory on the machine.

A better approach: if you are the author of the script (or ambitious), you can debug such memory issues with Xdebug. The latest version (2.6.0, released 2018-01-29) brought back memory profiling, which shows you what function calls are consuming large amounts of memory. It exposes issues in the script that are otherwise hard to find. Usually the inefficiencies are in a loop that isn't expecting the volume it's receiving, but each case will be left as an exercise for the reader :)

The Xdebug documentation is helpful, but it boils down to 3 steps:

  1. Install it - available through apt-get, yum, etc.
  2. Configure it - in xdebug.ini: xdebug.profiler_enable = 1, xdebug.profiler_output_dir = /where/ever/ (see the sketch after this list)
  3. View the profiles in a tool like QCacheGrind or KCacheGrind
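
A minimal xdebug.ini sketch matching the Xdebug 2.x settings named above (the output directory is a placeholder; note that Xdebug 3 renamed these to xdebug.mode = profile and xdebug.output_dir):

; xdebug.ini (Xdebug 2.x)
xdebug.profiler_enable = 1                 ; profile every request
xdebug.profiler_output_dir = /where/ever/  ; cachegrind files land here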

If you are using shared hosting, you may not be able to raise the PHP memory limit yourself.

Just go to your cPanel and upgrade your PHP version to 7.1 or above, and you should be good to go.


I want to share my experience with this issue!

Suppose you have a class A and a class B:

class A {
    protected $userB;

    public function __construct() {
        $this->userB = new B(); // constructing an A constructs a B ...
    }
}

class B {
    protected $userA;

    public function __construct() {
        $this->userA = new A(); // ... and constructing a B constructs another A
    }
}

This sets off an endless chain of object construction (new A → new B → new A → ...), which exhausts memory and can produce exactly this kind of error!
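
One common way to break the cycle (a sketch, not from the original answer) is to pass the existing instance in instead of constructing a new one:

class A {
    protected $userB;

    public function __construct() {
        $this->userB = new B($this); // hand B the A that already exists
    }
}

class B {
    protected $userA;

    public function __construct(A $userA) {
        $this->userA = $userA; // reuse the instance, don't create a new one
    }
}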

