7
votes

My project uses an open-source PHP MySQLi library: https://github.com/ajillion/PHP-MySQLi-Database-Class

But partway through, the project reports this error: "Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 4294967296 bytes) in /home1/flipalbu/public_html/kvisofttest/login-admin/Lib/class.MysqliDb.php on line 422"

My server is: Linux x86_64

PHP Version 5.4.17

MySQL Version: 5.5.32

memory_limit = 128M

Line 422: call_user_func_array(array($stmt, 'bind_result'), $parameters);

The query part of the code:

    $db = new MysqliDb('LocalHost', 'root', 'PASSWD', 'DB');
    $wqdb = $db->query("SELECT * FROM db_table");
    $con = '';
    foreach ($wqdb as $row) {
        $con .= $row['ID'];
    }
    echo $con;

Is there any way to solve it?


/** Error Code **/

    protected function _dynamicBindResults(mysqli_stmt $stmt)
    {
        $parameters = array();
        $results = array();

        $meta = $stmt->result_metadata();

        // Build one null placeholder per result column and collect a
        // reference to each, ready for bind_result().
        $row = array();
        while ($field = $meta->fetch_field()) {
            $row[$field->name] = null;
            $parameters[] = & $row[$field->name];
        }

        // Line 422: the allocation that triggers the fatal error happens here.
        call_user_func_array(array($stmt, 'bind_result'), $parameters);

        // Copy each fetched row out of the bound references.
        while ($stmt->fetch()) {
            $x = array();
            foreach ($row as $key => $val) {
                $x[$key] = $val;
            }
            array_push($results, $x);
        }
        return $results;
    }
You’re exhausting the memory PHP has available, presumably because your database table is massive and you’re selecting all records without a condition or limit clause. – Martin Bean
How many rows are you fetching? – halfer
There is – don't select everything from your database. – N.B.
Increase your memory limit to the maximum with ini_set('memory_limit', '-1'); – Deepu
@User016, no; that might be a solution on the command line (and even then, only rarely). However, this looks like a web application, and you don't want instances taking up large chunks of memory. The query needs to be optimised. – halfer

4 Answers

24
votes

I read this bug report here: https://bugs.php.net/bug.php?id=51386

Your problem seems to happen because one of the table's columns is a LONGBLOB or LONGTEXT.

LONGTEXT / LONGBLOB columns have a maximum length of 4294967295 bytes [4GB]; that's why mysqli tries to allocate that much memory for the buffer, to be sure nothing is lost. I would suggest you use MEDIUMTEXT instead (16777215 bytes [16MB] max length), which is usually enough for everything.
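
For example, assuming the oversized column is a LONGTEXT named description (a placeholder name, not from the question), the conversion is a single ALTER statement; a rough sketch with plain mysqli:

    <?php
    // Sketch: convert the LONGTEXT column to MEDIUMTEXT so mysqli stops
    // reserving a 4GB bind buffer for it. The column name "description"
    // is an assumption; check that existing values fit in 16MB first.
    $mysqli = new mysqli('localhost', 'root', 'PASSWD', 'DB');
    $mysqli->query("ALTER TABLE db_table MODIFY description MEDIUMTEXT");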

Update: Because this answer has seen some activity, I am adding this solution from Phil_1984 (see comments):

I use mysqli and, after reading that quote from the PHP dev, adding a $stmt->store_result(); between execute and bind_result seems to fix the issue for me.

=> If you use $stmt->store_result(), you can use mysqli with LONGBLOB / LONGTEXT columns without getting the error.
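
A minimal sketch of that ordering with plain mysqli, assuming the question's credentials and a hypothetical LONGTEXT column named body. (In the library itself, the equivalent patch is a single $stmt->store_result(); call just before the bind_result() line.)

    <?php
    // Sketch of the store_result() fix. Credentials come from the question;
    // the column name "body" is a placeholder. Buffering the result set
    // client-side lets mysqli size bind buffers from each column's actual
    // max_length instead of the 4GB LONGTEXT maximum.
    $mysqli = new mysqli('localhost', 'root', 'PASSWD', 'DB');
    $stmt = $mysqli->prepare("SELECT ID, body FROM db_table");
    $stmt->execute();
    $stmt->store_result();            // the fix: buffer rows client-side
    $stmt->bind_result($id, $body);   // buffers now sized to the real data
    while ($stmt->fetch()) {
        echo $id, "\n";
    }
    $stmt->free_result();
    $stmt->close();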

-

Old Answer: I suggest that you either change the column to another type (MEDIUMTEXT) or use PDO (I think it doesn't have that problem). But if you want to keep the column as LONGTEXT, you have to switch your MySQL library.

Quote from PHP Dev:

This is a known limitation of ext/mysqli when using libmysql (always in 5.2 and previous) and when libmysql is enabled with 5.3. The reason is that the server sends not too specific metadata about the column. This longtext has a max length of 4G and ext/mysqli tries to bind with the max length, to be sure no data loss occurs (data doesn't fit in the bind buffer on C level). However, that means 4G for a longtext/longblob column. ext/mysqli has been changed to have a way to work around that. You need to call mysqli_stmt_store_result() which will store the data locally, which means, of course, a higher memory usage for PHP. However, because you use libmysql this won't hit the PHP's memory limit, for sure. During store_result the max_length of every column will be calculated and then when bind_result is executed only a buffer with size of max_length will be allocated, which will be definitely lower than 4G. In short: prepare, execute, store_result, bind_result, fetch...fetch...fetch

4
votes

If you're trying to read an entire table in one go, and that table has a lot of rows and columns, then running out of memory is inevitable. You can put the problem off by raising the memory limit in php.ini, but it will only come back once you add a few thousand more rows.

You need to rewrite your script to be more sensible about what it fetches. If you only need specific records, then pulling down the entire table and searching the result set for the row(s) you want is horribly inefficient. Use a WHERE clause to specify what you really want to get. The rule of thumb with PHP/SQL applications is "use SQL wherever possible to specify what you want, then do what you need to do with it in PHP".
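
For example, filtering in SQL rather than in PHP (the status column and its value here are purely hypothetical):

    <?php
    // Sketch: push the filtering into MySQL and select only the columns
    // you actually need. The WHERE condition is illustrative only.
    $db = new MysqliDb('LocalHost', 'root', 'PASSWD', 'DB');
    $rows = $db->query("SELECT ID FROM db_table WHERE status = 'active'");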

Of course, it may be that there's an entirely legitimate reason why you need to process an entire table in PHP. In that case, you should fetch the data in chunks (say, 100 rows at a time) with LIMIT and OFFSET, process those rows, get the next chunk, and so on until you've gone through the entire table. That will use far less memory than trying to load the entire table at once.
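
A sketch of that chunked approach with plain mysqli, assuming the question's credentials and table (the chunk size of 100 is arbitrary):

    <?php
    // Sketch: walk the table 100 rows at a time so that only one chunk
    // is ever held in memory. Table and credentials are the question's.
    $mysqli = new mysqli('localhost', 'root', 'PASSWD', 'DB');
    $limit  = 100;
    $offset = 0;
    do {
        $result = $mysqli->query("SELECT ID FROM db_table LIMIT $limit OFFSET $offset");
        $rowCount = $result->num_rows;
        while ($row = $result->fetch_assoc()) {
            // ... process $row ...
        }
        $result->free();
        $offset += $limit;
    } while ($rowCount === $limit);   // a short (or empty) chunk means we're done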

1
votes

Doesn't seem like a huge table! Seems like an endless loop! It tries to allocate about 4GB; I don't think you have such a big table...

Check that you don't create a loop here:

call_user_func_array(array($stmt, 'bind_result'), $parameters);

Maybe you should post the code that is around this line.

0
votes

You are exceeding the maximum available memory. You have two options:

  • Increase the maximum memory allowed for each PHP script, either in configuration (the memory_limit directive in php.ini) or at run time with ini_set('memory_limit', '200M')

  • Improve the code so it handles only the required information.