Return multiple arrays using a buffer without overloading memory

3

I’m developing an application in which a report needs to print around 50,000 rows. When the SELECT runs, this error occurs:

Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 24 bytes) in C:\wamp\www\simplaFramework\System\Framework\Model.class.class.php on line 884.

The query result below is being saved into an array because I will process it later:

        $data = array();
        $count = 0;
        while ($row = $q->fetch($pdo_fetch)) {                     
            $data[] = $row;                    
            ++$count;
        } 

My idea was to split the result into two or more arrays whenever memory usage reaches a certain threshold. For example: when usage approaches 128 MB (obtained with memory_get_usage()), split the result into two or more arrays and return them.

But how can I do that? Should I use a buffer?

  • Have you already tried using fetchAll() instead of fetch()? There would be no need for a condition to increment $count; just call count() on the array returned by fetchAll().

  • The problem is not how I retrieve the data from the database, but when I store it in the variable: $data[] = $row;. That is what exceeds PHP’s memory limit (128M), and I need to pass the data to another function before printing.

4 answers

3


This problem happens because PHP normally uses buffered queries. That means PHP receives all the result rows before the query call returns.

To solve the problem you must use unbuffered queries. That way PHP returns one result row at a time for the application to process, without first storing everything in memory.

$pdo->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);

Care needs to be taken, because using unbuffered queries implies keeping a transaction open until all results have been returned. This means you can block or delay access to the tables for other parallel queries.

Another consequence is that, with some types of databases, you may not be able to execute another query until all results have been returned.
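
A minimal sketch of the row-by-row approach, assuming a MySQL PDO connection in $pdo and a hypothetical processRow() function and report_rows table (none of these names come from the question):

    // Switch this connection to unbuffered mode before running the query.
    $pdo->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);

    $q = $pdo->query('SELECT * FROM report_rows'); // hypothetical report query

    // Each fetch() pulls one row from the server; nothing accumulates in PHP.
    while ($row = $q->fetch(PDO::FETCH_ASSOC)) {
        processRow($row);
    }

    // Release the result set so the connection can run other queries.
    $q->closeCursor();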

An alternative that avoids these problems is to use an unbuffered query and save the result rows to a local file, for example in CSV format, until the end of the result set, and then re-read that file to process the results one by one. This avoids leaving the transaction open and frees the connection to perform other queries while the results are being processed.
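
A sketch of that CSV alternative, under the same assumptions ($pdo connection, hypothetical report_rows table and processRow() function):

    $pdo->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);
    $q = $pdo->query('SELECT * FROM report_rows'); // hypothetical report query

    // 1) Dump the rows to a temporary CSV file, one at a time.
    $tmp = fopen('report_tmp.csv', 'w');
    while ($row = $q->fetch(PDO::FETCH_ASSOC)) {
        fputcsv($tmp, $row);
    }
    fclose($tmp);
    $q->closeCursor(); // result set finished: the connection is free again

    // 2) Re-read the file and process the rows one by one.
    $csv = fopen('report_tmp.csv', 'r');
    while (($row = fgetcsv($csv)) !== false) {
        processRow($row);
    }
    fclose($csv);
    unlink('report_tmp.csv');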

  • Thank you @mlemos, your reply clarified exactly what I was in doubt about, although the other answers were also satisfactory.

3

Implement pagination in the queries, that is, fetch the result 1,000 rows at a time (for example) using MySQL’s LIMIT clause.

SELECT ... LIMIT 0, 1000

SELECT ... LIMIT 1000, 1000

SELECT ... LIMIT 2000, 1000

(etc.)

In my case, I would run a first query just to count the total number of rows and then divide that total by the number of results per query. Then just do what I suggested above :)
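
A minimal sketch of that pagination loop, assuming a PDO connection in $pdo, a hypothetical report_rows table and a hypothetical processRows() function that handles each batch:

    $perPage = 1000;

    // First query: just count the total number of rows.
    $total = (int) $pdo->query('SELECT COUNT(*) FROM report_rows')->fetchColumn();

    // Then fetch the result 1000 rows at a time.
    for ($offset = 0; $offset < $total; $offset += $perPage) {
        $stmt = $pdo->prepare('SELECT * FROM report_rows LIMIT :offset, :limit');
        $stmt->bindValue(':offset', $offset, PDO::PARAM_INT);
        $stmt->bindValue(':limit', $perPage, PDO::PARAM_INT);
        $stmt->execute();

        processRows($stmt->fetchAll(PDO::FETCH_ASSOC)); // at most 1000 rows in memory
    }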

3

I don’t know much about buffers in PHP, but I think the right thing to do in this case would be to "drop" what is already stored in the array and then go fetch the rest.

I think an interesting option here would be to use AJAX: you go to the server, fetch a limited amount of data, print it on screen and, if the request has not yet been fully served, go back to the server via AJAX and fetch another chunk of data.

For this you would only need a flag indicating whether the processing on the server has finished. If it has not, the AJAX call goes back to the server and continues from where it left off.

You can even combine this with @rodrigorigotti’s tip, splitting the query result into chunks and resuming from where it left off.
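
A rough server-side sketch of that idea, using a hypothetical chunk endpoint: the client sends the offset where it stopped, and the response carries a finished flag telling it whether to call back again ($pdo and the report_rows table are assumed, as above):

    $perPage = 1000;
    $offset  = isset($_GET['offset']) ? (int) $_GET['offset'] : 0;

    $stmt = $pdo->prepare('SELECT * FROM report_rows LIMIT :offset, :limit');
    $stmt->bindValue(':offset', $offset, PDO::PARAM_INT);
    $stmt->bindValue(':limit', $perPage, PDO::PARAM_INT);
    $stmt->execute();
    $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);

    header('Content-Type: application/json');
    echo json_encode(array(
        'rows'     => $rows,
        'finished' => count($rows) < $perPage, // no more chunks left
        'next'     => $offset + $perPage,      // offset for the next AJAX call
    ));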

3

If you can split the records, use LIMIT/OFFSET pagination as mentioned by @rodrigorigotti;

otherwise, try one of the following approaches.

Possible solutions

Faster:

Increase the script’s memory limit (use only in urgent cases); put this code at the beginning of the script.

ini_set('memory_limit', '512M');

fetchAll

Changing the code to use fetchAll eliminates the loop with N assignments. Remember that it is necessary to use prepared statements.

$registros = $q->fetchAll();
$total = count($registros);

  • Now I understand what you mean. But when should I use fetchAll and when fetch?

  • The difference between them is that fetch() returns only one record at a time, while fetchAll() returns all of them at once.
