Error 500 when fetching many records


I’m trying to display more than 17,000 database records in a table using a foreach in PHP, but the browser shows HTTP ERROR 500.

 <?php foreach($dados as $dado):?>
   <tr>
      ... // records go here
   </tr>
 <?php endforeach;?>

Interestingly, when I limit the number of records displayed, for example to 5,000, the error does not occur.

Could this be a PHP memory problem?

Maybe I shouldn’t be using the foreach?

Edit:

Apache log:

[Fri Sep 21 10:06:00 2018] [error] [client 192.168.0.100] PHP Fatal error:  
Allowed memory size of 134217728 bytes exhausted (tried to allocate 27316224 
bytes) in Unknown on line 0, referer: 
http://192.168.0.100/index.php/relatorio/
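
The 134217728 bytes in the log correspond to a memory_limit of 128 MB. As a rough sketch, the configured limit and the script’s peak usage can be checked like this:

<?php
// Rough check of the configured limit versus what the script actually uses
echo 'memory_limit: ' . ini_get('memory_limit') . PHP_EOL;
echo 'peak usage:   ' . round(memory_get_peak_usage(true) / 1024 / 1024, 2) . 'MB' . PHP_EOL;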
  • You need to look at the Apache error log.

  • Thanks for the tip.. it really is the memory limit. Should I increase it or handle this in PHP?

  • As for the downvote, it seems a bit exaggerated. Use the [Edit] link and add the error message from the log. As it stands, there’s no way to tell where the problem is.

  • Ideally you wouldn’t fetch all the records at once.

  • I edited the question with the log. Maybe splitting the SQL into parts and fetching it gradually.

  • The solution really is to paginate the data; there isn’t much else to do. The question is: do you really need 17,000 records on the same page? That seems unnecessary. Even 5,000 seems like too many.

  • Yes, it’s a report, and if I use paging the Excel plugin doesn’t work. Is there any way to free memory on each loop iteration?

  • One: you don’t need to build an array with all the database records. Apparently you have every record in $dados, but database drivers usually implement a cursor over the result set, making it possible to keep only one record in memory at a time. Two: if PHP is generating HTML from the records, 17,000 records will produce a gigantic page, but you can make life easier for your server by flushing the output buffer on each iteration (see the sketch after these comments).

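A minimal sketch of the approach from the last comment, assuming a PDO connection in $pdo and a hypothetical relatorio table (adapt the query and columns to the real schema): fetch one row at a time instead of loading everything into $dados, and flush the output buffer as the rows are written.

<?php
$stmt = $pdo->query('SELECT id, nome, valor FROM relatorio');

echo '<table>';
while ($dado = $stmt->fetch(PDO::FETCH_ASSOC)) {
    // only the current row is held in PHP memory
    echo '<tr>';
    echo '<td>' . htmlspecialchars($dado['id']) . '</td>';
    echo '<td>' . htmlspecialchars($dado['nome']) . '</td>';
    echo '<td>' . htmlspecialchars($dado['valor']) . '</td>';
    echo '</tr>';

    // push the HTML generated so far to the client instead of
    // accumulating all 17,000 rows in the output buffer
    if (ob_get_level() > 0) {
        ob_flush();
    }
    flush();
}
echo '</table>';

With the MySQL driver the results may still be buffered on the PHP side by default; setting PDO::MYSQL_ATTR_USE_BUFFERED_QUERY to false makes the driver stream the rows instead of loading them all at once.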

1 answer



The error happens because your script is consuming more memory than PHP is configured to allow. You could raise the limit, but I wouldn’t recommend it; that’s the equivalent of sweeping the dust under the rug.

Ideally you would inspect your code to find where the memory is going. I’ve been through a similar problem, and by refactoring the code I was able to reduce memory usage from 3 GB to 300 MB. One of the changes I made was to swap the foreach for a for, like this:

for ($i = 0, $n = count($dados); $i < $n; $i++) {
    $dado = $dados[$i]; // create a variable to work with in this scope

    ... // your logic here, using $dado

    unset($dados[$i]); // free memory on each iteration
}

You can measure memory consumption by adding memory_get_usage() at the beginning or end of each loop iteration.

echo ((memory_get_usage(true)/1024)/1024) . 'MB';

NOTE: I read today that a foreach ($foo as &$bar) loop (iterating by reference) is lighter on memory with large arrays. But use it with caution, as the same syntax tends to be slower with small amounts of data.
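
A minimal sketch of that by-reference variant, with hypothetical data; the unset() after the loop matters, because otherwise $dado would remain a reference to the last element:

<?php
// hypothetical data just to make the sketch runnable
$dados = [['nome' => 'a'], ['nome' => 'b'], ['nome' => 'c']];

foreach ($dados as &$dado) {
    // iterating by reference avoids copying each row into $dado
    $dado['nome'] = strtoupper($dado['nome']);
}
unset($dado); // break the reference to the last element after the loop

echo ((memory_get_usage(true) / 1024) / 1024) . 'MB';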

  • If you measure the running time, you’ll see that count() weighs more. You’ll hardly ever have a leak problem in a script or in a scripting language.

  • rray, have you done any benchmarking that checked this? With my data volume, this was the lightest implementation in terms of memory usage (although in many other situations the plain foreach is the lightest). In fact, I didn’t even take execution speed into consideration, since the author didn’t report any timeout problems or anything like that.

  • You can see the results at these links: https://3v4l.org/Xm0Jo#output, https://3v4l.org/Z6RVX, https://3v4l.org/efW7a. You can test other options on your own machine; see https://ideone.com/FpmiA5 (a minimal local sketch of that kind of comparison follows these comments).

  • Thanks for the links and for the testing site (I didn’t know it, hehe). But as I said, there are situations where one implementation can be faster than another, and vice versa. In my case, the count() version performed better than foreach in terms of memory consumption. So the ideal is always to test the alternatives and see which one suits your scenario best. Knowledge is ammunition. Peace.
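
A minimal local sketch of that kind of comparison, with hypothetical data (not the exact scripts from the links above). Peak memory is measured per process, so run the script once per variant (e.g. php bench.php foreach, then php bench.php for):

<?php
$variante = $argv[1] ?? 'foreach';

$dados  = array_fill(0, 17000, ['nome' => 'registro', 'valor' => 1.23]);
$inicio = microtime(true);
$soma   = 0;

if ($variante === 'for') {
    for ($i = 0, $n = count($dados); $i < $n; $i++) {
        $soma += $dados[$i]['valor'];
        unset($dados[$i]);            // free each record after use
    }
} else {
    foreach ($dados as $dado) {
        $soma += $dado['valor'];
    }
}

printf("%s: %.4fs, peak of %.2fMB\n",
    $variante,
    microtime(true) - $inicio,
    memory_get_peak_usage(true) / 1024 / 1024
);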
