I have a script running on my server through Supervisord. This script queries the database for requests pending processing and, when it finds any, sends each request's data to an endpoint of a client's web service.
For each request, I use file_get_contents to open a file, because I need to send the image data through base64_encode.
I log an error in case the processing fails. I have received the following error, referring to file_get_contents:

Failed to open stream: Too many open files
I did not understand why this error occurs, since each request is processed one by one. And, in theory, file_get_contents closes the file right after reading its contents.
So why does this error occur? How can I fix this?
Note: I am using Ubuntu 14 and, from what I researched about the error, it seems to be related to a Linux configuration, but I did not understand it very well.
Could it be that the contents of the file overflowed a buffer? – Guilherme Lautert
@Guilhermelautert I read here that it is related to a certain ulimit... – Wallace Maxters
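The ulimit mentioned above is the per-process limit on open file descriptors, and it can be inspected and raised from a shell. A minimal sketch, assuming a default Ubuntu shell (the value 4096 is an illustrative choice, not a recommendation):

```shell
# Show the current soft limit on open file descriptors
# (often 1024 by default on Ubuntu)
ulimit -n

# Raise the soft limit for the current shell session,
# up to the hard limit allowed for this user
ulimit -n 4096
```

Since the script runs under Supervisord, the session limit above would not apply to it; Supervisord has its own `minfds` setting in the `[supervisord]` section of its config file for the descriptor limit it starts with.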
@Guilhermelautert good idea: instead of opening everything at once with file_get_contents, it could be useful to use fopen and carry on with fread... I'll have to change the logic, maybe – Wallace Maxters