An idea of how you can do it:
Create a file that is responsible for the query alone, as if scheduling a task.
In the main file, where you run the query today, do nothing but hand responsibility off to the new file.
One way to do it:
shell_exec('curl "http://localhost/executaquery.php?dados=" > /dev/null 2>&1 &');
This makes the script launch the task, not wait for a response, and stay free. (Both stdout and stderr must be redirected; otherwise shell_exec() waits for the command's output.)
In your executaquery.php you can use sleep() so the script runs through its own timeout; it can take as long as you want, since the calling process no longer depends on it.
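This fire-and-forget pattern can be sketched in plain shell (the 2-second sleep is a stand-in for the long-running executaquery.php; the file names are illustrative):

```shell
# Launch a slow worker in the background; "sleep 2" stands in for a
# long-running query. All output is redirected so nothing blocks.
(sleep 2; echo "query finished" > /tmp/worker_done.txt) > /dev/null 2>&1 &

# The caller gets control back immediately instead of waiting 2 seconds.
echo "caller is free"
```

The parentheses run the worker in a subshell and the trailing & detaches it, so the caller continues right away while the worker keeps running on its own.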
This is just one way to do it, maybe even a bit dirty, but it will solve your problem and open your mind to more distributed processing.
Ideally, in this case, you would learn a little about crontab and other ways to schedule tasks in your application. This will help you build queues for any step of your process.
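As a sketch, a crontab entry (installed with crontab -e; the paths here are hypothetical) that runs a worker script every minute could look like this:

```
# m h dom mon dow  command
# Run the queue worker every minute; paths are examples only.
* * * * * php /var/www/app/query.php >> /var/log/query_worker.log 2>&1
```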
A practical example would be wanting to run hundreds of queries at the same time, without waiting for the previous one to finish.
Note that the &
causes the command to run in the background in the operating system; the launched processes run independently of the calling script, which does not wait for any of them.
for ($i = 1; $i <= 10000; $i++) {
    shell_exec('php query.php > /dev/null 2>&1 &');
}
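A shell equivalent of that loop (scaled down to three workers, with a sleep standing in for the hypothetical query.php) shows that the loop finishes launching without waiting for any worker:

```shell
# Launch several background workers; "sleep 1" stands in for query.php.
for i in 1 2 3; do
  (sleep 1; echo "worker $i done") > /dev/null 2>&1 &
done

# The loop returns immediately; the workers run concurrently.
echo "all workers launched"
```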
I talked a little bit about it in that reply.
I've never seen anything like this with PHP. If you want to let the program flow faster, the timeout interferes a lot with the script flow; perhaps the best way to do this is by controlling these queries on the client side with JavaScript.
– Atila Silva
@Atlasilva The problem is that I run this same query from various places, with the same INSERT, but some of them lock on the query. I needed this for debugging.
– rbz
Understood. Have you already tried checking the operation's memory consumption?
– Atila Silva
To tell the truth, it turned out to be an error in the object, one with commit and another with rollback, but I left the question up to see if such an approach exists...
– rbz