What is the best way to run the same PHP script several times via cron (cPanel), with each run making a random query against a given column of the same MySQL table, limited to 1000 rows per query, without generating so much overhead? (A sketch of the sort of script involved follows the cron listing.)
[cron 1]
* * * * * curl -s http://[::1]/dir/arquivo.php
[cron 2]
* * * * * curl -s http://[::1]/dir/arquivo.php
[cron 3]
* * * * * curl -s http://[::1]/dir/arquivo.php
[cron 4]
* * * * * curl -s http://[::1]/dir/arquivo.php
[cron 5]
* * * * * curl -s http://[::1]/dir/arquivo.php
[cron 6]
* * * * * curl -s http://[::1]/dir/arquivo.php
[cron 7]
* * * * * curl -s http://[::1]/dir/arquivo.php
[cron 8]
* * * * * curl -s http://[::1]/dir/arquivo.php
[cron 9]
* * * * * curl -s http://[::1]/dir/arquivo.php
[cron 10]
* * * * * curl -s http://[::1]/dir/arquivo.php
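For context, here is a minimal sketch of the kind of script these entries invoke, assuming a hypothetical records table with seller_id and status columns (every table, column, and connection detail below is illustrative, not from the question):

<?php
// arquivo.php - illustrative sketch; all names here are assumptions.
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass', [
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
]);

// Pick one seller at random (the sellers table is assumed small enough
// that ORDER BY RAND() is harmless here).
$sellerId = $pdo->query('SELECT id FROM sellers ORDER BY RAND() LIMIT 1')
               ->fetchColumn();

// Fetch up to 1000 unprocessed rows for that seller.
$stmt = $pdo->prepare(
    'SELECT id, payload FROM records
     WHERE seller_id = ? AND status = "pending"
     LIMIT 1000'
);
$stmt->execute([$sellerId]);

foreach ($stmt as $row) {
    // ... process $row, then mark it done so the next run skips it ...
}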
The first thing is to stop running this as a URL. If you saw curl used this way in some post here on SOpt to access the local machine, let us know which one so we can take a look, because it’s absurd. PHP has a command-line executable that is much better suited to this (and it won’t tie up web server workers). See an example here: http://answall.com/a/56171/70 and another one here: http://answall.com/a/124519/70 - As for there being 10 cron entries, I can’t imagine a reason for that. If you can elaborate, it might help clarify the question.
– Bacco
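For reference, the CLI form Bacco is describing replaces the curl call with a direct invocation of the PHP binary; the script path below is an assumption and should point at wherever arquivo.php actually lives:

* * * * * /usr/bin/php /home/user/public_html/dir/arquivo.php

This runs the script in its own process instead of going through the web server.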
Hi @Bacco, thanks for being willing to help. 1. Please do take a look at the URL :) I used /usr/bin/php before switching to curl. Several places suggest both methods, as well as Lynx and wget. What I’m trying to figure out is how to avoid, or at least mitigate, the overhead on the server, if that’s possible. The idea behind the ten entries is to be able to process 1,500,000 rows per day, and that is expected to double soon...
– André Miani
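A rough capacity check on those numbers, assuming each run finishes within its minute: an every-minute entry fires 1,440 times a day, so at 1,000 rows per run,

1 entry x 1,440 runs/day x 1,000 rows/run = 1,440,000 rows/day

A single entry therefore nearly covers the stated 1,500,000 rows/day by itself; ten entries are far more capacity than the workload needs.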
Those tools are usually for accessing third-party resources. If someone uses Lynx, wget, or curl locally when the PHP executable is available, be suspicious. You will simply be tying up the web server and exposing PHP to its execution timeout.
– Bacco
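A small illustration of the distinction Bacco draws: under the CLI SAPI, max_execution_time defaults to 0 (no limit), while web requests are killed at the configured timeout. The guard below is an illustrative addition, not part of the original script:

<?php
// Only allow this batch job to run from the command line (i.e. via cron).
if (PHP_SAPI !== 'cli') {
    http_response_code(403);
    exit("This script must be run via the PHP CLI.\n");
}
// From here on, the run is not subject to max_execution_time,
// so a long batch is not cut off mid-processing.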
But @Bacco, curl aside, is it normal to run the same script n times, each run randomly pulling the records of one seller out of dozens, so that I can process all the records faster?
– André Miani
The more crontab entries, the worse: the parallel tasks will just compete with each other. If something in your PHP is slow, that is what you need to fix. But with what was posted in the question, there isn’t much more to say. You also need to consider whether PHP is the right tool for what you want to do.
– Bacco
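One hedged way to apply this advice: collapse the ten entries into a single every-minute entry and let the script iterate over the sellers itself, with a lock file so overlapping runs never compete. A minimal sketch, assuming hypothetical fetchSellerIds() and processBatch() helpers and an illustrative lock path:

<?php
// Refuse to start if the previous run is still going.
$lock = fopen('/tmp/arquivo.lock', 'c');
if (!flock($lock, LOCK_EX | LOCK_NB)) {
    exit; // previous run still active; let it finish instead of competing
}

// Work through every seller sequentially, one 1000-row batch each.
foreach (fetchSellerIds() as $sellerId) {   // hypothetical helper
    processBatch($sellerId, 1000);          // hypothetical helper
}

flock($lock, LOCK_UN);
fclose($lock);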