Running a function asynchronously is one thing; calling something external without waiting for a reply (or catching the answer later) is another. I will not go into detail because, to be honest, I do not know Guzzle's internals.
PHP only really supports threads through pthreads, but it is still possible to simulate something similar. Personally I see no need to run a function asynchronously in a PHP script that is serving a web page; maybe I do not quite understand the need for this. I think it would be more interesting in a CLI (command-line interface) application written in PHP, since it could perform multiple tasks and run indefinitely (or until all "events" end).
Call a PHP script without waiting for it to respond
Need aside, an example of something similar to "asynchronous" (in fact it is a completely separate process) is calling a PHP script from another script in CLI mode. It would look something like this:
/**
 * Start a script in another process
 *
 * @param string $script  Path to the script
 * @param string $php_exe Path to the PHP interpreter (optional)
 * @param string $php_ini Path to php.ini (optional)
 * @return void
 */
function processPhpScript($script, $php_exe = 'php', $php_ini = null) {
    $script = realpath($script);
    $php_ini = $php_ini ? $php_ini : php_ini_loaded_file();

    if (stripos(PHP_OS, 'WIN') !== false) {
        /* Windows: "start /B" returns immediately, output is discarded in NUL */
        $exec = 'start /B cmd /S /C ' . escapeshellarg($php_exe . ' -c ' . $php_ini . ' ' . $script) . ' > NUL';
    } else {
        /* nix: the trailing "&" backgrounds the process, output is discarded in /dev/null */
        $exec = escapeshellarg($php_exe) . ' -c ' . escapeshellarg($php_ini) . ' ' .
                escapeshellarg($script) . ' > /dev/null 2>&1 &';
    }

    $handle = popen($exec, 'r');

    if ($handle) {
        pclose($handle);
    }
}

processPhpScript('pasta/script1.php');
processPhpScript('pasta/script2.php');
On Windows the scripts are executed as if they had been typed in CMD with the start command, so there is no need to wait:
start /B cmd /S /C "c:\php\php.exe -c c:\php\php.ini c:\documents\user\pasta\script1.php" > NUL
start /B cmd /S /C "c:\php\php.exe -c c:\php\php.ini c:\documents\user\pasta\script2.php" > NUL
In a Unix-like environment the commands are run with the output redirected to /dev/null (and the trailing & backgrounds the process) instead of returning it:
php -c /etc/php/php.ini /home/user/pasta/script1.php > /dev/null 2>&1 &
php -c /etc/php/php.ini /home/user/pasta/script2.php > /dev/null 2>&1 &
This way the script that called processPhpScript does not have to wait for a response.
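As a side note, on nix the same effect can be achieved with a single exec call, since it is the trailing & that detaches the process. A minimal sketch, assuming the same paths as in the example above:

// nix only: exec returns immediately because "&" backgrounds the command
// and the output is discarded in /dev/null
exec('php -c /etc/php/php.ini /home/user/pasta/script1.php > /dev/null 2>&1 &');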
Curl and fsockopen
I believe that when we speak of "asynchronous" in Guzzle we are really talking about external requests whose responses you do not need to wait for (because you do not want them), or about multiple requests that run concurrently and are delivered as each one finishes. Curl itself can do something like this:
<?php
// Start two cURL handles
$ch1 = curl_init('https://pt.stackoverflow.com/');
$ch2 = curl_init('https://meta.pt.stackoverflow.com/');

// This part is only because of SSL; it is all just an example
curl_setopt($ch1, CURLOPT_SSL_VERIFYHOST, 0);
curl_setopt($ch1, CURLOPT_SSL_VERIFYPEER, 0);
curl_setopt($ch2, CURLOPT_SSL_VERIFYHOST, 0);
curl_setopt($ch2, CURLOPT_SSL_VERIFYPEER, 0);
curl_setopt($ch1, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch2, CURLOPT_RETURNTRANSFER, 1);

// Create the multi handle and add the cURL handles
$mh = curl_multi_init();
curl_multi_add_handle($mh, $ch1);
curl_multi_add_handle($mh, $ch2);

// Run the requests simultaneously
$running = null;
do {
    curl_multi_exec($mh, $running);

    if ($running) {
        // Wait for activity instead of busy-looping
        curl_multi_select($mh);
    }
} while ($running);

// Clean up the multi handle
curl_multi_remove_handle($mh, $ch1);
curl_multi_remove_handle($mh, $ch2);
curl_multi_close($mh);

// Get the contents
$response1 = curl_multi_getcontent($ch1);
$response2 = curl_multi_getcontent($ch2);

// Display the responses
echo $response1, PHP_EOL;
echo $response2, PHP_EOL;
Now, if you look closely, we wanted the two requests at once, yet to read the responses we still had to wait for everything to finish. I believe Guzzle's requestAsync works with something like a Promise: whatever finishes is handed to a callback. I do not know Curl in depth, but it does seem possible to check which handle has finished by polling curl_multi_info_read; a sketch of that idea follows.
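This is only a sketch of my assumption of how it can be done (not tested exhaustively). It replaces the do/while loop and the "get the contents" part of the previous example, reusing the same $mh, $ch1 and $ch2 with CURLOPT_RETURNTRANSFER enabled; each response becomes available as soon as its own handle completes:

// Variant of the loop above (assumes $mh, $ch1 and $ch2 were created and
// added exactly as in the previous example)
$running = null;

do {
    curl_multi_exec($mh, $running);

    if ($running) {
        curl_multi_select($mh);
    }

    // curl_multi_info_read reports each handle as soon as it is done
    while ($info = curl_multi_info_read($mh)) {
        if ($info['msg'] === CURLMSG_DONE) {
            $handle = $info['handle'];

            // This response is ready even if the other request is still running
            echo curl_multi_getcontent($handle), PHP_EOL;

            curl_multi_remove_handle($mh, $handle);
            curl_close($handle);
        }
    }
} while ($running);

curl_multi_close($mh);

My other suggestion uses fsockopen; it is just a basic example: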
<?php
/**
 * Open a socket and send a GET request without reading the response yet
 */
function createRequest($url, &$errorno, &$errorstr) {
    $parsed = parse_url($url);

    $isHttps = $parsed['scheme'] === 'https';
    $host = ($isHttps ? 'ssl://' : '') . $parsed['host'];
    $port = isset($parsed['port']) ? $parsed['port'] : ($isHttps ? 443 : 80);
    $path = isset($parsed['path']) ? $parsed['path'] : '/';

    $socket = fsockopen($host, $port, $errorno, $errorstr);

    if ($socket) {
        $out  = "GET " . $path . " HTTP/1.1\r\n";
        $out .= "Host: " . $parsed['host'] . "\r\n";
        $out .= "Connection: close\r\n\r\n";

        fwrite($socket, $out);

        return $socket;
    }

    return false;
}

/**
 * Read a bit of each pending response; call $done for each one that finished
 */
function checkStatus(&$promisses, \Closure $done) {
    if (empty($promisses)) {
        return false;
    }

    $nocomplete = false;

    foreach ($promisses as &$promisse) {
        if (feof($promisse['socket']) === false) {
            $nocomplete = true;
            $promisse['response'] .= fgets($promisse['socket'], 1024);
        } elseif ($promisse['complete'] === false) {
            $promisse['complete'] = true;
            $done($promisse['url'], $promisse['response']);
        }
    }

    return $nocomplete;
}

/**
 * Fire all requests and keep polling until every one has completed
 */
function promisseRequests(array $urls, \Closure $done, \Closure $fail)
{
    $promisses = array();

    foreach ($urls as $url) {
        $current = createRequest($url, $errorno, $errorstr);

        if ($current) {
            $promisses[] = array(
                'complete' => false,
                'response' => '',
                'socket' => $current,
                'url' => $url
            );
        } else {
            $fail($url, $errorno, $errorstr);
        }
    }

    $processing = true;

    while ($processing) {
        $processing = checkStatus($promisses, $done);
    }
}

// Two example URLs (adjust to your environment)
$urls = array(
    'http://localhost/',
    'http://localhost/inphinit/'
);

promisseRequests($urls, function ($url, $response) {
    var_dump('Success:', $url, $response);
}, function ($url, $errorno, $errorstr) {
    var_dump('Failed:', $url, $errorno, $errorstr);
});
In general, what I did was make the requests run at the same time. What I find interesting is that you can take whichever result finishes first and handle it right away, although I admit that is not always useful.
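For comparison, and based only on what the Guzzle documentation describes (I have not dug into its internals, so treat the details as an assumption), requestAsync returns a Promise to which you attach callbacks. A minimal sketch:

<?php
// Sketch only: assumes guzzlehttp/guzzle was installed via Composer
require 'vendor/autoload.php';

$client = new \GuzzleHttp\Client();

$promise = $client->requestAsync('GET', 'https://pt.stackoverflow.com/');

$promise->then(
    function ($response) {
        // Called when the request completes
        echo 'Status: ', $response->getStatusCode(), PHP_EOL;
    },
    function ($reason) {
        // Called if the request fails
        echo 'Failed: ', $reason->getMessage(), PHP_EOL;
    }
);

// The script still has to wait for the transfer to finish at some point
$promise->wait();

Even so, the calling script still blocks at wait(); this is concurrency between requests, not a background thread.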
Where threads would be interesting
Neither of the examples above is a thread: one is a separate process that avoids waiting (since the call is completely detached), and the other is about multiple HTTP requests where whichever finishes first calls the Closure. Speaking of real threads (or close to them), the only place where I see they might be interesting is a PHP script that keeps running continuously, for example a CLI script as I mentioned, or a socket server used for WebSocket (which, by the way, is also CLI).
Take WebSocket as an example: a socket server handles multiple connections and responds to the WebSocket only when you want. Imagine 5 people connect to the socket through WebSocket and make requests; it would be interesting to hand those requests off to threads and deliver them only when finished. Otherwise they would be processed in the order they arrived, and the user who made a simple request would end up waiting for the users who made longer ones. With threads this could improve a little, helping to handle the concurrency (as long as the script is well written, of course).
As soon as possible I will post an example with WebSocket.
Installing Pthreads
It can be installed via PECL, using the command:
pecl install pthreads
If you do not have PECL and you are on Windows, you can download the binaries from http://windows.php.net/downloads/pecl/releases/pthreads/. If you use Mac OS X or Linux, or there is no binary for your PHP version and you do not have PECL, then you will have to compile it yourself after downloading it from https://github.com/krakjoe/pthreads; of course, the PHP executable also has to have been compiled on your machine with the same compiler (I will not go into detail, since that is not the focus of the question).
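Once it is installed (remember that pthreads requires a thread-safe, ZTS, build of PHP and, in recent versions, only loads on the CLI), a minimal sketch just to check that the extension works:

<?php
// Minimal pthreads check: extend Thread and implement run()
class MyTask extends Thread
{
    public function run()
    {
        // This code runs in the new thread
        echo 'Running in thread ', $this->getThreadId(), PHP_EOL;
    }
}

$task = new MyTask();
$task->start(); // spawn the thread
$task->join();  // wait for it to finish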
Please don't ask me to change languages. I'm already considering the possibility, but it's not feasible yet. kkk
– Wallace Maxters
C#, Java, C++, Harbour, etc.
– Maniero
The only "asynchronous" thing in PHP that comes to mind is the HTTP client implementation of the Guzzle people. They provide an HTTP client similar to Restangular (Javascript) that supports asynchronous calls and Promises. If I don’t miss the memory they use
Message Queue
or something like that. Maybe it’s worth taking a look.– nmindz
@nmindz no, Guzzle uses Curl, and it is asynchronous between requests. Even so, the script needs to wait for the process to finish, which can fail because of set_time_limit, in my opinion.
– Wallace Maxters
Either way, asynchronous requests can be made via fsockopen, but that doesn't solve my problem. It doesn't work for sending email or writing logs. It's sad to think of using it for tricks, but it's what we have :\
– Wallace Maxters
@Wallacemaxters got it. I was looking at this and thought it might help. Well, good luck!
– nmindz
I use it in a system. It's great, because Curl does a kind of magic where you don't have to wait for 50 requests one by one; it makes a sort of simultaneous request that returns at different times, but it counts toward the execution time of the script (that is, while this queue doesn't finish, the script keeps running).
– Wallace Maxters
https://stackoverflow.com/questions/13846192/php-threading-call-to-a-php-function-asynchronously
– Don't Panic
For those who did not understand: the problem with PHP is that it is a scripting language, so every time you try to do something complex you will have difficulties. It is a case of using a hammer on a screw. https://en.wikipedia.org/wiki/Law_of_the_instrument
– Maniero
Guys, I think I'm confusing things. I asked this question here. Maybe I've been talking nonsense :\
– Wallace Maxters