What can cause file_get_contents to give "timeout" error?

I asked a question about this here before, but since that problem seemed too specific, the solution would need to be something more generic.

What occurred to me in that question is that I'm having trouble with the function file_get_contents.

When I make a request through it, an error is always returned:

file_get_contents('https://getcomposer.org/versions') 

The return is:

PHP warning: file_get_contents(https://getcomposer.org/versions): failed to open stream: Connection timed out on line 1

However, when I open this URL in the browser, everything works perfectly. What's more, if I try to request the URL https://www.google.com, the same problem occurs.

But the strange thing is that not all URLs cause this problem.

If I do this, it works:

 file_get_contents('/')

I thought at first the problem was with the https urls, but that doesn’t seem to be it.

If I use cURL, it works, but I don't want to use it, because in this case I depend exclusively on file_get_contents.

What can cause file_get_contents to give a timeout error on some specific URLs, even though they open normally in the browser?


Note: I would not like answers like "use cURL as an alternative solution", since I really need to use file_get_contents for this purpose (besides, an answer with cURL would not answer my question).
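For anyone hitting the same symptom, a minimal sketch that surfaces the underlying error instead of only the warning (the URL is the one from the question; the 10-second timeout is an assumption so the test fails fast instead of hanging):

```php
<?php
// Suppress the warning with @, then read the real error from error_get_last().
$url = 'https://getcomposer.org/versions';

$context = stream_context_create([
    'http' => ['timeout' => 10], // assumption: fail fast for diagnosis
]);

$body = @file_get_contents($url, false, $context);

if ($body === false) {
    $err = error_get_last();
    echo 'Request failed: ', $err['message'] ?? 'unknown error', PHP_EOL;
} else {
    echo 'Got ', strlen($body), ' bytes', PHP_EOL;
}
```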


  • Here file_get_contents works with no problem. Try a cURL request.

  • Dude, I already told you: look at max_execution_time. And another parameter too: default_socket_timeout.

  • As for your specific example, echo file_get_contents('https://getcomposer.org/versions'); works fine here, and the same with Google, but I'm looking for some service where this error occurs to see if I can answer your question.

  • Damn, I figured out what the problem is; the answer might hold for both questions, hahaha

  • Ugh, I found nothing!

  • Where are you trying to use this code? If possible, share the snippet of code that is causing this problem.

  • @Carlosfernandes regardless of the snippet, this happens.

  • Is this problem occurring on the local server or a remote one?

  • I'm using it locally, on the command line.

  • Spin up a virtual machine and try to use file_get_contents there. From what I've read, there is a good chance this is a problem in the network or in the PHP configuration.


3 answers

1

There are three possibilities:

  1. A firewall, proxy, or other blocking software configured on your machine or network (somewhat difficult to verify)

  2. As of version 5.6, PHP handles SSL quite differently (http://php.net/manual/en/migration56.openssl.php), and the stream context sometimes needs to be configured beforehand for certificates to work properly.

    I understand you said that https://answall.com works, but the point is that a certificate can work fine for one website and not for another. That is not necessarily a failure on your side; there may be some problem with the getcomposer certificate or with your machine. Either way, you can put it to the test like this:

    $arrContextOptions = array(
        'ssl' => array(
            // Diagnosis only: never leave verification disabled in production.
            'verify_peer' => false,
            'verify_peer_name' => false
        )
    );
    
    $response = file_get_contents('https://getcomposer.org/versions', false, stream_context_create($arrContextOptions));
    
    echo $response;
    

    If this works, the problem is in the certificates, perhaps on your machine.

  3. The DNS used by your internet provider (I'll just call it ISP), or the ISP itself. Yes, this is complicated. For example, at the company where I work the ISP fails on some sites; even the company's own site fails or is too slow to open, but the moment I use 4G the site loads normally. This cannot be solved locally; to this day I have problems where I work. In short, it may be that the Composer server conflicts or fails precisely with your ISP.
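A sketch to separate a DNS problem from a TLS problem (the host name is the one from the question; the 5-second timeout is an assumption):

```php
<?php
$host = 'getcomposer.org';

$ip = gethostbyname($host); // returns the input unchanged on failure
if ($ip === $host) {
    echo "DNS resolution failed for $host", PHP_EOL;
} else {
    echo "$host resolves to $ip", PHP_EOL;
    // Raw TCP connection on port 443: isolates routing from certificates.
    $sock = @fsockopen($ip, 443, $errno, $errstr, 5);
    echo $sock ? 'TCP connect OK' : "TCP connect failed: $errstr", PHP_EOL;
    if ($sock) {
        fclose($sock);
    }
}
```

If DNS fails here but the browser works, the browser is likely using a different resolver than PHP.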


Personally, of course, the problem may really be the secure-connection certificates failing precisely with the Composer server. I have seen this before; I had similar problems with npm.

What you can do to fix HTTPS

You can download http://curl.haxx.se/ca/cacert.pem; you can use wget:

$ wget http://curl.haxx.se/ca/cacert.pem

And then point to it in your php.ini like this (available since PHP 5.6):

openssl.cafile=/path/cacert.pem

Restart Apache (if you are using Apache).
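If editing php.ini is not an option, the same bundle can be pointed at per request through the SSL stream context (a sketch; adjust the path to wherever you saved cacert.pem):

```php
<?php
// Per-request CA bundle, no php.ini change needed.
$context = stream_context_create([
    'ssl' => [
        'cafile'      => '/path/cacert.pem', // assumption: your real path here
        'verify_peer' => true,
    ],
]);

$response = @file_get_contents('https://getcomposer.org/versions', false, $context);
```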

Very important note

On Linux distributions, the php.ini for the CLI is usually separate, and Composer runs using the CLI. The CLI php.ini is usually in:

/etc/php/5.6/cli/php.ini

Then you will have to edit both /etc/php/5.6/cli/php.ini and the php.ini used by Apache, and add openssl.cafile=/path/cacert.pem to both.

In Apache it must be something like /etc/php5/apache2/php.ini

In older versions of Ubuntu, folders like /etc/php/5.4/, /etc/php/5.5/ and /etc/php/5.6/ are replaced by /etc/php5/.
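To confirm which php.ini each side actually loads, a small sketch like this can be run once with the CLI binary and once from an Apache-served page:

```php
<?php
// Shows which php.ini is in effect for the binary running this script,
// and whether the openssl.cafile setting made it in.
echo 'Loaded php.ini: ', php_ini_loaded_file() ?: '(none)', PHP_EOL;
echo 'openssl.cafile: ', ini_get('openssl.cafile') ?: '(not set)', PHP_EOL;
```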

0

$streamContext = stream_context_create(array(
    'http' => array(
        'method' => 'GET',
        'timeout' => 30
     )
));

Where the timeout key is in seconds.

Afterward:

file_get_contents('https://getcomposer.org/versions', false, $streamContext);
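If the timeout persists, one cheap thing to rule out (an assumption on my part, not something this answer verifies) is a server that drops requests carrying no User-Agent header, which can look exactly like a timeout:

```php
<?php
$streamContext = stream_context_create([
    'http' => [
        'method'  => 'GET',
        'timeout' => 30, // seconds, as in the answer above
        'header'  => "User-Agent: PHP\r\n", // some servers reject empty agents
    ],
]);

$body = @file_get_contents('https://getcomposer.org/versions', false, $streamContext);
```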
  • I’m testing it here. It’s still stuck! The weird thing is that the page is tiny...

  • Is your PHP version 5.6 or higher?

  • Try defining a User-Agent too, @Wallacemaxters: http://answall.com/questions/164682/php-n%C3%A3o-can-copy-file-remote-but-download-by-browser

  • Yes, it's PHP 7. The strange thing is that I disconnected and reconnected the internet, and the URL was accessible again. Now I want the problem back so I can find out what was wrong!

  • I cleared the DNS cache and added the Composer certificate there (only that second step didn't change much). I'm trying to figure out how the network can matter so much that one URL works and another doesn't. It's not blocked here!

  • @Wallacemaxters, could it be an IP problem? Did the IP change when you reconnected?

  • @ No. The problem is solved when I reconnect. But when I run Composer (which makes a lot of requests), it gets "blocked" again after a while.

  • Is there a Squid proxy running the whole time?


0

Start by trying to include www.

You can also try ini_set('default_socket_timeout', 900); (or the stream context from Zooboomafoo's answer) to increase the time limit, which by default is 60 seconds.

Make sure your server can access external resources and that there is no firewall restriction.

And you can try disabling IPv6, or configuring the socket context with bindto.
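The bindto idea sketched out: forcing IPv4 rules out broken IPv6 routing, which often shows up as a timeout while browsers silently fall back to IPv4 (the URL is the one from the question):

```php
<?php
// "0:0" means: bind to any local IPv4 address, any port, which forces the
// connection over IPv4 even when the host has an AAAA record.
$context = stream_context_create([
    'socket' => ['bindto' => '0:0'],
]);

$body = @file_get_contents('https://getcomposer.org/versions', false, $context);
```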

If that doesn't solve it, use cURL... (I know! ;))

  • The problem with increasing the time is that it doesn't solve anything: it just spends longer trying to access the URL, then generates that warning. I'm checking whether it's a network problem, because after disconnecting and reconnecting, it went back to normal (° °)

  • Have you tried every option I suggested?
