How to download multiple files using wget?


Guys, I need to download all the images from a folder on a server!

I mean, there's a website: www.tarararara.com/images

And there are 50 images! How do I download these images and put them in a folder on my server using the wget command?

I need to download all the images from the site and put them in the images folder on localhost!

Is there any way to create a loop? Or something like that?

  • Do you have FTP access, or is it a third-party website?

  • It's a third-party website.

  • For a third-party site there is nothing you can do unless the image names follow some pattern. In that case, the way to go is to use things like wget, file_get_contents(), fopen, cURL, etc.

  • The names do sort of follow a pattern, but like I said, I can't use wget; I use Vertrigo on Windows!! And does file_get_contents work for saving them into a folder?

  • So you want to do it in PHP, right? You can use any of these PHP functions that stream and read URLs. As for wget, it is a Linux tool, but it has native equivalents on Windows, and there may even be a wget build for Win32/64. Try searching Google for "windows wget".

  • I just want to make a quick script to download the images into the folder and that's it, then delete it afterwards, you know? And say I do find a wget for Windows, what would the command be to get everything from the server?

  • Even with wget it can still be done, but you still have to request them one by one in a loop. Have you tried doing anything in PHP? Post what you have tried so far in the question.

  • Nothing, because I know that some things are pretty much impossible on Windows!

  • It has nothing to do with Windows. Decide what you need: do you want to do it with wget or with PHP? So far it is not clear which tool or environment you want to use.

  • At first it was going to be in PHP, since I use Windows! But then someone recommended installing a package that simulates a Linux terminal! So I figured I could now use wget, but it still gave an error, and I need to download all the images in one go! But now I know how to do it with wget! Question solved!


2 answers



wget -nd -np -r -P /onde.vc.quer.salvar -A jpeg,jpg,bmp,gif,png http://odominio/imagem
  • The -nd keeps wget from recreating the remote directory structure locally; every file is saved directly into the target directory.
  • The -np prevents the recursion from ascending to the parent directory; without it, wget may try to download every image on the site.
  • The -r enables recursion into the child directories.
  • The -P sets the directory where all the images will be placed.
  • The -A restricts the download to the listed file types.

The possibilities are extensive, and the manual explains very well everything it can do:

https://www.gnu.org/software/wget/manual/wget.html
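
Applied to the URL from the question, the command would look something like this (just a sketch: it assumes the server exposes an index page under /images that links the files, and that ./images is the local folder you want them saved to):

wget -nd -np -r -P ./images -A jpeg,jpg,bmp,gif,png http://www.tarararara.com/images/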

  • Dude, I'm using sed, I don't know if you know it, but wget really didn't work well for me; I did the worst thing of my life: I opened 100,000 tabs with all the images and dragged them all into my folder, it was about 80 images, haha. But next time I'll know how to do it.


Marcos Célio has already answered the syntax for doing this with wget. Since the question is tagged php, this answer is based on that language; you can try the following:

  1. Download the contents of the page with file_get_contents or cURL, or any other way you know.
  2. Extract the links from the page; you can use the preg_match function or parse the HTML with DOMDocument (a minimal alternative using file_get_contents and preg_match_all is sketched after the complete example at the end of this answer).
  3. Download each file from its URL; you can use file_put_contents, or cURL together with fopen to open the file for writing.

Do the following:

  1. To download the content of the page, with cURL:

    function obterPagina($url) {
        $curl = curl_init();
        curl_setopt($curl, CURLOPT_URL, $url);
        curl_setopt($curl, CURLOPT_RETURNTRANSFER, true); // Return the response as a string instead of printing it

        $pagina = curl_exec($curl);
        curl_close($curl);

        return $pagina; // false on failure
    }
    

    Note: You can include more options depending on your needs; a short sketch with a few common ones appears after step 3 below, and the PHP cURL documentation has more information.

  2. To extract the links of the page, with DOMDocument:

    function obterLinks($url, $pagina, $extensoes = ['gif', 'jpg']) { // Accepted extensions
        $dom = new DOMDocument;
        $links = [];
    
        if ($dom->loadHTML($pagina) !== false) {
            foreach ($dom->getElementsByTagName('a') as $link) { // Loop over every element with the "a" tag
                $href = $link->getAttribute('href');
                $extensao = pathinfo($href, PATHINFO_EXTENSION);
    
                if (in_array($extensao, $extensoes)) {
                    $links[] = $url . $href;
                }
            }
            return $links;
        }
        return false;
    }
    
  3. To download the file, also with cURL:

    function baixarArquivo($url, $salvarComo, $timeout = 3600) {
        $fp = fopen($salvarComo, 'w'); // Open the destination file for writing

        if (!$fp)
            return false;

        $curl = curl_init(); // Create the cURL handle only after the file is open

        $opts = array(CURLOPT_URL     => $url,
                      CURLOPT_FILE    => $fp,
                      CURLOPT_TIMEOUT => $timeout); // Set the timeout; the default here is 1 hour
    
        curl_setopt_array($curl, $opts);
    
        $ret = curl_exec($curl);
        curl_close($curl);
        fclose($fp);
    
        return $ret !== false;
    }
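
As noted in step 1, more cURL options can be set depending on the server. Purely as an illustration (which options you actually need will vary, and the name obterPaginaComOpcoes is just a hypothetical variant of the function above), the page download could look like this:

function obterPaginaComOpcoes($url, $timeout = 30) {
    $curl = curl_init();

    curl_setopt_array($curl, array(
        CURLOPT_URL            => $url,
        CURLOPT_RETURNTRANSFER => true,          // Return the body as a string
        CURLOPT_FOLLOWLOCATION => true,          // Follow redirects
        CURLOPT_CONNECTTIMEOUT => 10,            // Seconds to wait for the connection
        CURLOPT_TIMEOUT        => $timeout,      // Overall timeout in seconds
        CURLOPT_USERAGENT      => 'Mozilla/5.0', // Some servers refuse requests without a user agent
    ));

    $pagina = curl_exec($curl);
    curl_close($curl);

    return $pagina;
}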
    

To use them, do the following:

$url = "http://www.tarararara.com/images/";
$pagina = obterPagina($url);

if ($pagina) {
    $links  = obterLinks($url, $pagina);

    if ($links) {
        foreach ($links as $link) {
            var_dump( baixarArquivo($link, basename($link)) ); // Saves in the same folder as the script
        }
    }
}
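
If you prefer not to use DOMDocument, a minimal alternative (mentioned in the step list above) is to fetch the page with file_get_contents and pull the links out with preg_match_all. This is only a rough sketch; it assumes the page links the images with simple, relative href values ending in an image extension:

$url    = "http://www.tarararara.com/images/";
$pagina = @file_get_contents($url); // false on failure

if ($pagina !== false) {
    // Capture href values that end in an image extension
    preg_match_all('/href="([^"]+\.(?:gif|jpe?g|png))"/i', $pagina, $matches);

    foreach ($matches[1] as $href) {
        // Save each image next to the script, as in the example above
        file_put_contents(basename($href), file_get_contents($url . $href));
    }
}

Note that allow_url_fopen must be enabled in php.ini for file_get_contents to read remote URLs.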
