Multi-threaded HTTP downloader



I'm looking for a command-line tool that can download several URLs using multiple threads, for example:

wget2 -n 5 http://stackoverflow.com/ http://askubuntu.com/ http://bobo.com/

-n = number of threads. I came across Axel, but when I give it several URLs, it only downloads one of them.

I will be downloading HTML files.


Aria2 is the best solution for this if you want a CLI tool. Aria2 supports multiple connections, multiple threads and multiple sources.

Another benefit of Aria2 is that it works as a plugin for uGet, so you can use the power of Aria2 with a nice, easy-to-use GUI.

Aria2 - CLI - http://aria2.sourceforge.net/

uGet - GUI - http://ugetdm.com

  • Multiple connections are adjustable in the GUI when adding a download.

Update: based on OP's batch needs

uGet supports batch downloads via .txt files, .html files, the clipboard and several other methods. While admittedly not a CLI, I think it solves the problem quite well. I created a video tutorial explaining the various methods; the GUI has changed since that recording, but the functionality is still relevant.
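If you want to stay on the command line, aria2's own CLI (aria2c) can cover both the multi-URL case and the batch case. A sketch, with the OP's example URLs and illustrative flag values:

```shell
# -Z downloads each URL as a separate file (without it, aria2c treats
#  multiple URLs on one line as mirrors of the same file).
# -j 3 runs up to 3 downloads at once; -x 5 opens up to 5 connections
#  per server for each download.
aria2c -Z -j 3 -x 5 http://stackoverflow.com/ http://askubuntu.com/ http://bobo.com/

# Batch mode: read URLs, one per line, from a file.
aria2c -i urls.txt -j 3
```

The -Z flag matters here precisely because of the mirrors-vs-distinct-URLs issue other tools have.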



None of the suggestions above, or those linked from them, accept two distinct URLs; they only take URLs that are mirrors of the same file.

I've found a few programs that do this:

The best is puf (apt-get install puf); run it as puf url1 url2, and so on.

Then there is HTTRACK, which requires a lot of tinkering and has some limits I can't get past (speed and connection limits).

DownThemAll for Firefox is very good if you don't need a command line app.

UPDATE

I've since found puf has a tendency to crash. The best solution is to create a .txt file with URLs on new lines, e.g.

http://google.com/
http://yahoo.com/

Save that as urls.txt (for example), then run the command:

cat urls.txt | xargs -n 1 -P 10 wget -q

-n 1 passes one URL from the file to each wget invocation.

-P 10 sets the number of downloads to run in parallel (note the capital P; lowercase -p is a different xargs option).
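To sanity-check the fan-out without touching the network, you can substitute echo for wget: each input line becomes one invocation, with up to 10 running concurrently (output order may vary under -P):

```shell
# Same pipeline shape as the wget command, but harmless.
printf 'http://google.com/\nhttp://yahoo.com/\n' | xargs -n 1 -P 10 echo
```

Each URL is printed on its own line, confirming that xargs launches one process per line of input.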