I'm using wget to download website content, but wget downloads the files one by one.
How can I make wget download using 4 simultaneous connections?
Wget does not support multiple socket connections to speed up a download.
I think we can do a bit better than gmarian's answer.
The correct way is to use aria2:
aria2c -x 16 -s 16 [url]
#          |    |
#          |    |
#          |    |
#          ---------> the number of connections here
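If you have a whole list of URLs, aria2 can also read them from a text file. A minimal sketch, assuming links.txt holds one URL per line; -i reads the input file and -j caps how many downloads run at once:
# fetch every URL in links.txt, up to 4 downloads at a time,
# each split across up to 16 connections
aria2c -i links.txt -j 4 -x 16 -s 16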
Use aria2:
aria2c -x 16 [url]
#         |
#         |
#         |
#         ----> the number of connections
http://aria2.sourceforge.net
I love it!
I strongly suggest using httrack.
ex: httrack -v -w http://example.com/
It will create a mirror with 8 simultaneous connections by default. HTTrack has tons of options to play with; have a look.
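If the default of 8 connections is not what you want, HTTrack also lets you set the count explicitly. A small sketch, assuming the -c option (number of simultaneous connections) and a placeholder URL:
# mirror the site with 4 simultaneous connections instead of the default 8
httrack -v -w -c4 http://example.com/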
Try pcurl:
http://sourceforge.net/projects/pcurl/
It uses curl instead of wget and downloads in 10 parallel segments.
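For a rough idea of what pcurl does under the hood, plain curl can fetch byte ranges in parallel, assuming the server honours HTTP range requests; the URL, byte offsets, and file names below are placeholders:
# grab two halves of the same file in parallel, then stitch them together
curl -r 0-49999999 -o part1 "http://example.com/file.iso" &
curl -r 50000000-  -o part2 "http://example.com/file.iso" &
wait                               # wait for both background transfers
cat part1 part2 > file.iso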
Call wget for each link and set it to run in the background.
I tried this Python code (note: the leading ! runs a shell command in IPython/Jupyter; a plain-shell alternative with xargs is sketched below):
with open('links.txt', 'r') as f1:   # Open links.txt in read mode
    list_1 = f1.read().splitlines()  # Get every line in links.txt
    for i in list_1:                 # Iterate over each link
        !wget "$i" -bq               # Call wget in background, quiet mode
Parameters:
-b : run in the background
-q : quiet mode (no output)
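A plain-shell alternative that caps the concurrency, assuming GNU xargs and the same links.txt with one URL per line:
# run at most 4 wget processes at a time, one URL each
xargs -n 1 -P 4 wget -q < links.txt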
Another program that can do this is axel.
axel -n <NUMBER_OF_CONNECTIONS> URL
For basic HTTP auth:
axel -n <NUMBER_OF_CONNECTIONS> "https://user:password@domain.tld/path/file.ext"
See the Ubuntu man page for more options.