Multi-thread wget
First of all, I tried to search for a similar tip around here, but since I couldn't find the search option and didn't see anything like this on the main page for shell/Linux, I'm posting it.
So, have you ever wanted to multi-thread a wget download to make it faster? I know about aget, but for some reason it doesn't work well on Mac OS X, so I came up with this:
for i in {1..3}; do wget -rnp -c -N [url] & done
This will start three instances of wget that will be "linked" and all download the same file. The options responsible for this are -rnp and -N. The -c option is there to "continue" the download in case something fails, like a dropped connection or a blackout; then you just run the command again and it will pick up from where it stopped :)
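If you prefer the same idea as a small reusable script, here is a minimal sketch. The URL and instance count below are placeholders I made up for illustration, not part of the original tip, and the flags are spelled out separately (-r -np) for readability:

#!/usr/bin/env bash
# Minimal sketch: launch several wget instances against the same URL
# and wait for all of them to finish.
url="http://example.com/big-file.iso"   # placeholder URL -- substitute your own
instances=3

for i in $(seq 1 "$instances"); do
    # -r -np : recursive, but never ascend to the parent directory
    # -c     : resume a partial download instead of starting over
    # -N     : only re-fetch when the remote file is newer than the local copy
    # &      : run each wget in the background so the instances overlap
    wget -r -np -c -N "$url" &
done

wait  # block until every background wget has exited

The wait at the end just keeps the script from returning before the background downloads are done.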
Hope it helps someone.
Written by Filipe Kiss
5 Responses
Very good!
Also check out axel: http://alioth.debian.org/projects/axel/
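For comparison, axel splits a single download across several connections by itself, so no shell loop is needed. A rough example (the URL is a placeholder, and -n sets the number of connections):

# axel opens multiple parallel connections to one file on its own
axel -n 4 http://example.com/big-file.iso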
What is the & in [url]& for?
It's very helpful, thanks a lot.