GNU Wget wins me over again and again with its sheer effectiveness compared to other download managers.
Let's say we have a case where we want to download a website for local viewing.
Here's how to do it with the wget command:
wget --mirror --convert-links --backup-converted --timestamping --adjust-extension --wait=5 --limit-rate=20k http://www.example.com
Let's go through that step by step:
--mirror (-m): turns on options suited for mirroring a remote website: recursive download, infinite recursion depth, and timestamping.
--convert-links (-k): does exactly what it says: converts the links in downloaded documents to point to the local copies, so the site can be browsed offline.
--backup-converted (-K): before converting a file's links, backs up the original version with a .orig suffix.
--timestamping (-N): does not re-download files that have not changed on the server since the last run. This is especially useful if the download is interrupted for some reason and has to be restarted.
--adjust-extension (-E): saves HTML/CSS documents with the proper extension. Useful for URLs such as http://example.com/category/, which would otherwise be saved without an .html suffix.
--wait=5 (-w 5): waits five seconds between requests. Be nice and lighten the load on the server.
--limit-rate=20k: limits the download speed to the given rate, here 20 KB/s.
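Putting the pieces above together, here's a minimal wrapper script (a sketch: the script name mirror.sh and the DRY_RUN toggle are my own additions, not wget features). By default it only prints the command it would run, so you can double-check the flags before any downloading starts:

```shell
#!/bin/sh
# mirror.sh -- hypothetical wrapper around the wget invocation above.
# Pass the site URL as the first argument; defaults to the example URL.
URL="${1:-http://www.example.com}"

# The full mirror command, exactly as discussed above.
CMD="wget --mirror --convert-links --backup-converted --timestamping --adjust-extension --wait=5 --limit-rate=20k $URL"

# DRY_RUN defaults to 1: print the command instead of running it,
# so nothing is downloaded until you explicitly set DRY_RUN=0.
DRY_RUN="${DRY_RUN:-1}"
if [ "$DRY_RUN" = "1" ]; then
  echo "$CMD"
else
  $CMD
fi
```

Preview with `sh mirror.sh http://www.example.com`, then run `DRY_RUN=0 sh mirror.sh http://www.example.com` when you're ready to actually mirror the site.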
So that's it.
A shorter equivalent would be:
wget -m -k -K -N -E -w 5 --limit-rate=20k http://www.example.com
For more, consult the manual, available online at http://www.gnu.org/software/wget/manual/wget.html