How to download a website for local viewing via wget
GNU wget wins me over again and again due to its sheer effectiveness compared to other download managers.
Let's say we have a case where we want to download a website for local viewing.
Here's how to do it with a single wget command:
wget --mirror --convert-links --backup-converted --timestamping --adjust-extension --wait=5 --limit-rate=20k http://www.example.com
Let's go through that step by step:
--mirror
This turns on a set of options suited to copying a remote website in one go
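According to the wget manual, --mirror is currently shorthand for recursion, timestamping, infinite recursion depth, and keeping FTP directory listings, so that part of the command could also be spelled out like this:

# the long-hand equivalent of --mirror
wget -r -N -l inf --no-remove-listing http://www.example.com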
--convert-links
Does exactly what it says: converts the links in the downloaded pages to point at the local copies, so the site can be browsed offline
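The same idea works for a single page rather than a whole site. A rough sketch (the page URL is just a placeholder), pairing it with --page-requisites so images and CSS come along too:

# grab one page plus the assets it needs, rewriting links for offline viewing
wget --convert-links --page-requisites http://www.example.com/some-page.html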
--backup-converted
Tells wget to back up the original HTML files before the conversion
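The backups get a .orig suffix, so assuming a page called index.html had its links rewritten, you could compare the two versions afterwards:

# see exactly what the link conversion changed (index.html is just an example)
diff index.html.orig index.html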
--timestamping
Do not re-download files that are already on disk and have not changed on the server. This is especially useful if the download is interrupted for some reason.
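In practice this means an interrupted mirror can simply be restarted with the very same command, and anything already downloaded and unchanged is skipped:

# re-run after an interruption; up-to-date files are not fetched again
wget --mirror --convert-links --backup-converted --timestamping --adjust-extension --wait=5 --limit-rate=20k http://www.example.com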
--adjust-extension
Save HTML/CSS documents with the proper file extensions. Useful for URLs such as http://example.com/category/ whose pages would otherwise be saved without a recognizable HTML extension
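For instance, a page served as HTML from a URL with no extension at all (the path below is made up) gets .html appended, so the local copy opens properly in a browser:

# saved locally as page.html rather than just page
wget --adjust-extension http://www.example.com/page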
--limit-rate=20k
Limit the download speed to the specified rate (20 kilobytes per second here)
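The rate accepts the usual k and m suffixes, so you can tighten or loosen the cap as needed. For example, against a made-up file, to allow roughly one megabyte per second:

# cap the transfer at about 1 MB/s
wget --limit-rate=1m http://www.example.com/archive.zip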
--wait=seconds
Wait the specified number of seconds between requests (5 in the command above). Be nice and lighten the load on the server.
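If you want the pauses to look a little less mechanical, --random-wait varies each pause between 0.5 and 1.5 times the --wait value. One possible variation of the command above:

# pause around 5 seconds between requests, randomized
wget --mirror --wait=5 --random-wait --limit-rate=20k http://www.example.com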
So that's it.
A shorter equivalent would be:
wget -m -k -K -N -E -w 5 --limit-rate=20k http://www.example.com
For more, consult the manual, available online at http://www.gnu.org/software/wget/manual/wget.html
Happy downloading!!
Written by Kevin Ndung'u
3 Responses
I agree wget is pretty equivalent to the knees of a bee, but what's the alternative in the CLI world, anyway? I'm not sure I've even heard of another tool that does this kind of thing.
Yeah, you're right. I haven't heard of anything that comes close. I was comparing it to the graphical download managers I relied on for far too long.
SiteSucker, if you need a client to do the same.