Last Updated: August 23, 2016
kevgathuku

How to download a website for local viewing via wget

GNU wget wins me over again and again due to its sheer effectiveness compared to other download managers.

Let's say we have a case where we want to download a website for local viewing.

Here's how to do it with the wget command:

wget --mirror --convert-links --backup-converted --timestamping --adjust-extension --wait=5 --limit-rate=20k https://example.com/

(Replace https://example.com/ with the URL of the site you want to download.)

Let's go through that step by step:


--mirror

Turns on a set of options suited to mirroring remote websites: recursive retrieval, infinite recursion depth, and timestamping, among others.


--convert-links

Does exactly what it says: converts the links in the downloaded documents to point at the local copies, enabling offline viewing.
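In miniature, this is the kind of rewrite --convert-links performs: an absolute link to a page that was mirrored becomes a relative local path. (The example.com URL and the single sed rewrite here are just an illustration; wget's real conversion handles many more cases.)

```shell
# an absolute link into the mirrored site...
html='<a href="https://example.com/about.html">About</a>'

# ...is rewritten to a relative path so it resolves offline
local=$(printf '%s\n' "$html" | sed 's|https://example.com/|./|')
echo "$local"   # <a href="./about.html">About</a>
```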


--backup-converted

Tells wget to back up the original HTML files (with a .orig suffix) before converting their links.


--timestamping

Do not re-download files that have not changed on the server since the local copy was saved. This is especially useful if the download is interrupted for some reason and you need to re-run the command.
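The timestamping decision can be sketched in a few lines of shell: the local file's modification time is compared against the server's Last-Modified time, and the download is skipped when the local copy is at least as new. (The two timestamps below are made-up values for illustration.)

```shell
# hypothetical Unix timestamps: mtime of the saved file vs. the
# Last-Modified header the server reports for the same resource
local_ts=1700000000
remote_ts=1699990000

# wget -N downloads only when the remote copy is strictly newer
if [ "$local_ts" -ge "$remote_ts" ]; then
  action="skip"
else
  action="download"
fi
echo "$action"   # skip
```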


--adjust-extension

Save HTML/CSS documents with proper extensions. Useful for URLs that don't end in .html (dynamically generated pages, for instance), which would otherwise be saved without a recognizable extension.
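A local illustration of the rename this flag performs, no network needed: a page served at a path like /page.php gets an .html suffix appended so the browser recognizes it. (This case statement is a simplified sketch of the rule, not wget's actual logic.)

```shell
# filename as it appears in the URL path
f="page.php"

# files that already carry a recognized extension are left alone;
# anything else gets .html appended
case "$f" in
  *.html|*.css) saved="$f" ;;
  *)            saved="$f.html" ;;
esac
echo "$saved"   # page.php.html
```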


--limit-rate

Limit the download speed to the specified rate (here, 20 kB/s).
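The rate cap makes it easy to estimate how long a mirror will take. As a back-of-the-envelope check (the 10 MB site size is a made-up figure), at --limit-rate=20k, meaning 20 × 1024 bytes per second:

```shell
# hypothetical total size of the site, in bytes
site_bytes=$((10 * 1024 * 1024))   # 10 MB

# 20k in wget's rate syntax means 20 * 1024 bytes/sec
rate=$((20 * 1024))

secs=$((site_bytes / rate))
echo "$secs"   # 512 seconds, i.e. about 8.5 minutes
```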


--wait

Wait the specified number of seconds between requests. Be nice and lighten the load on the server.

So that's it.

A shorter equivalent would be:

wget -m -k -K -N -E -w 5 --limit-rate=20k https://example.com/

For more, consult the wget manual, available online at https://www.gnu.org/software/wget/manual/

Happy downloading!!

3 Responses

I agree wget is pretty equivalent to the knees of a bee, but what's the alternative in the CLI world, anyway? I'm not sure I've even heard of another tool that does this kind of thing.

over 1 year ago

Yeah.. You're right. Haven't even heard of anything that comes close. I was comparing it to graphical download managers which I think I relied on for far too long

over 1 year ago

SiteSucker, if you need a client to do the same.

over 1 year ago