Download entire website for offline viewing

$ wget --mirror -p --convert-links -P ./<LOCAL-DIR> <WEBSITE-URL>
--mirror : turn on options suitable for mirroring.
-p : download all files that are necessary to properly display a given HTML page.
--convert-links : after the download, convert the links in the documents so they work for local viewing.
-P ./<LOCAL-DIR> : save all files and directories to the specified directory.
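
For example, a runnable version with hypothetical placeholder values filled in (example.com and ./example-mirror are illustrative; substitute your own URL and directory):

$ wget --mirror -p --convert-links -P ./example-mirror https://example.com/
# recursively fetches the site plus page requisites (CSS, images, scripts),
# rewrites links for offline browsing, and stores everything under ./example-mirror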

By: tkembo
2011-08-18 08:27:28

What Others Think

`--mirror' => `-m' and `--convert-links' => `-k'.
h3xx · 572 weeks and 5 days ago
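
Using those short options, the same command can be written more compactly (same hypothetical URL and directory as above):

$ wget -mpk -P ./example-mirror https://example.com/
# -m = --mirror, -p = --page-requisites, -k = --convert-links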
use this: ;)
btwotch · 572 weeks and 4 days ago
