To understand why this is the equivalent of "find -L /path/to/search -type l", see http://ynform.org/w/Pub/FindBrokenSymbolicLinks or the GNU findutils manual at http://www.gnu.org/software/findutils/manual/html_mono/find.html
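As a quick illustration (the path is a placeholder): with -L, find dereferences symbolic links, so a link that still tests as type l must point to a missing target:

    # list only broken (dangling) symbolic links under the given path
    find -L /path/to/search -type l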
This will recursively visit all linked URLs, starting from the specified URL. It won't save anything locally, and it will produce a detailed log. Useful for finding broken links on your site. It ignores robots.txt, so only use it on a site you own!
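A minimal sketch of such a spider run, assuming GNU Wget; the URL and log file name here are placeholders:

    # recursively follow links without saving pages, ignore robots.txt,
    # and write the detailed log to spider.log
    wget --spider -r -e robots=off -o spider.log http://example.com/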
This command crawls a domain with the typical Wget output. It logs everything to a wget-log file, with any errors repeated at the end. It also has the added benefit of not flooding your terminal with output, so it is safe to run in the background.
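One way to get that behaviour, sketched here with a placeholder domain: Wget's -b flag sends the process to the background and, when no -o log file is given, writes its output to wget-log by default; at the end of a recursive --spider run it summarises any broken links it found:

    # crawl in the background; all output goes to wget-log, not the terminal
    wget -b --spider -r http://example.com/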
commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.
Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.
» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10
Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):
Subscribe to the feed for: