What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.

Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are separate Twitter accounts for commands that receive a minimum of 3 and 10 votes respectively - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try to put an end to the spamming, new commands now require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.

Get all links of a website

lynx -dump http://example.com/ | awk '/http/{print $2}' | sort -u
2011-10-13 09:49:36
User: mathias
Functions: awk sort

This will get all links from a given URL, remove any duplicates, and output the result.
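
To make the moving parts clearer, here is the same pipeline annotated; the URL is a placeholder:

# lynx -dump renders the page as plain text and appends a numbered
# "References" list of links, e.g. "   1. http://example.com/foo"
lynx -dump http://example.com/ |
  awk '/http/{print $2}' |   # from every line containing "http", print the second field
  sort -u                    # sort the URLs and drop duplicates

Note that the awk filter is loose: any line of body text containing "http" also matches, so its second field can slip through even when it is not a URL.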


There are alternatives - vote for the best!

Alternatives
lynx -dump http://www.domain.com | awk '/http/{print $2}'
dog --links "http://www.domain.com"
lynx -dump http://www.domain.com | awk '/http/{print $2}' | egrep "^https{0,1}"

Know a better way?

If you can do better, submit your command here.

What others think

Got an error with the sort -u bit, as follows:

qsort: string comparison failed: Illegal byte sequence
sort: Set LC_ALL='C' to work around the problem.
sort: The strings compared were `\273' and `[1486]http://www.catonmat.net/blog/christmas-tree-in-the-shell/'.

I removed the 'sort -u' and sent the output to a file. Then I looked at the file, and this is what I saw:

=== START ===
[output removed - see the next comment]
=== END ===

Any explanation?

Comment by fukr 184 weeks and 2 days ago
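
For what it's worth, \273 is octal for the byte 0xBB, a raw "»" in Latin-1, which is not a valid character in a UTF-8 locale, so sort's locale-aware comparison gives up. As the error message itself suggests, forcing the C locale makes sort compare raw bytes instead; the URL below is a placeholder:

lynx -dump http://example.com/ | awk '/http/{print $2}' | LC_ALL=C sort -u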

OK, so some of the output between the START and END tags was removed.

There are two lines with <BB> immediately prior to the line that starts with [1486].

Comment by fukr 184 weeks and 2 days ago

Works with elinks, too.

Comment by ctapyxa 180 weeks and 5 days ago
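
As the comment says, elinks can stand in for lynx here; its -dump mode produces similar plain-text output, so the rest of the pipeline should carry over unchanged (the URL is again a placeholder):

elinks -dump http://example.com/ | awk '/http/{print $2}' | sort -u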
