What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom, and you from theirs. All commands can be commented on, discussed and voted up or down.

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.

Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are separate Twitter accounts for commands that receive a minimum of 3 or 10 votes, so that only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the three Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).

Get all files of a particular type (say, PDF) listed on some webpage (say, example.com)

wget -r -A .pdf -l 5 -nH --no-parent http://example.com
2011-06-09 17:17:03
User: houghi
Functions: wget

See man wget if you want to follow links to files on other hosts, and not only those hosted on the website itself.
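For instance, -H enables spanning to other hosts during a recursive crawl, and -D restricts which hosts are followed. A hedged sketch (cdn.example.com is just a hypothetical second host where the PDFs might live):

wget -r -l 5 -A .pdf -nH --no-parent -H -D example.com,cdn.example.com http://example.com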


There is 1 alternative - vote for the best!

Terminal - Alternatives
curl -s http://example.com | grep -o -P "<a.*href.*>" | grep -o "http.*.pdf" | xargs -d"\n" -n1 wget -c
2011-06-09 14:42:46
User: b_t
Functions: grep wget xargs

This example command fetches the example.com webpage and then fetches and saves all PDF files listed (linked to) on that webpage.

[Note: of course there are no PDFs on example.com; this is just an example.]
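A slightly stricter variant of the same pipeline (still a hedged sketch: regexes don't truly parse HTML, relative links to PDFs would be missed, and it assumes a grep built with PCRE support for -P) uses \K to capture just the quoted href value:

curl -s http://example.com | grep -oP 'href="\K[^"]*\.pdf' | xargs -n1 wget -c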

What others think

To make it easier to understand:

wget --recursive --accept .pdf --level=5 --no-host-directories --no-parent http://example.com

Is it necessary to have 5 levels of recursion?

Comment by Mozai 352 weeks and 6 days ago
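Regarding the depth question above: wget's default maximum recursion depth is already 5, so -l 5 changes nothing. If the PDFs are linked directly from the starting page, one level is enough:

wget -r -l 1 -A .pdf -nH --no-parent http://example.com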

Get all files of a particular type (say, mp3) listed on some web page (say, audio.org):

wget -c -r --no-parent -A .mp3 http://audio.org/mp3s/
Comment by bhishma 63 weeks ago
