What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

Get involved!

You can sign-in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).

May 19, 2015 - A Look At The New Commandlinefu
I've put together a short writeup on what kind of newness you can expect from the next iteration of clfu. Check it out here.
March 2, 2015 - New Management
I'm Jon, I'll be maintaining and improving clfu. Thanks to David for building such a great resource!

Terminal - All commands - 12,449 results
URL=www.example.com && wget -rq --spider --force-html "http://$URL" && find $URL -type d > url-list.txt && rm -rf $URL
2009-01-27 17:59:08
User: root
Functions: find rm wget

This spiders the given site without downloading the HTML content. The resulting directory structure is then parsed to output a list of the URLs to url-list.txt. Note that this can take a long time to run and you may want to throttle the spidering so as to play nicely.
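
If you do want to throttle it, GNU wget's --wait and --random-wait options add a delay between requests (a sketch; adjust the delay to taste):

URL=www.example.com && wget -rq --spider --force-html --wait=1 --random-wait "http://$URL"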

wget -r -l1 --no-parent -nH -nd -P/tmp -A".gif,.jpg" http://example.com/images
2009-01-27 17:31:22
User: root
Functions: wget

This recursively downloads all images from a given website to your /tmp directory. The -nH and -nd switches stop wget recreating the site's directory structure locally.

watch -n 30 uptime
2009-01-27 14:49:21
User: root
Functions: watch

This runs the uptime command every 30 seconds to avoid an SSH connection dropping due to inactivity. Granted, there are better ways of solving this problem, but this is sometimes the right tool for the job.
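
One of those better ways, assuming an OpenSSH client, is to have ssh send its own keepalives (user@host is just a placeholder):

ssh -o ServerAliveInterval=60 user@host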

svn add --force *
2009-01-27 10:53:27
User: root

The --force option bypasses the warning if files are already in SVN.

ssh user@host "ps aux | grep httpd | wc -l"
2009-01-27 00:46:17
User: root
Functions: ssh

This counts the number of httpd processes running on the remote host.
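
Note that the grep may match itself; if pgrep is available on the remote host, something like this avoids that (user@host is a placeholder):

ssh user@host "pgrep -c httpd"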

find /path/to/dir -type f -exec grep \-H "search term" {} \;
2009-01-26 16:32:14
User: root
Functions: find grep

Simple use of find and grep to recursively search a directory for files that contain a certain term.
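
If your grep supports recursion (GNU grep does), the find can be dropped entirely:

grep -rH "search term" /path/to/dir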

zip -r myfile.zip * -x \*.svn\*
rsync -av -e ssh user@host:/path/to/file.txt .
2009-01-26 13:39:24
User: root
Functions: rsync

You will be prompted for a password unless you have your public keys set up.
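
A rough sketch of setting those keys up, assuming ssh-keygen and ssh-copy-id are available (user@host is a placeholder):

ssh-keygen -t rsa
ssh-copy-id user@host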

^foo^bar
2009-01-26 13:25:37
User: root

Really useful for when you have a typo in a previous command. Also, arguments default to empty so if you accidentally run:

echo "no typozs"

you can correct it with

^typozs^typos
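
If the typo appears more than once, bash's gs history modifier replaces every occurrence rather than just the first:

!!:gs/typozs/typos/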

cp file.txt{,.bak}
2009-01-26 12:11:29
User: root
Functions: cp

Uses brace expansion to create a back-up called file.txt.bak.
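
The braces expand before cp runs, so the command above is equivalent to cp file.txt file.txt.bak; the reverse form restores the backup:

cp file.txt{.bak,}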

grep -o "\(new \(\w\+\)\|\w\+::\)" file.php | sed 's/new \|:://' | sort | uniq -c | sort
2009-01-26 12:08:47
User: root
Functions: grep sed sort uniq

This grabs all lines that make an instantiation or static call, then filters out the cruft and displays a summary of each class called and the frequency.
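
If you would rather see the most-used classes first, the final sort can be made numeric and reversed:

grep -o "\(new \(\w\+\)\|\w\+::\)" file.php | sed 's/new \|:://' | sort | uniq -c | sort -rn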

sed '1000000!d;q' < massive-log-file.log
2009-01-26 11:50:00
User: root
Functions: sed

Sed stops parsing at the match and so is much more efficient than piping head into tail or similar. Grab a line range using:

sed '999995,1000005!d' < my_massive_file
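
The range version above reads the whole file; to make it stop early as well, something along these lines should work:

sed -n '999995,1000005p;1000005q' < my_massive_file
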
find /path/to/dir -type f -print0 | xargs -0 rm
2009-01-26 11:30:47
User: root
Functions: find xargs

Using xargs is better than:

find /path/to/dir -type f -exec rm \-f {} \;

as the -exec switch uses a separate process for each remove. xargs splits the streamed files into more manageable subsets so fewer processes are required.
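
With a reasonably modern find, the + terminator batches the arguments in much the same way, without needing xargs:

find /path/to/dir -type f -exec rm -f {} +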

find . -name "*.php" -exec grep \-H "new filter_" {} \;
2009-01-26 10:43:09
User: root
Functions: find grep

This greps all PHP files for a given classname and displays both the file and the usage.

sudo !!
2009-01-26 10:26:48
User: root

Useful when you forget to use sudo for a command. "!!" grabs the last run command.
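
For example (the package manager here is only an illustration):

apt-get install htop
sudo !!

The second line expands to sudo apt-get install htop.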

alias cr='find . 2>/dev/null -regex '\''.*\.\(c\|cpp\|pc\|h\|hpp\|cc\)$'\'' | xargs grep --color=always -ni -C2'
2009-01-26 08:54:25
User: chrisdrew
Functions: alias grep xargs

Creates a command alias ('cr' in the above example) that searches the contents of files matching a set of file extensions (C & C++ source code in the above example) recursively within the current directory. The search is configured to use colour, ignore case, show line numbers and show two lines of context either side of each match. Put it in the shell initialisation file of your choice. Trivially easy to use, e.g.:

cr sha1_init
du | sort -gr > file_sizes
2009-01-26 01:12:54
User: chrisdrew
Functions: du sort

Recursively searches the current directory and outputs a sorted list of each directory's disk usage to a text file.
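
If your sort has the -h flag (GNU coreutils does), a human-readable variant is possible:

du -h | sort -hr > file_sizes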

watch "df | grep /path/to/drive"
2009-01-25 21:16:41
User: root

This can be useful when a large remove operation is taking place.

echo "ls -l" | at midnight
2009-01-25 21:07:42
User: root
Functions: at echo

This is an alternative to cron which allows a one-off task to be scheduled for a certain time.
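
at accepts a number of other time specifications; for instance, relative times work too:

echo "ls -l" | at now + 1 hour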

tail -10000 access_log | awk '{print $1}' | sort | uniq -c | sort -n | tail
2009-01-25 21:01:52
User: root
Functions: awk sort tail uniq

This uses awk to grab the IP address from each of the last 10,000 requests, then sorts and summarises the top 10.
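
To scan the whole log rather than just the last 10,000 requests, drop the first tail:

awk '{print $1}' access_log | sort | uniq -c | sort -n | tail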

find . \( -name "*.php" -o -name "*.js" \) -exec svn propset svn:keywords Id {} \;
find . -type f | wc -l
ps aux | sort -nk +4 | tail
2009-01-23 17:12:33
User: root
Functions: ps sort

ps returns all running processes, which are then sorted numerically by the 4th field (memory usage), and the 10 largest are sent to STDOUT.
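
If you are on a procps-style ps (most Linux systems), it can also sort for you directly:

ps aux --sort=-%mem | head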

myisamchk /path/to/mysql/files/*.MYI
2009-01-22 10:20:00
User: root

See http://dev.mysql.com/doc/refman/5.0/en/myisamchk.html for further details. You can also repair all tables by running:

myisamchk -r *.MYI