What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the three Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try and put an end to the spamming, new commands now require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.




Terminal - All commands - 11,590 results
find . -type f ! -perm /g=r -exec chmod g+r {} +
2009-06-17 13:39:59
User: sanmiguel
Functions: chmod find
Tags: find chmod

Makes any files in the current directory (and any sub-directories) group-readable.

Using the "! -perm /g=r" limits the number of files to only those that do not already have this property

Using "+" on the end of the -exec body tells find to build the entire command by appending all matching files before execution, so invokes chmod once only, not once per file.

alias launchpadkey="sudo apt-key adv --keyserver keyserver.ubuntu.com --recv-keys"
2009-06-17 12:02:27
User: azeey
Functions: alias
Tags: alias apt-key

Makes it easy to add the keys for new PPA entries in apt's sources.list.

Now to add the key for the chromium-daily ppa:

launchpadkey 4E5E17B5
find . -type f -name '*.ext' -exec cat {} > file.txt \;
2009-06-17 11:33:14
User: realgt
Functions: cat find

Useful if you have to combine multiple files into one and they are scattered across subdirectories. For example: you need to combine all .sql files into one .sql file to send to the DBAs as a batch script.

Quoting the pattern stops the shell from expanding it prematurely; without the quotes you get a warning if the output file has the same extension as the ones you're searching for.

find . -type f -name '*.sql' -exec cat {} > BatchFile.txt \;

<<<"k=1024; m=k*k; g=k*m; g" bc
2009-06-17 10:35:10
User: mpb

There are two ways to use "here documents" with bash to fill stdin:

The following examples use the "bc" command.

a) Using a delimiter at the end of the data:

<< eeooff bc

> k=1024

> m=k*k

> g=k*m

> g

> eeooff


b) Using the "inline" here-string version with three less-than symbols:

<<< "k=1024; m=k*k; g=k*m; g" bc


One nice advantage of the triple less-than version is that the command can easily be recalled from command-line history and re-executed.

for x in `seq -w 1 30`; do sar -b -f /var/log/sa/sa$x | gawk '/Average/ {print $2}'; done
grep -h -o '<[^/!?][^ >]*' * | sort -u | cut -c2-
2009-06-17 00:22:18
User: thebodzio
Functions: cut grep sort
Tags: sort grep cut

This set of commands was very convenient when I was preparing some XML files for typesetting a book. I wanted to check what styles I had to prepare but couldn't remember all the tags I had used. This one saved me from error-prone browsing of all my files. It should also be useful if one tries to process XML files with XSL, or when using one's own XML application.
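For a quick feel of what it extracts, run it on a throwaway file (the sample below is invented):

```shell
# A tiny sample document
printf '<html><body><p>hi</p><p>bye</p></body></html>' > sample.html

# List the unique tag names used (closing tags, comments and declarations are skipped)
grep -h -o '<[^/!?][^ >]*' sample.html | sort -u | cut -c2-
# body
# html
# p
```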

paste -d ',:' file1 file2 file3
2009-06-17 00:11:04
User: thebodzio
Functions: paste
Tags: paste

In the above example all files have 4 lines. In "file1" the consecutive lines are "num, 1, 2, 3", in "file2" "name, Jack, Jim, Frank", and in "file3" "scores, 1300, 1100, 980". This one-liner can save a considerable amount of time when you're trying to process serious quantities of data. The "-d" option sets the series of characters to be used as separators between the data originating from the given files.
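Recreating the three files described above shows how "-d" cycles through its separator list (comma between the first pair of columns, colon between the second):

```shell
printf 'num\n1\n2\n3\n'            > file1
printf 'name\nJack\nJim\nFrank\n'  > file2
printf 'scores\n1300\n1100\n980\n' > file3

paste -d ',:' file1 file2 file3
# num,name:scores
# 1,Jack:1300
# 2,Jim:1100
# 3,Frank:980
```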

grep -Eio '([[:alnum:]_.]+@[[:alnum:]_]+?\.[[:alpha:].]{2,6})' file.html
2009-06-16 20:19:47
User: wires
Functions: grep

Find all email addresses in a file, printing each match. The addresses do not have to be alone on a line, so for example you can grab them from HTML-formatted emails or CSV files. Use a combination of other filters (sort, uniq, grep -v, …) to refine them.
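A quick demonstration on an invented snippet of HTML:

```shell
printf 'Mail <a href="mailto:bob@example.com">Bob</a> or alice@example.org\n' > page.html

grep -Eio '([[:alnum:]_.]+@[[:alnum:]_]+?\.[[:alpha:].]{2,6})' page.html
# bob@example.com
# alice@example.org
```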

ruby -i.bkp -pe "gsub(/search/, 'replace')" *.php
2009-06-16 12:35:40
User: gustavgans

Searches for the string "search" and replaces it with the string "replace" in all files with the extension .php in the current folder. It also makes a backup of each file, with the extension ".bkp".
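If ruby isn't handy, sed can do the same in-place edit with backups; a sketch on an invented demo file (the -i.bkp form works with GNU sed and BSD/macOS sed):

```shell
printf 'foo search bar\n' > demo.php

# Edit in place, keeping a copy with the .bkp extension
sed -i.bkp 's/search/replace/g' demo.php

cat demo.php      # foo replace bar
cat demo.php.bkp  # foo search bar
```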

perl -ne '$sum += $_ for grep { /\d+/ } split /[^\d\-\.]+/; print "$sum\n"'
2009-06-16 06:39:08
User: obscurite
Functions: grep perl split

Good for summing numbers embedded in text - for example, a food-journal entry with calories listed per food, where you want the total calories. Use this to keep a running total on anything that outputs numbers.

opendiff <file1> <file2>
2009-06-16 03:22:52
User: claytron

This command will open up the two files in FileMerge on OS X. You can also compare two directories.

opendiff directory1 directory2

NOTE: FileMerge is a part of the OS X Developer Tools, available on the install disc.

sort -k1.x
2009-06-16 00:04:21
User: leper421
Functions: sort

Tells sort to ignore all characters before the Xth position in the first field per line. If you have a list of items one per line and want to ignore the first two characters for sorting purposes, you would type "sort -k1.3". Change the "1" to change the field being sorted. The decimal value is the offset in the specified field to sort by.
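For instance, to sort a list while ignoring a two-character prefix on each line:

```shell
printf 'xxbanana\nzzapple\naacherry\n' | sort -k1.3
# zzapple
# xxbanana
# aacherry
```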

(uuencode foo.txt foo.txt; uuencode /etc/passwd passwd.txt)|mailx -s "Pandaren!" someone@cmdfu.com
2009-06-15 11:34:51
User: LrdShaper
Functions: mailx uuencode

If your users have ever asked your script to email their reports as separate attachments instead of tarring them into one file, you can use this. You'll need the mailx package, of course. On some Unix systems you'll want to add the "-m" parameter:

(uuencode foo.txt foo.txt; uuencode /etc/passwd passwd.txt)|mailx -m -s "Hooosa!" someone@cmdfu.com

killall -2 mpg321
2009-06-15 03:04:00
User: dattaway
Functions: killall

Sends SIGINT (signal 2) to mpg321, which makes it skip to the next selection in its playlist.

2009-06-14 21:17:40
User: Josay
Tags: completion ESC

Pressing ESC then * inserts the results of the autocompletion into the command line.

It's hard to explain, but if you look at the sample output or do

echo ESC *

you will understand quickly.

By the way, a few reminders about ESC:

- Holding ESC does the same thing as tab tab

- 'ESC .' inserts the last argument of the last command (and can be pressed repeatedly to step back through the last arguments of earlier commands)

(Command too long..See sample Output..)
2009-06-14 20:34:37
User: mohan43u
Tags: bash sed echo tr od

curl doesn't provide URL-encoding for 'GET' data; it has an option '--data-urlencode', but that only applies to 'POST' data. That's why I needed to write this command line. With 'perl', 'php' or 'python' this is a one-liner, but I wrote it just for fun. Works in Ubuntu, and should work in all Linux variants (I hope in Unix variants too).
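For comparison, here is the sort of one-liner the author alludes to, sketched with Python's standard library (assuming python3 is installed; the sample string is invented):

```shell
# urllib.parse.quote percent-encodes everything except letters, digits, '_.-~' and '/'
python3 -c 'import sys, urllib.parse; print(urllib.parse.quote(sys.argv[1]))' 'name=John Smith&x=1'
# name%3DJohn%20Smith%26x%3D1
```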

sed -e "s/\[{/\n/g" -e "s/}, {/\n/g" sessionstore.js | grep url | awk -F"," '{ print $1 }'| sed -e "s/url:\"\([^\"]*\)\"/\1/g" -e "/^about:blank/d" > session_urls.txt
2009-06-14 15:08:31
User: birnam
Functions: awk grep sed

This will extract all of the URLs from a Firefox session (including URLs in each tab's history). The sessionstore.js file lives in ~/.mozilla/firefox/{firefox profile}

curl -u <user>:<password> -d status="Amarok, now playing: $(dcop amarok default nowPlaying)" http://twitter.com/statuses/update.json
2009-06-14 02:42:34
User: caiosba

Share your "now playing" Amarok song in twitter!

svn diff <file> | vim -R -
2009-06-13 22:00:49
User: caiosba
Functions: diff vim
Tags: svn vim diff color

Simple way to achieve a colored SVN diff

mysql -uadmin -p`cat /etc/psa/.psa.shadow` -Dpsa -e "select mail_name,name,password from mail left join domains on mail.dom_id = domains.id inner join accounts where mail.account_id = accounts.id;"
find `echo "${PATH}" | tr ':' ' '` -type f | while read COMMAND; do man -f "${COMMAND##*/}"; done
2009-06-13 19:56:24
User: mohan43u
Functions: find man read tr
Tags: man find read while tr

Obviously, you can replace 'man' with any command to do something useful with every executable on the system. The point is that there is a way to list all the commands you can run directly without giving the full path.

Normally all important commands live in the directories listed in your PATH; this command line uses that variable to collect them. Works in Ubuntu, and should work on any *nix system with man pages configured.
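The `${COMMAND##*/}` expansion is what strips the directory part; a tiny illustration:

```shell
# ##*/ deletes the longest prefix ending in '/', leaving the basename
c=/usr/bin/ls
echo "${c##*/}"   # ls
```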

man ls | col -b > ~/Desktop/man_ls.txt
2009-06-13 11:49:33
User: vigo
Functions: col ls man

You can convert any UNIX man page to plain text; "col -b" strips the backspace-overstrike sequences that man uses for bold and underlining.

unrar e file.part1.rar; if [ $? -eq 0 ]; then rm file.part*.rar; fi
2009-06-13 11:11:43
User: mrttlemonde
Functions: rm

It's also possible to delay the extraction (echo "unrar e ... fi" | at now+20 minutes), which is really convenient!
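The `if [ $? -eq 0 ]` test can also be shortened with `&&`, which runs its right-hand side only when the left side succeeds; a sketch using a stand-in function, since unrar itself may not be installed:

```shell
touch file.part1.rar file.part2.rar

# demo_extract stands in for "unrar e file.part1.rar" (hypothetical)
demo_extract() { true; }

demo_extract && rm file.part*.rar   # rm runs only when extraction succeeds
ls file.part*.rar 2>/dev/null       # prints nothing: archives are gone
```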

a=`printf "%*s" 16`;b=${a//?/{0..1\}}; echo `eval "echo $b"`
chronic () { t=$1; shift; while true; do "$@"; sleep "$t"; done & }
2009-06-13 05:57:54
User: rhythmx
Functions: sleep

Chronic Bash function:

chronic 3600 time                # print the time in your shell every hour

chronic 60 updatedb > /dev/null  # update slocate every minute

Note: use 'jobs' to list background tasks and fg/bg to take control of them.