What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try and put an end to the spamming, new commands require moderation before they will appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.




Terminal - All commands - 11,926 results
aureport -x
2009-05-06 11:42:12

aureport is a tool for producing summary reports from the auditd system log. The -x option makes it report the executables that have been launched on the system.

aureport works with auditd, so auditd must be installed and running on the system.

Tested on CentOS / Debian

convert {$file_in} \( +clone -background black -shadow 60x5+10+10 \) +swap -background none -layers merge +repage {$file_out}
2009-05-06 10:19:39
User: kureikain
Functions: merge

Please note that if you are going to output a JPG file with this shadow effect, change -background none to -background white!

-background none produces a transparent background, but JPG doesn't support transparency, so when viewing the result you will get a black box! Putting a solid white background underneath avoids this. You can use any other color as well!
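For example, a JPG-safe variant of the command above might look like this (input.jpg and output.jpg are placeholder filenames):

```shell
# Same drop-shadow pipeline, but with a solid white background
# in place of transparency, which JPG cannot store.
# input.jpg / output.jpg are placeholders for your own files.
convert input.jpg \( +clone -background black -shadow 60x5+10+10 \) \
    +swap -background white -layers merge +repage output.jpg
```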

convert -rotate $rotate -scale $Widthx$Height -modulate $brightness -contrast $contrast -colorize $red%,$green%,$blue% $filter file_in.png file_out.png
2009-05-06 10:14:22
User: kureikain
$rotate: the angle to rotate by; $Width, $Height: the dimensions to scale to; $brightness: the brightness adjustment; $contrast: the contrast adjustment; $red%,$green%,$blue%: per-channel colorize percentages
2009-05-06 08:01:06
User: P17
Tags: bash

The colors are defined as variables.




ip route show dev ppp0 | awk '{ print $7 }'
screen -dmS "name_me" echo "hi"
2009-05-06 02:04:15
Functions: echo screen

Runs a detached screen session named "name_me" that executes the command echo "hi".

To reconnect to screen instance later use:

screen -r name_me

vim -x <FILENAME>
2009-05-05 23:24:17
User: denzuko
Functions: vim

While I love gpg and truecrypt, there are times when you just want to edit a file without worrying about keys or needing extra software on hand. For that, you can use vim's encrypted file format.

For more info on vim's encrypted files visit: http://www.vim.org/htmldoc/editing.html#encryption

awk -F\" '{print $4}' *.log | grep -v "eviljaymz\|\-" | sort | uniq -c | awk -F\ '{ if($1>500) print $1,$2;}' | sort -n
2009-05-05 22:21:04
User: jaymzcd
Functions: awk grep sort uniq

This prints a summary of the referers from your logs, as long as they occurred a certain number of times (in this case 500). The grep command excludes certain terms; I add this in to remove results I'm not interested in.

Q="reddit|digg"; F=*.log; awk -F\" '{print $4}' $F | egrep $Q | wc -l
2009-05-05 21:51:16
User: jaymzcd
Functions: awk egrep wc

I use this (well, I normally drop the F=*.log bit and put that straight into the awk command) to count how many times I get referred from another site. I know it's rough; it's just to give me an idea where any posts I make are ending up. The reason I do the Q="query" bit is that I often want to check another domain quickly, and it's quick to use CTRL+A to jump to the start and then CTRL+F to move forward the 3 steps to change the grep query. (I find this easier than moving backwards, because if you group a lot of domains with the pipe your command line can get quite messy, so it's normally easier to have it all at the front where you just have to edit it and hit enter.)

For people new to the shell, it does the following. The Q= and F= bits just define names we can refer to. awk -F\" '{print $4}' $F reads the file specified by $F, splits each line on double quotes, and prints out the fourth column for egrep to work on; the 4th column in the log is the referer domain. egrep then matches our query against this list from awk. Finally, wc -l gives us the total number of lines (i.e. matches).
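A minimal sketch of that field splitting on a single combined-log-format line (the log line and file path below are made up for the demo):

```shell
# A fabricated Apache combined-log line. Splitting on double quotes
# puts the request in field 2 and the referer in field 4.
printf '%s\n' '1.2.3.4 - - [05/May/2009] "GET / HTTP/1.1" 200 512 "http://reddit.com/r/foo" "Mozilla"' > /tmp/demo.log
awk -F\" '{print $4}' /tmp/demo.log   # prints http://reddit.com/r/foo
```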

for %f in (c) do dir %f:\*.jpg /s /p
2009-05-05 18:28:18
User: copremesis
Functions: dir

There is no explicit find command in DOS, but you can create a batch file with this one and find all jpegs on the C drive ...

note: if creating a batch file "find.bat" the syntax changes to:

for %%f in (c) do dir %%f:\%1 /s /p

you can then use

find *.jpg
ps -eo user,pcpu,pmem | tail -n +2 | awk '{num[$1]++; cpu[$1] += $2; mem[$1] += $3} END{printf("NPROC\tUSER\tCPU\tMEM\n"); for (user in cpu) printf("%d\t%s\t%.2f%%\t%.2f%%\n",num[user], user, cpu[user], mem[user]) }'
alias xdef_load='xrdb -merge ~/.Xdefaults'
2009-05-05 16:34:06
User: P17
Functions: alias

Reads in ~/.Xdefaults, lexicographically sorted, and merges it with, instead of replacing, the current contents of the specified properties.

alias b='cd -'
shopt -s globstar
2009-05-05 16:02:44
User: Alanceil

Since bash 4.0, you can use ** to recursively expand to all files in the current directory. This behaviour is disabled by default; this command enables it (you'd best put it in your .profile). See the sample output for clarification.

In my opinion this is much better than creating hacks with find and xargs when you want to pass files to an application.
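A quick sketch of what ** buys you (the directory layout below is invented for the demo):

```shell
#!/bin/bash
# Requires bash >= 4.0 for the globstar option.
shopt -s globstar
# Build a small nested tree to glob over.
mkdir -p /tmp/gsdemo/sub/deep
touch /tmp/gsdemo/a.txt /tmp/gsdemo/sub/deep/c.txt
# ** matches zero or more directory levels, so this lists
# both the top-level a.txt and the deeply nested c.txt:
printf '%s\n' /tmp/gsdemo/**/*.txt
```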

VBoxManage startvm "name"
sed -i '/Centos/d' VirtualBox.xml
2009-05-05 13:03:55
Functions: sed

Simple but useful command. I use this to purge a hard disk entry from the VirtualBox registry file (it is in ~user/.VirtualBox) that persists after I erase a virtual machine, so I would otherwise need to delete it manually.
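A minimal sketch of what the /Centos/d address does (file contents are invented; note that on BSD/macOS sed, -i needs a suffix argument, e.g. -i ''):

```shell
# Create a throwaway file with one matching line, then delete it in place.
printf 'keep one\nCentos disk entry\nkeep two\n' > /tmp/vbox_demo.xml
sed -i '/Centos/d' /tmp/vbox_demo.xml
cat /tmp/vbox_demo.xml   # the Centos line is gone
```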

alias somafm='read -p "Which station? "; mplayer --reallyquiet -vo none -ao sdl http://somafm.com/startstream=${REPLY}.pls'
2009-05-05 12:13:46
User: denzuko
Functions: alias

This is the alias command that I discussed in my prior release which you can add to your ~/.bashrc.

This command asks for the station name and then connects to somafm. Great for those who have Linux home entertainment boxes with ssh enabled on them, and for the CLI fiends out there (I know I'm one of them ;)

You can find future releases of this and many more scripts at the teachings of master denzuko - denzuko.co.cc.

lynx -dump randomfunfacts.com | grep -A 3 U | sed 1D
2009-05-05 07:52:10
User: xizdaqrian
Functions: grep sed

This is a working version, though probably clumsy, of the script submitted by felix001. This works on Ubuntu and Cygwin. This would be great as a bash function defined in .bashrc, and it would also work as a script put in the path.

wget -q -O- http://www.gutenberg.org/dirs/etext96/cprfd10.txt | sed '1,419d' | tr "\n" " " | tr " " "\n" | perl -lpe 's/\W//g;$_=lc($_)' | grep "^[a-z]" | awk 'length > 1' | sort | uniq -c | awk '{print $2"\t"$1}'
2009-05-04 16:00:39
User: alperyilmaz
Functions: awk grep perl sed sort tr uniq wget

This command might not be useful for most of us; I just wanted to share it to show the power of the command line.

It downloads the plain-text version of the novel David Copperfield from Project Gutenberg and then generates a single column of words, after which the occurrences of each word are counted by the sort | uniq -c combination.

The command also removes numbers and single characters from the count. I'm sure you can write a shorter version.

for file in <directory A>/*; do rm <directory B>/`basename $file`; done
2009-05-04 12:44:50
User: jamiebullock
Functions: file rm
Tags: delete rm

This command is useful if you accidentally untar or unzip an archive in a directory and you want to automatically remove the files. Just untar the files again in a subdirectory and then run the above command, e.g.

for file in ~/Desktop/temp/*; do rm ~/Desktop/`basename $file`; done
curl -s 'http://download.finance.yahoo.com/d/quotes.csv?s=csco&f=l1'
2009-05-04 08:13:59
User: haivu
Tags: curl finance

Retrieve the current stock price from Yahoo Finance. The output is simply the latest price (which could be delayed). If you want to look up stock for a different company, replace csco with your symbol.

nl filename | more
2009-05-04 07:35:16
User: haivu
Functions: nl

The nl command lists the contents of a file with each line prefixed by a line number. For more information about this command, check out its man page. I tested under Mac OS X and Xubuntu 9.04.

lshw -C disk -html > /tmp/diskinfo.html
vmstat 1 10 | /usr/xpg4/bin/awk -f ph-vmstat.awk
2009-05-04 04:55:00
User: MarcoN
Functions: vmstat

% cat ph-vmstat.awk

# Return human readable numbers
function hrnum(a) {
    b = a ;
    if (a > 1000000) { b = sprintf("%2.2fM", a/1000000) ; }
    else if (a > 1000) { b = sprintf("%2.2fK", a/1000) ; }
    return(b) ;
}

# Return human readable storage
function hrstorage(a) {
    b = a ;
    if (a > 1024000) { b = sprintf("%2.2fG", a/1024/1024) ; }
    else if (a > 1024) { b = sprintf("%2.2fM", a/1024) ; }
    return(b) ;
}

BEGIN { OFS = " " ; }

# Pass header lines through untouched.
$1 !~ /[0-9].*/ {print}

# Humanize the memory and paging columns of data lines.
$1 ~ /[0-9].*/ {
    $4 = hrstorage($4) ;
    $5 = hrstorage($5) ;
    $9 = hrnum($9) ;
    $10 = hrnum($10) ;
    $17 = hrnum($17) ;
    $18 = hrnum($18) ;
    $19 = hrnum($19) ;
    print ;
}


export LSCOLORS=gxfxcxdxbxegedabagacad
2009-05-04 04:07:36
User: haivu
Functions: export
Tags: bash ls osx

I use Terminal with a black background on the Mac. Unfortunately, the default ls color for directories is blue, which is very hard to see. By including the line above in my ~/.bash_profile file, I changed the directory color to cyan, which is easier to see. For more information on the syntax of the LSCOLORS shell variable:

man ls

I tested this command on Mac OS X Leopard
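As a sketch of how the value is structured (the breakdown below follows the BSD ls man page: 11 foreground/background pairs, where 'g' is cyan and 'x' is the default color):

```shell
# LSCOLORS is 11 two-character pairs, foreground then background, in the
# order: dir, symlink, socket, pipe, executable, block special,
# char special, setuid exe, setgid exe, sticky other-writable dir,
# other-writable dir.
# 'gx' below = cyan directory on the default background.
export LSCOLORS=gxfxcxdxbxegedabagacad
# Sanity check: 11 pairs = 22 characters.
echo ${#LSCOLORS}   # prints 22
```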