What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try to put an end to the spamming, new commands now require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.




Terminal - All commands - 11,587 results
for i in `svn status | egrep '^(M|A)' | sed -r 's/\+\s+//' | awk '{ print $2 }'` ; do if [ ! -d $i ] ; then php -l $i ; fi ; done
2009-05-29 23:59:28
Functions: awk egrep sed
Tags: svn Linux PHP

This is really only valuable in a PHP-only project directory. It uses the standard Linux (GNU) versions of the tools. On most older BSD variants of sed, use -E instead of -r, or use sed 's/\+[[:space:]]\{1,\}//' instead.
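As a quick sketch, the filename-extraction part of the pipeline can be checked against fake `svn status` output (the filenames below are made up for illustration; the BSD-friendly -E form of sed is used):

```shell
# Feed sample `svn status` lines through the same extraction pipeline:
# keep modified/added entries, strip the history-scheduled "+" column,
# and print the path field.
printf 'M       lib/foo.php\nA  +    src/bar.php\n?       notes.txt\n' |
  egrep '^(M|A)' |
  sed -E 's/\+[[:space:]]+//' |
  awk '{ print $2 }'
```

This prints lib/foo.php and src/bar.php, and drops the unversioned notes.txt line.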

dd if=/dev/sda5 bs=2048 conv=noerror,sync | gzip -fc | lftp -u user,passwd domain.tld -e "put /dev/stdin -o backup-$(date +%Y%m%d%H%M).gz; quit"
ioreg -lw0 | grep IODisplayEDID | sed "/[^<]*</s///" | xxd -p -r | strings -6
gmetric -n $METRIC_NAME -v foo -t string -d 10
2009-05-29 14:21:24
User: root
Tags: ganglia

The -d flag sets the lifetime of a metric and defaults to 0, which is why old metrics continue to be graphed in the dashboard. Submitting a dummy value with a short lifetime ensures that the metric is removed from the dashboard.

more /var/adm/messages
2009-05-29 12:10:18
User: miccaman
Functions: more
Tags: solaris

Read the system logs on Sun Solaris 9.

init 6
2009-05-29 07:44:05
User: miccaman
Functions: init
Tags: solaris Reboot

init run levels on Solaris are numbered:

init 0 - shut down to the OpenBoot (ok) prompt

init 5 - shut down and power off

init 6 - reboot

awk 'BEGIN{srand()}{print rand(),$0}' SOMEFILE | sort -n | cut -d ' ' -f2-
2009-05-29 01:20:50
User: axelabs
Functions: awk cut sort
Tags: sort awk random

This appends a random number as the first field of every line in SOMEFILE, sorts by the first column, and finally cuts off the random numbers.
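On systems with GNU coreutils (an assumption - it is not available everywhere), `shuf` performs the same line shuffle in one step:

```shell
# shuf does the same job as the awk/sort/cut pipeline above:
# the same lines come back in a random order.
printf 'one\ntwo\nthree\nfour\n' > lines.tmp
shuf lines.tmp
rm lines.tmp
```

The awk version remains handy on platforms without shuf, since awk and sort are essentially universal.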

awk 'BEGIN{size=5} {mod=NR%size; if(NR<=size){count++}else{sum-=array[mod]};sum+=$1;array[mod]=$1;print sum/count}' file.dat
2009-05-29 00:07:24
User: mungewell
Functions: awk

Sometimes jittery data hides trends; performing a rolling average can give a clearer view.
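Running the command over a simple ramp (generated with seq instead of file.dat) makes the window behaviour visible:

```shell
# 5-point rolling average of the values 1..10. The first five outputs
# average only the values seen so far (1, 1.5, 2, 2.5, 3); after that
# the window slides, giving 4, 5, 6, 7, 8.
seq 10 | awk 'BEGIN{size=5} {mod=NR%size; if(NR<=size){count++}else{sum-=array[mod]};sum+=$1;array[mod]=$1;print sum/count}'
```

The array indexed by NR%size holds the last five values, so each step subtracts the value falling out of the window before adding the new one.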

cat typescript | perl -pe 's/\e([^\[\]]|\[.*?[a-zA-Z]|\].*?\a)//g' | col -b > typescript-processed
zip -vr example.zip example/ -x "*.DS_Store"
2009-05-28 20:28:12
User: JadedEvan

Use this if you want to generate a cross-platform compatible zip file and exclude the Finder's hidden .DS_Store metadata files.

apropos keyword
egrep -o '[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}' file.txt
find <directory path> -mtime +365 -and -not -type d -delete
git grep -l "your grep string" | xargs gedit
find ./ -type f -exec sed -i 's/\t/ /g' {} \;
grep -Pl "\t" -r . | grep -v ".svn" | xargs sed -i 's/\t/ /g'
2009-05-28 08:52:14
User: root
Functions: grep sed xargs

Note that this assumes the application is an SVN checkout, so the .svn entries have to be filtered out before making the substitution. grep -l lists only the files that actually contain a tab.
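A minimal sketch of the substitution step on a throwaway file (scratch.txt is a made-up name; note that BSD sed requires a backup-suffix argument after -i, e.g. -i ''):

```shell
# Create a scratch file containing tabs, replace each tab with a single
# space in place, then show the result.
printf 'a\tb\tc\n' > scratch.txt
sed -i 's/\t/ /g' scratch.txt
cat scratch.txt   # a b c
rm scratch.txt
```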

echo string | tr '[:lower:]' '[:upper:]'
sort -n <( for i in $(find . -maxdepth 1 -mindepth 1 -type d); do echo $(find $i | wc -l) ": $i"; done;)
sudo dd if=/dev/zero of=/swapfile bs=1024 count=1024000;sudo mkswap /swapfile; sudo swapon /swapfile
2009-05-27 21:10:50
User: dcabanis
Functions: dd mkswap sudo swapon

Create a temporary file that acts as swap space. In this example it's a 1GB file at the root of the file system. This additional capacity is added to the existing swap space.
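The arithmetic behind the size: 1024-byte blocks x 1024000 blocks = 1,048,576,000 bytes, roughly 1 GB. Here is the same dd pattern at 1/1000 scale as a non-root sketch (demo.swap is a throwaway name; GNU stat is assumed). To undo the real swap setup, run sudo swapoff /swapfile and delete the file.

```shell
# Same dd sizing at 1 MiB scale: 1024-byte blocks x 1024 blocks.
dd if=/dev/zero of=demo.swap bs=1024 count=1024 2>/dev/null
stat -c %s demo.swap   # 1048576 bytes
rm demo.swap
```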

ruby -e "puts (1..20).map {rand(10 ** 10).to_s.rjust(10,'0')}"
2009-05-27 19:52:53
User: sil

There have been a few times I've needed to create random numbers. Although I've done so in Perl, I've found Ruby is actually faster. This one-liner generates twenty random 10-digit numbers (zero-padded) - not one single random number. Replace (1..20) with the quantity of random numbers you need generated.

find . -uid 0 -print0 | xargs -0 chown foo:foo
2009-05-27 19:52:13
User: abcde
Functions: chown find xargs

In the example, uid 0 is root. foo:foo are the user:group you want to make owner and group. '.' is the "current directory and below." -print0 and -0 indicate that filenames and directories "are terminated by a null character instead of by whitespace."
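A non-root sketch of why the null termination matters (demo.d is a made-up directory name): a filename containing a space survives the pipe intact, where a plain find | xargs would split it into two arguments.

```shell
# The null-terminated pipe hands "has space.txt" over as one argument.
mkdir -p demo.d
touch "demo.d/has space.txt"
find demo.d -type f -print0 | xargs -0 -n1 echo found:
rm -r demo.d
```

This prints a single "found:" line for the file rather than two broken fragments.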

rar a -m5 -v5M -R myarchive.rar /home/
2009-05-27 15:53:18
User: piovisqui

a - add files to an archive

-m5 - compression level, 0 = lowest compression ... 5 = maximum compression

-v5M - split the output into 5-megabyte volumes; change to 700M for a CD, or 4200M for a DVD

-R - recurse into directories; do not use it for files

It's better to have the output of a compression already split than to use the 'split' command after compression, which would consume double the amount of disk space. Found at http://www.ubuntu-unleashed.com/2008/05/howto-create-split-rar-files-in-ubuntu.html

history -c
touch -amct [[CC]YY]MMDDhhmm[.ss] FILE
2009-05-27 14:33:22
User: sharfah
Functions: touch

-a for access time, -m for modification time, -c do not create any files, -t timestamp
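For example, backdating a file to 27 May 2009 14:33 (backdated.txt is a made-up name; GNU date's -r option reads the file's modification time back):

```shell
# Set the timestamp, then read it back.
touch -t 200905271433 backdated.txt
date -r backdated.txt +%Y-%m-%d   # 2009-05-27
rm backdated.txt
```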

(($RANDOM%6)) || echo 'hello world!'
2009-05-27 08:11:08
User: luishka
Functions: echo

Randomizes the execution of echo 'hello world!' - the command runs only when $RANDOM % 6 evaluates to 0, i.e. roughly one time in six.
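The trick relies on bash's arithmetic command: (( expr )) exits with status 0 when the expression is non-zero, so the || branch fires only when the expression evaluates to 0. The two cases can be seen deterministically:

```shell
# (( 5 )) is "true" (exit status 0), so the || branch is skipped;
# (( 0 )) is "false" (exit status 1), so the || branch runs.
(( 5 )) || echo 'this line is never printed'
(( 0 )) || echo 'hello world!'
```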