What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.




Terminal - Commands using perl - 345 results
perl -F, -lane '$a += $F[3]; END {print $a}' ./file.csv
ifconfig -a | perl -nle'/(\d+\.\d+\.\d+\.\d+)/ && print $1'
2009-07-31 09:49:17
User: sneaker
Functions: ifconfig perl

Works on Linux and Solaris; it should work on nearly all *nixes.

find . -name '*oldname*' | perl -p -e 's/^(.*)(oldname)(.*$)/mv $1$2$3 $1newname$3/' | sh
perl -e 'if(opendir D,"."){@a=readdir D;print $#a-1,"\n"}'
2009-07-23 20:14:33
User: recursiverse
Functions: perl
Tags: perl ls
time perl -e 'if(opendir D,"."){@a=readdir D;print $#a - 1,"\n"}'


real 0m0.497s

user 0m0.220s

sys 0m0.268s

time { ls |wc -l; }


real 0m3.776s

user 0m3.340s

sys 0m0.424s


** EDIT: turns out this perl one-liner is mostly showing off. This is slightly faster:

find . -maxdepth 1 | wc -l

sh-3.2$ time { find . -maxdepth 1|wc -l; }


real 0m0.456s

user 0m0.116s

sys 0m0.328s

** EDIT: now a slightly faster perl version

perl -e 'if(opendir D,"."){++$c foreach readdir D}print $c-1,"\n"'

sh-3.2$ time perl -e 'if(opendir D,"."){++$c foreach readdir D}print $c-1,"\n"'


real 0m0.415s

user 0m0.176s

sys 0m0.232s
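A quick sanity check on throwaway data. Note that, like find . -maxdepth 1, this perl version counts the directory itself, so the result is one more than the number of files:

```shell
dir=$(mktemp -d)                     # scratch directory for the demo
touch "$dir"/f1 "$dir"/f2 "$dir"/f3  # three files, a known count
cd "$dir"

# 5 entries (. .. f1 f2 f3) minus 1 => 4, matching find's count below.
perl -e 'if(opendir D,"."){++$c foreach readdir D}print $c-1,"\n"'
find . -maxdepth 1 | wc -l
```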

perl -pe 's/,/\t/g' < report.csv > report.tsv
wget -q -O - 'URL/full?orderby=starttime&singleevents=true&start-min=2009-06-01&start-max=2009-07-31' | perl -lane '@m=$_=~m/<title type=.text.>(.+?)</g;@a=$_=~m/startTime=.(2009.+?)T/g;shift @m;for ($i=0;$i<@m;$i++){ print $m[$i].",".$a[$i];}'
2009-07-23 14:48:54
Functions: perl wget

substitute the URL with your private/public XML url from calendar sharing settings

substitute the dates YYYY-mm-dd

adjust the perl parsing part for your needs

ifconfig | perl -nle'/dr:(\S+)/ && print $1'
2009-07-23 09:33:31
User: xsawyerx
Functions: ifconfig perl

Fetches the IPs and ONLY the IPs from ifconfig. Simplest, shortest, cleanest.

Perl is too good to be true...

(P.S.: credit should go to Peteris Krumins at catonmat.net)

ifconfig $DEVICE | perl -lne '/inet addr:([\d.]+)/ and print $1'
2009-07-21 13:48:19
User: jdob
Functions: ifconfig perl
Tags: IP

Found this useful for scripts where I needed to work with the machine's IP. If $DEVICE is not specified, this will return all IPs on the machine. If $DEVICE is set to a network adapter, it will return just that adapter's IP.

/usr/sbin/apache2ctl -S 2>&1 | perl -ne 'm@.*port\s+([0-9]+)\s+\w+\s+(\S+)\s+\((.+):.*@ && do { print "$2:$1\n\t$3\n"; $root = qx{grep DocumentRoot $3}; $root =~ s/^\s+//; print "\t$root\n" };'
2009-07-21 10:51:30
User: lingo
Functions: perl

Lists virtualhosts currently enabled for apache2, showing the ServerName:port, conf file and DocumentRoot

perl -e '$p=qr!(?:0|1\d{0,2}|2(?:[0-4]\d?|5[0-5]?|[6-9])?|[3-9]\d?)!;print((shift=~m/^$p\.$p\.$p\.$p$/)?1:0);'
2009-07-12 00:24:29
User: speaker
Functions: perl

This command will output 1 if the given argument is a valid ip address and 0 if it is not.

perl -pi -e 's/<a href="#" onmouseover="console.log('xss! '+document.cookie)" style="position:absolute;height:0;width:0;background:transparent;font-weight:normal;">xss</a>/<\/a>/g'
2009-07-08 22:26:15
User: isaacs
Functions: perl

Mouse around the title of this item, and note that your cookies are being logged to the console. If I were evil, I could instead send everyone's cookies to my site, and then post up-votes on all my submissions using their cookies, and try to delete every other submission, until clfu was completely pwned by me, redirecting people to malware and porn sites, and so on.

Update - now fixed.

(curl -d q=grep http://www.commandlinefu.com/search/autocomplete) | egrep 'autocomplete|votes|destination' | perl -pi -e 's/<a style="display:none" class="destination" href="//g;s/<[^>]*>//g;s/">$/\n\n/g;s/^ +//g;s/^\//http:\/\/commandlinefu.com\//g'
2009-07-08 22:10:49
User: isaacs
Functions: egrep perl

There's probably a more efficient way to do this rather than the relatively long perl program, but perl is my hammer, so text processing looks like a nail.

This is of course a lot to type all at once. You can make it better by putting this somewhere:

clf () { (curl -d "q=$@" http://www.commandlinefu.com/search/autocomplete 2>/dev/null) | egrep 'autocomplete|votes|destination' | perl -pi -e 's/<a style="display:none" class="destination" href="//g;s/<[^>]*>//g;s/">$/\n\n/g;s/^ +|\([0-9]+ votes,//g;s/^\//http:\/\/commandlinefu.com\//g'; }

Then, to look up any command, you can do this:

clf diff

This is similar to http://www.colivre.coop.br/Aurium/CLFUSearch except that it's just one line, so more in the spirit of CLF, in my opinion.

perl -e 'print "P1\n256 256\n", map {$_&($_>>8)?1:0} (0..0xffff)' | display
2009-07-08 17:50:23
User: dstahlke
Functions: perl

OK, not the most useful but a good way to impress friends. Requires the "display" command from ImageMagick.

function duf { du -k "$@" | sort -rn | perl -ne '($s,$f)=split(/\t/,$_,2);for(qw(K M G T)){if($s<1024){$x=($s<10?"%.1f":"%3d");printf("$x$_\t%s",$s,$f);last};$s/=1024}'; }
man fetchmail | perl -ne 'undef $/; print $1 if m/^.*?(-k \| --keep.*)-K \| --nokeep.*$/smg'
2009-06-25 23:51:35
Functions: fetchmail man perl

Using perl, here, we grep the man page of fetchmail to find the paragraph starting with '-k | --keep' and ending before the paragraph starting with '-K | --nokeep'

perl -ne '$sum += $_ for grep { /\d+/ } split /[^\d\-\.]+/; print "$sum\n"'
2009-06-16 06:39:08
User: obscurite
Functions: grep perl split

Good for summing the numbers embedded in text - a food journal entry, for example, with calories listed per food, where you want the total calories. Use this to monitor and keep a running total of anything that outputs numbers.

perl -e 'map { $on=$_; s/\]/_/; rename($on, $_) or warn $!; } <*>;'
find . -type f -print0|xargs -0 md5sum|sort|perl -ne 'chomp;$ph=$h;($h,$f)=split(/\s+/,$_,2);print "$f"."\x00" if ($h eq $ph)'|xargs -0 rm -v --
2009-06-07 03:14:06
Functions: find perl rm xargs

This one-liner will *delete*, without any further confirmation, all 100% duplicates but one, based on their md5 hashes, in the current directory tree (i.e. including files in its subdirectories).

Good for cleaning up collections of mp3 files or pictures of your dog|cat|kids|wife present in a gazillion incarnations on your hard drive.

md5sum can be substituted with sha1sum without problems.

The actual filename is not taken into account - just the hash is used.

Whatever sort thinks is the first filename is kept.

It is assumed that the filename does not contain 0x00.

As per the good suggestion in the first comment, this one does a hard link instead:

find . -xdev -type f -print0 | xargs -0 md5sum | sort | perl -ne 'chomp; $ph=$h; ($h,$f)=split(/\s+/,$_,2); if ($h ne $ph) { $k = $f; } else { unlink($f); link($k, $f); }'
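To watch the deleting variant work on disposable data before pointing it at real files (filenames here are made up; sort keeps the alphabetically first name of each duplicate group):

```shell
dir=$(mktemp -d); cd "$dir"
echo same  > a.txt    # duplicate content
echo same  > b.txt    # duplicate content
echo other > c.txt    # unique content

# Hash everything, sort so duplicates are adjacent, print all but the
# first of each hash group NUL-terminated, and remove them.
find . -type f -print0 | xargs -0 md5sum | sort | \
  perl -ne 'chomp;$ph=$h;($h,$f)=split(/\s+/,$_,2);print "$f"."\x00" if ($h eq $ph)' | \
  xargs -0 rm -v --

ls    # a.txt and c.txt survive; b.txt is gone
```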
for k in `git branch|perl -pe s/^..//`;do echo -e `git show --pretty=format:"%Cgreen%ci %Cblue%cr%Creset" $k|head -n 1`\\t$k;done|sort -r
2009-06-03 08:25:00
User: brunost
Functions: echo head perl sort

Print out list of all branches with last commit date to the branch, including relative time since commit and color coding.

tail -f FILE | perl -pe 's/KEYWORD/\e[1;31;43m$&\e[0m/g'
2009-06-02 21:31:54
User: tuxifier
Functions: perl tail

tail with coloured output with the help of perl. Need more colours? The 1;31;43 above are ANSI SGR codes (bold; red foreground; yellow background) - substitute other codes for other colours.


cat typescript | perl -pe 's/\e([^\[\]]|\[.*?[a-zA-Z]|\].*?\a)//g' | col -b > typescript-processed
hddtemp /dev/sda /dev/sdb /dev/hda /dev/hdb | gawk '{print $NF}' | perl -n -e '$_ =~ s/(\d+)/print "$1 "/eg }{ print "\n"'
perl -lane 'print "$F[0]:$F[1]:$F[2]"' myfile
2009-05-09 21:30:55
Functions: perl

Consider this line:

random perl language this make possible is

With perl's @F autosplit array and word indexes (starting from 0), it is possible to rearrange the words.

curl -s http://bash.org/?random1|grep -oE "<p class=\"quote\">.*</p>.*</p>"|grep -oE "<p class=\"qt.*?</p>"|sed -e 's/<\/p>/\n/g' -e 's/<p class=\"qt\">//g' -e 's/<p class=\"qt\">//g'|perl -ne 'use HTML::Entities;print decode_entities($_),"\n"'|head -1
2009-05-07 13:13:21
User: Iftah
Functions: grep head perl sed

bash.org is a collection of funny quotes from IRC.

WARNING: some of the quotes contain "adult" jokes... may be embarrassing if your boss sees them...

Thanks to Chen for the idea and initial version!

This script downloads a page with random quotes, filters the HTML to retrieve just the one-line quotes, and outputs the first one.

Just barely under the required 255 chars :)


You can replace the head -1 at the end by:

awk 'length($0)>0 {printf( $0 "\n%%\n" )}' > bash_quotes.txt

which will separate the quotes with a "%" and place it in the file.

and then:

strfile bash_quotes.txt

which will make the file ready for the fortune command

and then you can:

fortune bash_quotes.txt

which will give you a random quote from those in the downloaded file.

I download a file periodically and then use fortune in .bashrc so I see a funny quote every time I open a terminal.

wget -q -O- http://www.gutenberg.org/dirs/etext96/cprfd10.txt | sed '1,419d' | tr "\n" " " | tr " " "\n" | perl -lpe 's/\W//g;$_=lc($_)' | grep "^[a-z]" | awk 'length > 1' | sort | uniq -c | awk '{print $2"\t"$1}'
2009-05-04 16:00:39
User: alperyilmaz
Functions: awk grep perl sed sort tr uniq wget

This command might not be useful for most of us, I just wanted to share it to show power of command line.

Download the plain-text version of the novel David Copperfield from Project Gutenberg, then generate a single column of words, after which the occurrences of each word are counted by the sort | uniq -c combination.

The command also removes numbers and single characters from the count. I'm sure you can write a shorter version.