What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.

Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).

News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try and put an end to the spamming, new commands now require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.

Terminal - Commands using perl - 335 results
perl -pi -e 's/<a href="#" onmouseover="console.log('xss! '+document.cookie)" style="position:absolute;height:0;width:0;background:transparent;font-weight:normal;">xss</a>/<\/a>/g'
2009-07-08 22:26:15
User: isaacs
Functions: perl
8

Mouse around the title of this item, and note that your cookies are being logged to the console. If I were evil, I could instead send everyone's cookies to my site, and then post up-votes on all my submissions using their cookies, and try to delete every other submission, until clfu was completely pwned by me, redirecting people to malware and porn sites, and so on.

Update - now fixed.

(curl -d q=grep http://www.commandlinefu.com/search/autocomplete) | egrep 'autocomplete|votes|destination' | perl -pi -e 's/a style="display:none" class="destination" href="//g;s/<[^>]*>//g;s/">$/\n\n/g;s/^ +//g;s/^\//http:\/\/commandlinefu.com\//g'
2009-07-08 22:10:49
User: isaacs
Functions: egrep perl
1

There's probably a more efficient way to do this rather than the relatively long perl program, but perl is my hammer, so text processing looks like a nail.

This is of course a lot to type all at once. You can make it better by putting this somewhere:

clf () { (curl -d "q=$@" http://www.commandlinefu.com/search/autocomplete 2>/dev/null) | egrep 'autocomplete|votes|destination' | perl -pi -e 's/<a style="display:none" class="destination" href="//g;s/<[^>]*>//g;s/">$/\n\n/g;s/^ +|\([0-9]+ votes,//g;s/^\//http:\/\/commandlinefu.com\//g'; }

Then, to look up any command, you can do this:

clf diff

This is similar to http://www.colivre.coop.br/Aurium/CLFUSearch except that it's just one line, so more in the spirit of CLF, in my opinion.

perl -e 'print "P1\n256 256\n", map {$_&($_>>8)?1:0} (0..0xffff)' | display
2009-07-08 17:50:23
User: dstahlke
Functions: perl
25

OK, not the most useful but a good way to impress friends. Requires the "display" command from ImageMagick.
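
The image is a Sierpinski-triangle pattern, built from the bitwise AND of each pixel's row and column index. If you'd rather keep the picture than pipe it straight into display, a minimal variation (the output filename is just an example) is:

perl -e 'print "P1\n256 256\n", map {$_&($_>>8)?1:0} (0..0xffff)' > sierpinski.pbm; display sierpinski.pbm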

function duf { du -k $@ | sort -rn | perl -ne '($s,$f)=split(/\t/,$_,2);for(qw(K M G T)){if($s<1024){$x=($s<10?"%.1f":"%3d");printf("$x$_\t%s",$s,$f);last};$s/=1024}' }
man fetchmail | perl -ne 'undef $/; print $1 if m/^.*?(-k \| --keep.*)-K \| --nokeep.*$/smg'
2009-06-25 23:51:35
Functions: fetchmail man perl
0

Using perl here, we grep the man page of fetchmail to find the paragraph starting with '-k | --keep' and ending just before the paragraph starting with '-K | --nokeep'.
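
The same trick slurps any section out of any man page: undef $/ reads the whole page at once, the non-greedy .*? skips everything before the start marker, and the capture stops just before the end marker. A hedged template (somecommand, START and END are placeholders, not real options):

man somecommand | perl -ne 'undef $/; print $1 if m/^.*?(START.*)END.*$/smg'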

perl -ne '$sum += $_ for grep { /\d+/ } split /[^\d\-\.]+/; print "$sum\n"'
2009-06-16 06:39:08
User: obscurite
Functions: grep perl split
3

Good for summing numbers embedded in text - a food journal entry, for example, with calories listed per food, where you want the total calories. Use this to monitor and keep a running total of anything that outputs numbers.
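
For example (the food-journal line here is made up):

echo "apple 95 cal and banana 105 cal" | perl -ne '$sum += $_ for grep { /\d+/ } split /[^\d\-\.]+/; print "$sum\n"'

which prints 200.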

perl -e 'map { $on=$_; s/\]/_/; rename($on, $_) or warn $!; } <*>;'
find . -type f -print0|xargs -0 md5sum|sort|perl -ne 'chomp;$ph=$h;($h,$f)=split(/\s+/,$_,2);print "$f"."\x00" if ($h eq $ph)'|xargs -0 rm -v --
2009-06-07 03:14:06
Functions: find perl rm xargs
19

This one-liner will *delete*, without any further confirmation, all 100% duplicates but one, based on their md5 hash, in the current directory tree (i.e. including files in its subdirectories).

Good for cleaning up collections of mp3 files or pictures of your dog|cat|kids|wife present in a gazillion incarnations on your hard disk.

md5sum can be substituted with sha1sum without problems.

The actual filename is not taken into account - just the hash is used.

Whatever sort thinks is the first filename is kept.

It is assumed that the filename does not contain 0x00.

As per the good suggestion in the first comment, this one does a hard link instead:

find . -xdev -type f -print0 | xargs -0 md5sum | sort | perl -ne 'chomp; $ph=$h; ($h,$f)=split(/\s+/,$_,2); if ($h ne $ph) { $k = $f; } else { unlink($f); link($k, $f); }'
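
To preview the duplicates before deleting anything, a cautious variant is simply to drop the final rm stage (output newline-separated purely for readability):

find . -type f -print0|xargs -0 md5sum|sort|perl -ne 'chomp;$ph=$h;($h,$f)=split(/\s+/,$_,2);print "$f\n" if ($h eq $ph)'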
for k in `git branch|perl -pe s/^..//`;do echo -e `git show --pretty=format:"%Cgreen%ci %Cblue%cr%Creset" $k|head -n 1`\\t$k;done|sort -r
2009-06-03 08:25:00
User: brunost
Functions: echo head perl sort
14

Print out a list of all branches with the date of the last commit to each branch, including the relative time since the commit, with color coding.
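
On newer versions of git the same listing can be produced without perl at all; a sketch using for-each-ref (colour codes omitted, format fields can be tweaked to taste):

git for-each-ref --sort=-committerdate --format='%(committerdate:iso8601) %(committerdate:relative)%09%(refname:short)' refs/heads/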

tail -f FILE | perl -pe 's/KEYWORD/\e[1;31;43m$&\e[0m/g'
2009-06-02 21:31:54
User: tuxifier
Functions: perl tail
17

tail with coloured output, with the help of perl. Need more colours? Here is a colour table:

http://www.tuxify.de/?p=23
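
For example, to flag errors while following a log (the path and keyword are just an illustration):

tail -f /var/log/syslog | perl -pe 's/ERROR/\e[1;31;43m$&\e[0m/g'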

cat typescript | perl -pe 's/\e([^\[\]]|\[.*?[a-zA-Z]|\].*?\a)//g' | col -b > typescript-processed
hddtemp /dev/sda /dev/sdb /dev/hda /dev/hdb | gawk '{print $NF}' | perl -n -e '$_ =~ s/(\d+)/print "$1 "/eg }{ print "\n"'
perl -lane 'print "$F[0]:$F[1]:$F[2]"' myfile
2009-05-09 21:30:55
Functions: perl
0

Consider this line:

random perl language this make possible is

It is possible to rearrange the words using perl's @F autosplit array and the word index, starting from 0.
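
A quick worked example (the input string is made up):

echo "one two three" | perl -lane 'print "$F[2]:$F[1]:$F[0]"'

which prints three:two:one.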

curl -s http://bash.org/?random1|grep -oE "<p class=\"quote\">.*</p>.*</p>"|grep -oE "<p class=\"qt.*?</p>"|sed -e 's/<\/p>/\n/g' -e 's/<p class=\"qt\">//g' -e 's/<p class=\"qt\">//g'|perl -ne 'use HTML::Entities;print decode_entities($_),"\n"'|head -1
2009-05-07 13:13:21
User: Iftah
Functions: grep head perl sed
7

bash.org is a collection of funny quotes from IRC.

WARNING: some of the quotes contain "adult" jokes... may be embarrassing if your boss sees them...

Thanks to Chen for the idea and initial version!

This script downloads a page with random quotes, filters the HTML to retrieve just the one-liner quotes, and outputs the first one.

Just barely under the required 255 chars :)

Improvement:

You can replace the head -1 at the end by:

awk 'length($0)>0 {printf( $0 "\n%%\n" )}' > bash_quotes.txt

which will separate the quotes with a "%" and place them in the file.

and then:

strfile bash_quotes.txt

which will make the file ready for the fortune command

and then you can:

fortune bash_quotes.txt

which will give you a random quote from those in the downloaded file.

I download a file periodically and then use the fortune in .bashrc so I see a funny quote every time I open a terminal.
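
A minimal sketch of the .bashrc side of that, assuming the quotes were saved to ~/bash_quotes.txt and run through strfile as above (the path is just an example):

[ -f ~/bash_quotes.txt ] && fortune ~/bash_quotes.txt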

wget -q -O- http://www.gutenberg.org/dirs/etext96/cprfd10.txt | sed '1,419d' | tr "\n" " " | tr " " "\n" | perl -lpe 's/\W//g;$_=lc($_)' | grep "^[a-z]" | awk 'length > 1' | sort | uniq -c | awk '{print $2"\t"$1}'
2009-05-04 16:00:39
User: alperyilmaz
Functions: awk grep perl sed sort tr uniq wget
-4

This command might not be useful for most of us; I just wanted to share it to show the power of the command line.

It downloads the plain-text version of the novel David Copperfield from Project Gutenberg and then generates a single column of words, after which the occurrences of each word are counted by the sort | uniq -c combination.

The command also removes numbers and single characters from the count. I'm sure you can write a shorter version.
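
If you only want the most common words, one more sort turns it into a quick top ten - just a tweak on the pipeline above, not part of the original:

wget -q -O- http://www.gutenberg.org/dirs/etext96/cprfd10.txt | sed '1,419d' | tr "\n" " " | tr " " "\n" | perl -lpe 's/\W//g;$_=lc($_)' | grep "^[a-z]" | awk 'length > 1' | sort | uniq -c | awk '{print $2"\t"$1}' | sort -k2,2 -rn | head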

for i in `screen -ls | perl -ne'if(/^\s+\d+\.([^\s]+)/){print $1, " "}'`; do gnome-terminal -e "screen -x $i"; done
2009-04-25 22:39:24
User: hank
Functions: perl
Tags: screen Linux perl
2

There was another line that was dependent on having un-named screen sessions. This just wouldn't do. This one works no matter what the name is. A possible improvement would be removing the perl dependence, but that doesn't affect me.
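
For what it's worth, a sketch of the perl-free version using awk instead (assuming the usual pid.name format in the screen -ls output):

for i in $(screen -ls | awk '$1 ~ /^[0-9]+\./ {sub(/^[0-9]+\./, "", $1); print $1}'); do gnome-terminal -e "screen -x $i"; done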

ldapsearch -v -H ldap://<server> -x -D cn=<johndoe>,cn=<users>,dc=<ourdomain>,dc=<tld> -w<secret> -b ou=<lazystaff>,dc=<ourdomain>,dc=<tld> -s sub sAMAccountName=* '*' | perl -pne 's/(\d{11})\d{7}/"DATE-AD(".scalar(localtime($1-11644473600)).")"/e'
2009-04-22 00:57:34
User: flux
Functions: perl
4

When ldapsearch queries an Active Directory server, all the dates are shown as 18-digit timestamps (100-nanosecond intervals counted from 1 January 1601). This perl regexp decodes them into a more human-friendly notation; 11644473600 is the offset in seconds between that Windows epoch and the Unix epoch (1970).
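
To decode a single value by hand, the same arithmetic applies: drop the last 7 digits (converting 100-nanosecond ticks to seconds) and subtract the offset. For the made-up timestamp 132537600000000000, the first 11 digits are 13253760000, so:

perl -le 'print scalar gmtime(13253760000 - 11644473600)'

which lands on 30 December 2020 (UTC).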

cat <filename> | perl -e '$/ = ""; $_ = <>; s/<!--.*?-->//gs; print;'
2009-04-15 20:29:11
User: unixx
Functions: cat perl
0

XML with verbose commenting can be difficult to read. This strips the comments out of an XML file.
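
A quick check on a throwaway snippet (the markup is just an example):

echo '<a><!-- verbose comment --><b/></a>' | perl -e '$/ = ""; $_ = <>; s/<!--.*?-->//gs; print;'

which prints <a><b/></a>.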

perl -pi -e 's/\r\n/\n/g' <filename>
2009-04-10 22:22:31
Functions: perl
Tags: perl dos2unix
0

Converts a Windows-style (CRLF) line-ending file to Unix (LF).

see http://en.wikipedia.org/wiki/Newline

Can be used to convert from linux2dos: just invert \r\n and \n.
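
Going the other way (unix to dos), assuming the file currently has plain \n endings, the inverted substitution looks like this:

perl -pi -e 's/\n/\r\n/g' <filename>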

u=`curl -d 'dl.start=Free' $(curl $1|perl -wpi -e 's/^.*"(http:\/\/rs.*)" method.*$/$1/'|egrep '^http'|head -n1)|grep "Level(3) \#2"|perl -wpi -e 's/^.*(http:\/\/rs[^\\\\]*).*$/$1/'`;sleep 60;wget $u
perl -ne 'while (/([0-9]+\.){3}[0-9]+/g) {print "$&\n"};' file.txt
watch -tn1 'bc<<<"`date -d'\''friday 21:00'\'' +%s`-`date +%s`"|perl -ne'\''@p=gmtime($_);printf("%dd %02d:%02d:%02d\n",@p[7,2,1,0]);'\'
2009-03-29 19:53:36
User: penpen
Functions: perl watch
Tags: Linux unix date
-2

An improved version of http://www.commandlinefu.com/commands/view/1772/simple-countdown-from-a-given-date that uses Perl to pretty-print the output. Note that the GNU-style '--no-title' option has been replaced by its one-letter counterpart '-t'.

perl -pi -e 's/foo/bar/g' $(grep -rl foo ./*)
2009-03-27 17:21:35
User: dopeman
Functions: grep perl
Tags: perl
-1

This command will replace all instances of 'foo' with 'bar' in all files in the current working directory and any sub-directories.
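
Note that $(grep -rl ...) trips over filenames containing spaces; a sketch of a more robust variant using GNU grep's -Z with xargs -0 (same substitution, just safer file handling):

grep -rlZ foo . | xargs -0 perl -pi -e 's/foo/bar/g'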

perl -pi -e 's/foo/bar/g' $(grep -l foo ./*)
2009-03-27 17:18:08
User: dopeman
Functions: grep perl
-1

This command will replace all instances of 'foo' with 'bar' in all files in the current working directory.

function crtonl { perl -i -ape 's/\r/\n/g;' $* ; }
2009-03-25 20:28:32
User: totoro
Functions: perl
Tags: files
-2

Many Mac OS X programs, especially those in Microsoft Office, create ASCII files with lines terminated by CRs (carriage returns). Most Unix programs expect lines separated by NLs (newlines). This little command makes it trivial to convert them.
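
Once the function is defined (e.g. in your ~/.bashrc), usage is simply (report.csv being a hypothetical CR-terminated export):

crtonl report.csv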