What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.

Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes respectively - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions, …).

News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try to put an end to the spamming, new commands now require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.

Terminal - Commands using head - 237 results
ls -S|head -1
finger | grep $(whoami) | head -n1 | awk '{print $2 " " $3}'
2011-01-07 16:54:11
User: mogoh
Functions: awk finger grep head
0

finger - lists the logged-in users

grep $(whoami) - keeps only lines for the current user (in case more users are logged in)

head -n1 - just one line

awk '{print $2 " " $3}' - the second and third words, separated with a space (the user's name)

OT: My first commandlinefu-command :)
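
A hedged alternative, assuming the account's full name is stored in the GECOS field of the passwd database (not guaranteed on every system), which also works when you are not logged in on a terminal:

getent passwd "$(whoami)" | cut -d: -f5 | cut -d, -f1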

seq -w 50 | sort -R | head -6 |fmt|tr " " "-"
find -type f | xargs -I{} du -s "{}" | sort -rn | head | cut -f2 | xargs -I{} du -sh "{}"
2011-01-04 11:10:56
User: glaudiston
Functions: cut du find head sort xargs
-1

Show the top files by size, in human-readable form.
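
A sketch of a shorter variant that avoids running du twice, assuming GNU tools (du -h and a sort that supports -h for human-numeric sorting):

find . -type f -exec du -h {} + | sort -rh | head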

head -100000 /dev/urandom | strings|tr '[A-Z]' '[a-z]'|sort >temp.txt && wget -q http://www.mavi1.org/web_security/wordlists/webster-dictionary.txt -O-|tr '[A-Z]' '[a-z]'|sort >temp2.txt&&comm -12 temp.txt temp2.txt
head -100000 /dev/urandom | strings > temp.txt && for w in $(cat webster-dictionary.txt); do if [ ${#w} -gt 3 ]; then grep -io $w temp.txt; fi; done
(netstat -atn | awk '{printf "%s\n%s\n", $4, $4}' | grep -oE '[0-9]*$'; seq 32768 61000) | sort -n | uniq -u | head -n 1
function where(){ COUNT=0; while [ `where_arg $1~$COUNT | wc -w` == 0 ]; do let COUNT=COUNT+1; done; echo "$1 is ahead of "; where_arg $1~$COUNT; echo "by $COUNT commits";};function where_arg(){ git log $@ --decorate -1 | head -n1 | cut -d ' ' -f3- ;}
2010-12-08 15:41:39
User: noisy
Functions: cut echo head wc
Tags: git
0

usage:

where COMMIT

for instance:

where 1178c5950d321a8c5cd8294cd67535157e296554

where HEAD~5
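
For a rough built-in alternative (not identical output, but similar information), git name-rev names a commit relative to the nearest known ref, e.g.:

git name-rev --name-only HEAD~5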

echo `cat /dev/urandom | base64 | tr -dc "[:alnum:]" | head -c64`
2010-12-06 21:04:57
User: Dereckson
Functions: echo head tr
-1

The same command as the one below, but with a base64 filter, which is more forgiving of special characters than tr alone.

echo `cat /dev/urandom |tr -dc "[:alnum:]" | head -c64`
( apache2ctl -t && service apache2 restart || (l=$(apache2ctl -t 2>&1|head -n1|sed 's/.*line\s\([0-9]*\).*/\1/'); vim +$l $(locate apache2.conf | head -n1)))
2010-11-26 18:12:08
User: cicatriz
Functions: head locate sed vim
3

Checks the Apache configuration syntax; if it is OK, restarts the service, otherwise opens the configuration file with Vim at the line where the configuration fails.
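
The same logic, written out as a readable sketch with the same assumptions as the one-liner (apache2ctl reports the failing line number in its first error line, and locate knows where apache2.conf lives):

if apache2ctl -t; then
    service apache2 restart
else
    # extract the line number from the first error message
    line=$(apache2ctl -t 2>&1 | head -n1 | sed 's/.*line\s\([0-9]*\).*/\1/')
    # open the main config file at that line
    vim +$line "$(locate apache2.conf | head -n1)"
fi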

ps ax -L -o pid,tid,psr,pcpu,args | sort -nr -k4| head -15 | cut -c 1-90
find . -type f -newer 201011151300.txt -exec head -1 {} \;
2010-11-15 22:51:13
User: abm2009
Functions: find head
0

create the "newer than" file by:

touch -t 201011151300 ./201011151300.txt

the format for the time is

[[CC]YY]MMDDhhmm[.SS]
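
With GNU find the reference file can be skipped entirely; -newermt compares against a timestamp given directly (assumes GNU findutils):

find . -type f -newermt '2010-11-15 13:00' -exec head -1 {} \;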

head -n 20 <filename> | tail
curl -s -O http://s3.amazonaws.com/alexa-static/top-1m.csv.zip ; unzip -q -o top-1m.csv.zip top-1m.csv ; head -1000 top-1m.csv | cut -d, -f2 | cut -d/ -f1 > topsites.txt
2010-11-01 01:25:53
User: chrismccoy
Functions: cut head
Tags: curl unzip cut
-4

This will dump a list of the top 1000 domains, one per line, into a text file.

atb() { l=$(tar tf $1); if [ $(echo "$l" | wc -l) -eq $(echo "$l" | grep $(echo "$l" | head -n1) | wc -l) ]; then tar xf $1; else mkdir ${1%.tar.gz} && tar xf $1 -C ${1%.tar.gz}; fi ;}
2010-10-16 05:50:32
User: elfreak
Functions: echo grep head mkdir tar wc
10

This Anti-TarBomb function makes it easy to unpack a .tar.gz without worrying about the possibility that it will "explode" in your current directory. I used to create a temporary folder in which I extracted the tarball first, but I got tired of having to reorganize the files afterwards. Just add this function to your .zshrc / .bashrc and use it like this:

atb arch1.tar.gz

and it will create a folder for the extracted files, if they aren't already in a single folder.

This only works for .tar.gz, but it's very easy to edit the function to suit your needs, if you want to extract .tgz, .tar.bz2 or just .tar.

More info about tarbombs at http://www.linfo.org/tarbomb.html

Tested in zsh and bash.

UPDATE: This function works for .tar.gz, .tar.bz2, .tgz, .tbz and .tar in zsh (not working in bash):

atb() { l=$(tar tf $1); if [ $(echo "$l" | wc -l) -eq $(echo "$l" | grep $(echo "$l" | head -n1) | wc -l) ]; then tar xf $1; else mkdir ${1%.t(ar.gz||ar.bz2||gz||bz||ar)} && tar xf $1 -C ${1%.t(ar.gz||ar.bz2||gz||bz||ar)}; fi ;}

UPDATE2: From the comments: bepaald contributed a variant that works for .tar.gz, .tar.bz2, .tgz, .tbz and .tar in bash:

atb() { shopt -s extglob; l=$(tar tf $1); if [ $(echo "$l" | wc -l) -eq $(echo "$l" | grep $(echo "$l" | head -n1) | wc -l) ]; then tar xf $1; else mkdir ${1%.t@(ar.gz|ar.bz2|gz|bz|ar)} && tar xf $1 -C ${1%.t@(ar.gz|ar.bz2|gz|bz|ar)}; fi; shopt -u extglob; }
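
For readability, the original .tar.gz-only version can also be spelled out like this (just a commented sketch of the same logic, assuming bash or zsh):

atb() {
  local l top
  l=$(tar tf "$1") || return 1
  top=$(echo "$l" | head -n1)
  # if every entry in the listing shares the first entry's top-level path,
  # the archive is safe to extract in place
  if [ $(echo "$l" | wc -l) -eq $(echo "$l" | grep -c "$top") ]; then
    tar xf "$1"
  else
    # otherwise extract into a directory named after the archive
    mkdir "${1%.tar.gz}" && tar xf "$1" -C "${1%.tar.gz}"
  fi
}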
svn log --stop-on-copy | grep r[0-9] | awk '{print $1}' | sed "s/r//" | sort -n | head -1
curl --silent http://www.dudalibre.com/gnulinuxcounter?lang=en | grep users | head -2 | tail -1 | sed 's/.*<strong>//g' | sed 's/<\/strong>.*//g'
while [ true ]; do head -n 100 /dev/urandom; sleep .1; done | hexdump -C | grep "ca fe"
tr -cd '[:alnum:]' < /dev/urandom | fold -w30 | head -n1
yes '' | head -n100
wget -q $(lynx --dump 'http://geekandpoke.typepad.com/' | grep '\/.a\/' | grep '\-pi' | head -n 1 | awk '{print $2}') -O geekandpoke.jpg
feh --bg-center `ls -U1 |sort -R |head -1`
wget http://forums.dropbox.com && wget $(cat index.html|grep "Latest Forum Build"|cut -d"\"" -f2) && wget $(cat topic.php*|grep "Linux x86:"|cut -d"\"" -f2|sort -r|head -n1) && rm -rf ~/.dropbox* && rm index.html *.php* && tar zxvf dropbox-*.tar.gz -C ~/
LIST="/some/pic/file /another/picture /one/more/pic"; PIC=$(echo $LIST | sed s/"\ "/"\n"/g | shuf | head -1 | sed s/'\/'/'\\\/'/g ); sed -i s/Mrxvt.Pixmap:.*/"Mrxvt.Pixmap:\t$PIC"/ ~/.mrxvtrc
2010-08-23 10:17:42
User: dog
Functions: echo head sed
0

A simple way of having random mrxvt backgrounds. Add this to your .bashrc and change the path names for the pictures.
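
A slightly shorter sketch of the same idea: shuf -n1 -e picks one of its arguments directly, and using | as the sed delimiter avoids having to escape the slashes in the path (assumes GNU shuf and sed, and paths without spaces):

LIST="/some/pic/file /another/picture /one/more/pic"; PIC=$(shuf -n1 -e $LIST); sed -i "s|Mrxvt.Pixmap:.*|Mrxvt.Pixmap:\t$PIC|" ~/.mrxvtrc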