
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are separate Twitter accounts for commands that get a minimum of 3 and 10 votes respectively - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions, …).



News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try to put an end to the spamming, new commands now require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.

Terminal - Commands using head - 230 results
function where(){ COUNT=0; while [ `where_arg $1~$COUNT | wc -w` == 0 ]; do let COUNT=COUNT+1; done; echo "$1 is ahead of "; where_arg $1~$COUNT; echo "by $COUNT commits";};function where_arg(){ git log $@ --decorate -1 | head -n1 | cut -d ' ' -f3- ;}
2010-12-08 15:41:39
User: noisy
Functions: cut echo head wc
Tags: git
0

usage:

where COMMIT

for instance:

where 1178c5950d321a8c5cd8294cd67535157e296554

where HEAD~5
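
For readability, here is the same pair of functions split across lines; a sketch only, the logic is unchanged from the one-liner above:

# print everything after the hash on the first log line,
# i.e. the decoration (branch/tag names) of a commit, if any
where_arg(){ git log "$@" --decorate -1 | head -n1 | cut -d ' ' -f3- ;}

# walk back from $1 until a decorated ancestor is found
where(){
  COUNT=0
  while [ "$(where_arg "$1~$COUNT" | wc -w)" -eq 0 ]; do
    let COUNT=COUNT+1
  done
  echo "$1 is ahead of "
  where_arg "$1~$COUNT"
  echo "by $COUNT commits"
}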

echo `cat /dev/urandom | base64 | tr -dc "[:alnum:]" | head -c64`
2010-12-06 21:04:57
User: Dereckson
Functions: echo head tr
-1

The same command as the one below, but with a base64 filter added first; base64 is more forgiving of the special characters in the raw stream than tr alone.

echo `cat /dev/urandom |tr -dc "[:alnum:]" | head -c64`
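
As an aside (not part of the original entry), the echo and backticks add nothing here; piping straight through gives the same result:

base64 < /dev/urandom | tr -dc "[:alnum:]" | head -c64; echo
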
( apache2ctl -t && service apache2 restart || (l=$(apache2ctl -t 2>&1|head -n1|sed 's/.*line\s\([0-9]*\).*/\1/'); vim +$l $(locate apache2.conf | head -n1)))
2010-11-26 18:12:08
User: cicatriz
Functions: head locate sed vim
3

Checks the Apache configuration syntax; if it is OK, restarts the service, otherwise opens the configuration file in Vim at the line where the configuration fails.
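
Unrolled as a plain if/else, the logic reads like this (a sketch; note that unlike the && / || chain above, this version won't open Vim if the restart itself fails):

if apache2ctl -t; then
    service apache2 restart
else
    # pull the failing line number out of the first line of the error output
    l=$(apache2ctl -t 2>&1 | head -n1 | sed 's/.*line\s\([0-9]*\).*/\1/')
    vim +$l $(locate apache2.conf | head -n1)
fi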

ps ax -L -o pid,tid,psr,pcpu,args | sort -nr -k4| head -15 | cut -c 1-90
find . -type f -newer 201011151300.txt -exec head -1 {} \;
2010-11-15 22:51:13
User: abm2009
Functions: find head
0

create the "newer than" file by:

touch -t 201011151300 ./201011151300.txt

the format for the time is

[[CC]YY]MMDDhhmm[.SS]
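
With a reasonably recent GNU find (4.3.3 or later) you can skip the reference file entirely; an aside, not part of the original entry:

find . -type f -newermt '2010-11-15 13:00' -exec head -1 {} \;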

head -n 20 <filename> | tail
curl -s -O http://s3.amazonaws.com/alexa-static/top-1m.csv.zip ; unzip -q -o top-1m.csv.zip top-1m.csv ; head -1000 top-1m.csv | cut -d, -f2 | cut -d/ -f1 > topsites.txt
2010-11-01 01:25:53
User: chrismccoy
Functions: cut head
Tags: curl unzip cut
-4

This will dump a list of the top 1,000 domains, one per line, into a text file; the second cut strips any path component from the entries.

atb() { l=$(tar tf $1); if [ $(echo "$l" | wc -l) -eq $(echo "$l" | grep $(echo "$l" | head -n1) | wc -l) ]; then tar xf $1; else mkdir ${1%.tar.gz} && tar xf $1 -C ${1%.tar.gz}; fi ;}
2010-10-16 05:50:32
User: elfreak
Functions: echo grep head mkdir tar wc
10

This Anti-TarBomb function makes it easy to unpack a .tar.gz without worrying about the possibility that it will "explode" in your current directory. I used to create a temporary folder in which I extracted the tarball first, but I got tired of having to reorganize the files afterwards. Just add this function to your .zshrc / .bashrc and use it like this:

atb arch1.tar.gz

and it will create a folder for the extracted files, if they aren't already in a single folder.

This only works for .tar.gz, but it's very easy to edit the function to suit your needs, if you want to extract .tgz, .tar.bz2 or just .tar.

More info about tarbombs at http://www.linfo.org/tarbomb.html

Tested in zsh and bash.

UPDATE: This function works for .tar.gz, .tar.bz2, .tgz, .tbz and .tar in zsh (it does not work in bash):

atb() { l=$(tar tf $1); if [ $(echo "$l" | wc -l) -eq $(echo "$l" | grep $(echo "$l" | head -n1) | wc -l) ]; then tar xf $1; else mkdir ${1%.t(ar.gz||ar.bz2||gz||bz||ar)} && tar xf $1 -C ${1%.t(ar.gz||ar.bz2||gz||bz||ar)}; fi ;}

UPDATE2: From the comments, bepaald contributed a variant that works for .tar.gz, .tar.bz2, .tgz, .tbz and .tar in bash:

atb() { shopt -s extglob; l=$(tar tf $1); if [ $(echo "$l" | wc -l) -eq $(echo "$l" | grep $(echo "$l" | head -n1) | wc -l) ]; then tar xf $1; else mkdir ${1%.t@(ar.gz|ar.bz2|gz|bz|ar)} && tar xf $1 -C ${1%.t@(ar.gz|ar.bz2|gz|bz|ar)}; fi; shopt -u extglob ;}
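
Unrolled with comments, the basic .tar.gz version of the function reads like this (a sketch only; same logic as above):

atb() {
    l=$(tar tf "$1")                  # list the archive contents
    first=$(echo "$l" | head -n1)     # first entry, usually the top-level directory
    # if every entry contains the first one (a common top-level dir), unpack in place
    if [ $(echo "$l" | wc -l) -eq $(echo "$l" | grep "$first" | wc -l) ]; then
        tar xf "$1"
    else
        # otherwise unpack into a directory named after the archive
        mkdir "${1%.tar.gz}" && tar xf "$1" -C "${1%.tar.gz}"
    fi
}
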
svn log --stop-on-copy | grep r[0-9] | awk '{print $1}' | sed "s/r//" | sort -n | head -1
curl --silent http://www.dudalibre.com/gnulinuxcounter?lang=en | grep users | head -2 | tail -1 | sed 's/.*<strong>//g' | sed 's/<\/strong>.*//g'
while [ true ]; do head -n 100 /dev/urandom; sleep .1; done | hexdump -C | grep "ca fe"
tr -cd '[:alnum:]' < /dev/urandom | fold -w30 | head -n1
yes '' | head -n100
wget -q $(lynx --dump 'http://geekandpoke.typepad.com/' | grep '\/.a\/' | grep '\-pi' | head -n 1 | awk '{print $2}') -O geekandpoke.jpg
feh --bg-center `ls -U1 |sort -R |head -1`
wget http://forums.dropbox.com && wget $(cat index.html|grep "Latest Forum Build"|cut -d"\"" -f2) && wget $(cat topic.php*|grep "Linux x86:"|cut -d"\"" -f2|sort -r|head -n1) && rm -rf ~/.dropbox* && rm index.html *.php* && tar zxvf dropbox-*.tar.gz -C ~/
LIST="/some/pic/file /another/picture /one/more/pic"; PIC=$(echo $LIST | sed s/"\ "/"\n"/g | shuf | head -1 | sed s/'\/'/'\\\/'/g ); sed -i s/Mrxvt.Pixmap:.*/"Mrxvt.Pixmap:\t$PIC"/ ~/.mrxvtrc
2010-08-23 10:17:42
User: dog
Functions: echo head sed
0

Simple way of having random mrxvt backgrounds. Add this to your bashrc and change the path names for the pictures.
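
The picture-picking pipeline can be shortened; shuf -n1 selects directly, and a different sed delimiter avoids the slash-escaping dance (a simplification, assuming paths without spaces, as the original does):

LIST="/some/pic/file /another/picture /one/more/pic"; PIC=$(printf '%s\n' $LIST | shuf -n1); sed -i "s|Mrxvt.Pixmap:.*|Mrxvt.Pixmap:\t$PIC|" ~/.mrxvtrc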

man $(ls /bin | shuf | head -1)
2010-08-20 23:12:51
Functions: head ls man
Tags: man
1

I'm not sure why you would want to do this, but this seems a lot simpler (easier to understand) than the version someone submitted using awk.

curl -L -s "$(curl -s [http://podcast.com/show.rss] | xmlstarlet sel -t -m "//enclosure[1]" -v "@url" -n | head -n 1)" | ssh -t [user]@[host] "mpg123 -"
2010-07-31 00:17:47
User: denzuko
Functions: head ssh
0

Gets the latest show from your favorite podcast. Uses curl and xmlstarlet.

Make sure you change out the items between brackets.

curl -L -s "$(curl -s http://www.2600.com/oth-broadband.xml | xmlstarlet sel -t -m "//enclosure[1]" -v "@url" -n | head -n 1)" | ssh -t [user]@[host] "mpg123 -"
0

Ever wanted to stream your favorite podcast across the network? Well, now you can.

This command parses the iTunes-enabled podcast feed and streams the latest episode across the network through an encrypted ssh connection.

printf $(( echo "obase=16;$(echo $$$(date +%s%N))"|bc; ip link show|sed -n '/eth/ {N; p}'|grep -o -E '([[:xdigit:]]{1,2}:){5}[[:xdigit:]]{1,2}'|head -c 17 )|tr -d [:space:][:punct:] |sed 's/[[:xdigit:]]\{2\}/\\x&/g')|sha1sum|head -c 32; echo
2010-07-14 14:04:53
User: camocrazed
Functions: echo grep head link printf sed tr
Tags: uuid
0

first off, if you just want a random UUID, here's the actual command to use:

uuidgen

Your chances of finding a duplicate after running this nonstop for a year are about the same as being hit by a meteorite before finishing this sentence.

The reason for the command I have is that it's more provably unique than the one that uuidgen creates. uuidgen creates a random one by default, or an unencrypted one based on time and network address if you give it the -t option.

Mine uses the MAC address of the ethernet interface, the process id of the caller, and the system time down to nanosecond resolution, which is provably unique across all computers past, present, and future, subject to collisions in the cryptographic hash used and the uniqueness of your MAC address.

Warning: feel free to experiment, but be warned that the stdin of the hash is binary data at that point, which may mess up your terminal if you don't pipe it into something. If it does mess up though, just type

reset
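
Broken into steps, the one-liner above does roughly this (a sketch; like the original it assumes bash, GNU tools and an ethX interface):

pid_time=$(echo "obase=16; $(echo $$$(date +%s%N))" | bc)   # PID + nanosecond timestamp, as hex
mac=$(ip link show | sed -n '/eth/ {N; p}' | grep -o -E '([[:xdigit:]]{1,2}:){5}[[:xdigit:]]{1,2}' | head -c 17)
hex=$(echo "$pid_time $mac" | tr -d '[:space:][:punct:]')   # strip the colons and whitespace
# turn hex pairs into \xNN escapes, emit the raw bytes, hash them
printf "$(echo "$hex" | sed 's/[[:xdigit:]]\{2\}/\\x&/g')" | sha1sum | head -c 32; echo
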
tail -n +4 | head -n 1
head -n X | tail -n 1
2010-07-08 22:06:39
User: infinull
Functions: head tail
-1

Using tail first won't do it, because tail counts from the bottom of the file. You could do it that way, but I don't suggest it.

tail -n 4 | head -n 1
2010-07-08 19:50:06
User: puddy
Functions: head tail
-7

tail -n X | head -n 1

prints a specific line, where X is the line number
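
For the record: tail -n X counts from the end of the file, so the command above prints the Xth line from the bottom, not line X - which is what the earlier comment and the downvotes are about. If you just want line X, sed is the usual tool (an alternative neither entry mentions):

sed -n '4p' filename    # prints line 4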