What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes respectively - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions, …).

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try and put an end to the spamming, new commands require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.




Terminal - All commands - 11,491 results
cat /var/log/nginx/access.log | grep -oe '^[0-9.]\+' | perl -ne 'system("geoiplookup $_")' | grep -v found | grep -oe ', [A-Za-z ]\+$' | sort | uniq -c | sort -n
2012-05-08 13:28:25
User: theist
Functions: cat grep perl sort uniq
Tags: sort uniq geoip

Per-country GET report, based on the access log. Easy to transform to count unique IPs instead.
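
The unique-IP variant mentioned above could look something like this sketch, deduplicating the client IPs before the geoiplookup stage (same log path assumed):

cat /var/log/nginx/access.log | grep -oe '^[0-9.]\+' | sort -u | perl -ne 'system("geoiplookup $_")' | grep -v found | grep -oe ', [A-Za-z ]\+$' | sort | uniq -c | sort -n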

genpassdeep() { cat /dev/urandom | tr -dc [:alnum:] | head -c64 | sha256deep; echo; }
2012-11-09 00:33:22
User: malathion
Functions: cat head tr

/dev/urandom relies on operator input to set the random seed. By itself, this may not contain enough random bits to produce high entropy output, especially if the system was recently restarted. Therefore, key stretching through a hash reduces the risk of using low-entropy output as a security key.
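
If sha256deep isn't installed, a rough equivalent using coreutils' sha256sum might look like this (a sketch, not the author's version):

genpass() { cat /dev/urandom | tr -dc '[:alnum:]' | head -c64 | sha256sum | cut -d' ' -f1; }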

kill -HUP `ps -A -ostat,ppid,pid,cmd | grep -e '^[Zz]' | awk '{print $2}'`
2009-02-06 02:42:14
User: liupeng
Functions: awk grep kill

You cannot kill zombies, as they are already dead. But if you have too many zombies, kill the parent process or restart the service.

You can kill a zombie process using the PID obtained from the above command. For example, to kill a zombie process with PID 4104:

# kill -9 4104

Please note that kill -9 is not guaranteed to remove a zombie process.
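
To first see which zombies exist and which parents own them (a STAT beginning with Z marks a zombie), the listing part of the command can be run on its own:

ps -A -ostat,ppid,pid,cmd | grep -e '^[Zz]'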

HISTTIMEFORMAT='' history | awk '{a[$2]++}END{for(i in a){print a[i] " " i}}' | sort -rn | head > /tmp/cmds ; gnuplot -persist <<<'plot "/tmp/cmds" using 1:xticlabels(2) with boxes'
2010-06-17 17:38:16
User: narcelio
Functions: awk head sort

This alternative clears the HISTTIMEFORMAT environment variable and calls gnuplot only after /tmp/cmds has been closed, to avoid some errors.
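
To just inspect the counts without plotting, the first half of the pipeline stands on its own:

HISTTIMEFORMAT='' history | awk '{a[$2]++}END{for(i in a){print a[i] " " i}}' | sort -rn | head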

xmlproc_parse.python-xml &>/dev/null <FILE> || exit 1
2009-12-11 17:30:03
User: sputnick
Functions: exit
Tags: python xml

For Debian-like systems, that's in the python-xml package.
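
If python-xml isn't available, xmllint from libxml2-utils performs a similar well-formedness check (this substitute is an assumption, not part of the original tip):

xmllint --noout FILE &>/dev/null || exit 1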

tail() { thbin="/usr/bin/tail"; if [ "${1:0:1}" != "-" ]; then fc=$(($#==0?1:$#)); lpf="$((($LINES - 3 - 2 * $fc) / $fc))"; lpf="$(($lpf<1?2:$lpf))"; [ $fc -eq 1 ] && $thbin -n $lpf "$@" | /usr/bin/fold -w $COLUMNS | $thbin -n $lpf || $thbin -n $lpf "$@"; else $thbin "$@"; fi; unset lpf fc thbin; }
2012-03-23 19:00:30
User: fpunktk
Functions: tail

This is a function that implements an improved version of tail. It tries to limit the number of lines so that the screen is filled completely. It works with pipes, single and multiple files. If you add different options to tail, they will overwrite the settings from the function.

It doesn't work very well when too many files (with wrapped lines) are specified.

It's optimised for my three-line prompt.

It also works for head: just s/tail/head/g.

Don't set 'thbin="tail"', as this might lead to a fork bomb.
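
For example, with the function defined, a call like this (the log paths are just placeholders) splits the available screen lines between the two files:

tail /var/log/syslog /var/log/dmesg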

qlmanage -p "yourfilename"
2009-02-16 07:15:03
User: vaporub

Where "yourfilename" is the document you want OS X to preview with Quick Look... file.txt, file.pdf, file.mov, etc.

cvlc <somemusic.mp3>
while :; do :; done
ifconfig | awk -F"[: ]+" '/inet addr/ {print $4}'
write user anytext
find . -exec grep foobar /dev/null {} \; | awk -F: '{print $1}' | xargs vi
mp32ogg file.mp3
2009-11-16 20:22:48
User: nickleus

Why would you want to convert MP3s to Ogg? One reason is that Ardour doesn't support MP3 files because of legal issues. This is really the only reason you would do this, unless you have really bad hearing and also want smaller file sizes, because converting from one lossy format to another isn't a good idea.

jot -b '#' -s '' $COLUMNS
2010-04-13 22:03:39
User: dennisw
Tags: tr tput printf

For BSD-based systems, including OS X, that don't have seq.

This version provides a default using tput in case $COLUMNS is not set:

jot -b '#' -s '' ${COLUMNS:-$(tput cols)}
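
The entry's tags (tr, tput, printf) suggest a GNU-friendly equivalent; one possible sketch that draws the same full-width line of '#' characters:

printf '%*s' "${COLUMNS:-$(tput cols)}" '' | tr ' ' '#'; echo
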
find / -type f -size +512000 | xargs ls -lh | awk '{ print $5 " " $6$7 ": " $9 }'
2010-05-12 17:21:12
User: johnss
Functions: awk find ls xargs

This is an updated version that someone provided me via another "find" command, to find files over a certain size. Keep in mind you may have to mess around with the print values, depending on your system, to get the output you want. This was tested on Fedora Core and CentOS based servers. (Thanks to berta for the update.)
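
A possible variant (an assumption, not the original poster's command) that skips xargs by using find's own -exec ... +, avoiding one layer of word-splitting; the threshold is still 512000 512-byte blocks (roughly 250 MB), and filenames containing spaces will still confuse the awk columns:

find / -type f -size +512000 -exec ls -lh {} + | awk '{ print $5 " " $6$7 ": " $9 }'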

net user USERNAME /domain
wget -r --wait=5 --quota=5000m --tries=3 --directory-prefix=/home/erin/Documents/erins_webpages --limit-rate=20k --level=1 -k -p -erobots=off -np -N --exclude-domains=del.icio.us,doubleclick.net -F -i ./delicious-20090629.htm
2009-07-02 01:46:21
User: bbelt16ag
Functions: wget

Just an alternative using a saved HTML file of all of my bookmarks. Works well, although it takes a while.

cat /proc/cpuinfo
ash prod<tab>
2012-05-12 19:51:02
User: c3w


. a Ruby SSH helper script

. reads a JSON config file to read host, FQDN, user, port, tunnel options

. changes OSX Terminal profiles based on host 'type'


put 'ash' ruby script in your PATH

modify and copy ashrc-dist to ~/.ashrc

configure OSX Terminal profiles, such as "webserver", "development", etc

run "ash myhostname" and away you go!

v.2 will re-attach to a 'screen' named in your ~/.ashrc

curl -k https://Username:Password@api.del.icio.us/v1/posts/all?red=api | xml2| \grep '@href' | cut -d\= -f 2- | sort | uniq | linkchecker -r0 --stdin --complete -v -t 50 -F blacklist
2013-05-04 17:43:21
User: bbelt16ag
Functions: cut sort uniq

This command queries the delicious API, runs the XML through xml2, grabs the URLs, cuts out the first two columns, passes the result through uniq to remove any duplicates, and then hands it to linkchecker, which checks the links. The links go into the blacklist in ~/.linkchecker/blacklist. Please see the manual pages for further info. It took me a few days to figure this one out; I hope you enjoy it. Also, don't hit the API more than once every few seconds or you can get banned by delicious; see their site for info. ~updated for no recursive

find . -name "*.php" -exec grep -il searchphrase {} \;
2010-01-16 05:09:30
Functions: find grep

This is very similar to the first example, except that it employs the '-exec' argument of the find command rather than piping the result to xargs. The second example is nice and tidy, but different *NIXes may not have as capable a grep command.
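
For comparison, a rough sketch of the xargs-based variant referred to above, using -print0/-0 to cope with awkward filenames ('searchphrase' is just a placeholder):

find . -name "*.php" -print0 | xargs -0 grep -il searchphrase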

du -s * | sort -nr | head
ffmpeg -r 12 -i img%03d.jpg -sameq -s hd720 -vcodec libx264 -crf 25 OUTPUT.MP4
find . -type f -exec grep -qi 'foo' {} \; -print0 | xargs -0 vim
2009-09-03 17:55:26
User: arcege
Functions: find grep xargs
Tags: vim find grep

Makes sure that find does not touch anything other than regular files, and handles non-standard characters in filenames when passing them to xargs.