What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes respectively - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions, …).

May 19, 2015 - A Look At The New Commandlinefu
I've put together a short writeup on what kind of newness you can expect from the next iteration of clfu. Check it out here.
March 2, 2015 - New Management
I'm Jon, I'll be maintaining and improving clfu. Thanks to David for building such a great resource!

Terminal - All commands - 12,273 results
xmlproc_parse.python-xml &>/dev/null <FILE> || exit 1
2009-12-11 17:30:03
User: sputnick
Functions: exit
Tags: python xml

For Debian and the like, that's in the python-xml package.

tail() { thbin="/usr/bin/tail"; if [ "${1:0:1}" != "-" ]; then fc=$(($#==0?1:$#)); lpf="$((($LINES - 3 - 2 * $fc) / $fc))"; lpf="$(($lpf<1?2:$lpf))"; [ $fc -eq 1 ] && $thbin -n $lpf "$@" | /usr/bin/fold -w $COLUMNS | $thbin -n $lpf || $thbin -n $lpf "$@"; else $thbin "$@"; fi; unset lpf fc thbin; }
2012-03-23 19:00:30
User: fpunktk
Functions: tail

This is a function that implements an improved version of tail. It tries to limit the number of lines so that the screen is filled completely. It works with pipes, single and multiple files. If you add different options to tail, they will overwrite the settings from the function.

It doesn't work very well when too many files (with wrapped lines) are specified.

It's optimised for my three-line prompt.

It also works for head: just s/tail/head/g.

Don't set 'thbin="tail"', as this might lead to a fork bomb.
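As the description suggests, applying s/tail/head/g gives a head variant; a sketch (bash assumed, and the same caveat applies: don't set 'thbin="head"'):

```shell
# head() variant of the function above, produced by s/tail/head/g:
# shows the first lines of each file, limited so everything fits on screen
head() { thbin="/usr/bin/head"; if [ "${1:0:1}" != "-" ]; then fc=$(($#==0?1:$#)); lpf="$((($LINES - 3 - 2 * $fc) / $fc))"; lpf="$(($lpf<1?2:$lpf))"; [ $fc -eq 1 ] && $thbin -n $lpf "$@" | /usr/bin/fold -w $COLUMNS | $thbin -n $lpf || $thbin -n $lpf "$@"; else $thbin "$@"; fi; unset lpf fc thbin; }
```

Note that $LINES and $COLUMNS must be set in the shell for the size calculation to work (they usually are in interactive sessions).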

qlmanage -p "yourfilename"
2009-02-16 07:15:03
User: vaporub

Where "yourfilename" is the document you want OS X to image... file.txt, file.pdf, file.mov, etc.

cvlc <somemusic.mp3>
while :; do :; done
ifconfig | awk -F"[: ]+" '/inet addr/ {print $4}'
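A quick way to see how the -F"[: ]+" separator carves up a classic ifconfig line (sample output assumed; newer systems format ifconfig/ip output differently):

```shell
# Runs of ':' and ' ' delimit the fields; the leading blanks make $1 empty,
# so the IP address lands in field 4
echo "          inet addr:192.168.1.5  Bcast:192.168.1.255  Mask:255.255.255.0" |
  awk -F"[: ]+" '/inet addr/ {print $4}'
# → 192.168.1.5
```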
translate () { lang="ru"; text=`echo $* | sed 's/ /%20/g'`; curl -s -A "Mozilla/5.0" "http://translate.google.com/translate_a/t?client=t&text=$text&sl=auto&tl=$lang" | sed 's/\[\[\[\"//' | cut -d \" -f 1; }
2014-07-10 18:26:34
User: 2b
Functions: cut sed

Change lang from ru to something else.

This is the curl version - for Mac OS etc., or any system without wget.

find . -exec grep foobar /dev/null {} \; | awk -F: '{print $1}' | xargs vi
mp32ogg file.mp3
2009-11-16 20:22:48
User: nickleus

Why would you want to convert MP3s to Ogg? One reason is that Ardour doesn't support MP3 files because of legal issues. That's really the only reason to do this: converting from one lossy format to another isn't a good idea, unless you have really bad hearing and also want smaller file sizes.

jot -b '#' -s '' $COLUMNS
2010-04-13 22:03:39
User: dennisw
Tags: tr tput printf

For BSD-based systems, including OS X, that don't have seq.

This version provides a default using tput in case $COLUMNS is not set:

jot -b '#' -s '' ${COLUMNS:-$(tput cols)}
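The tr/tput/printf tags hint at an even more portable alternative; a sketch that needs neither seq nor jot (the fallback width of 80 is an assumption for non-interactive shells):

```shell
# Pad an empty string to terminal width with spaces, then turn them into '#'
width=${COLUMNS:-$(tput cols 2>/dev/null || echo 80)}   # 80 is an assumed fallback
printf '%*s\n' "$width" '' | tr ' ' '#'
```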
sed -i 's/[ \t]\+$//g' file.txt
2011-09-07 01:47:44
User: elder
Functions: sed
Tags: sed regex

This removes trailing spaces and tabs from every line of file.txt, editing it in place - useful when cleaning up source code, for example.
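For example (GNU sed assumed, for -i and the \+ quantifier; the sample file is hypothetical):

```shell
# Create a file with trailing spaces and tabs, then strip them in place
printf 'alpha   \nbeta\t\t\ngamma\n' > file.txt
sed -i 's/[ \t]\+$//g' file.txt
cat file.txt   # alpha, beta, gamma - trailing whitespace gone
```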

net user USERNAME /domain
wget -r --wait=5 --quota=5000m --tries=3 --directory-prefix=/home/erin/Documents/erins_webpages --limit-rate=20k --level=1 -k -p -erobots=off -np -N --exclude-domains=del.icio.us,doubleclick.net -F -i ./delicious-20090629.htm
2009-07-02 01:46:21
User: bbelt16ag
Functions: wget

Just an alternative using a saved HTML file of all of my bookmarks. It works well, although it takes a while.

cat /proc/cpuinfo
ash prod<tab>
2012-05-12 19:51:02
User: c3w


. a Ruby SSH helper script
. reads a JSON config file for host, FQDN, user, port and tunnel options
. changes OS X Terminal profiles based on host 'type'

put the 'ash' Ruby script in your PATH
modify and copy ashrc-dist to ~/.ashrc
configure OS X Terminal profiles, such as "webserver", "development", etc.
run "ash myhostname" and away you go!

v.2 will re-attach to a 'screen' named in your ~/.ashrc

curl -k https://Username:Password@api.del.icio.us/v1/posts/all?red=api | xml2| \grep '@href' | cut -d\= -f 2- | sort | uniq | linkchecker -r0 --stdin --complete -v -t 50 -F blacklist
2013-05-04 17:43:21
User: bbelt16ag
Functions: cut sort uniq

This command queries the delicious API, runs the XML through xml2, grabs the URLs, cuts out the first columns, passes the list through sort | uniq to remove any duplicates, and then hands it to linkchecker, which checks each link. Broken links go into the blacklist in ~/.linkchecker/blacklist. Please see the manual pages for further info. It took me a few days to figure this one out - I hope you enjoy it. Also, don't hit the API more than once every few seconds, or you can get banned by delicious; see their site for info. ~updated for no recursion

find . -name "*.php" -exec grep -il searchphrase {} \;
2010-01-16 05:09:30
Functions: find grep

This is very similar to the first example, except that it employs the 'exec' argument of the find command rather than piping the result to xargs. The second example is nice and tidy, but different *NIXes may not have as capable a grep command.
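The xargs variant being contrasted isn't shown on this page, but it presumably looks something like the following sketch (the phpdemo files are hypothetical):

```shell
# Same search, feeding filenames to grep via xargs instead of -exec;
# -print0/-0 keep filenames with spaces intact
mkdir -p phpdemo
printf 'SearchPhrase here\n' > phpdemo/a.php
printf 'nothing to see\n'    > phpdemo/b.php
find phpdemo -name "*.php" -print0 | xargs -0 grep -il searchphrase
# → phpdemo/a.php
```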

du -s * | sort -nr | head
ffmpeg -r 12 -i img%03d.jpg -sameq -s hd720 -vcodec libx264 -crf 25 OUTPUT.MP4
find . -type f -exec grep -qi 'foo' {} \; -print0 | xargs -0 vim
2009-09-03 17:55:26
User: arcege
Functions: find grep xargs
Tags: vim find grep

Make sure that find does not touch anything other than regular files, and handles non-standard characters in filenames while passing to xargs.
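To see the quoting-safety without launching vim, one can stand printf in for vim (the demo files are hypothetical):

```shell
# A filename containing a space survives the null-delimited hand-off
mkdir -p grepdemo
printf 'Foo\n' > 'grepdemo/a b.txt'
printf 'bar\n' > grepdemo/c.txt
find grepdemo -type f -exec grep -qi 'foo' {} \; -print0 | xargs -0 printf '%s\n'
# → grepdemo/a b.txt
```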

pear config-set http_proxy http://myusername:mypassword@corporateproxy:8080
2010-05-13 14:44:03
User: KoRoVaMiLK

Useful since

"export http_proxy=blahblah:8080"

doesn't seem to work with pear

aptitude show $PROGRAM | grep Vers
2009-02-27 23:24:37
User: aabilio
Functions: grep

Output: "Version: 3.2-0" (for example, if you type # aptitude show bash | grep Vers).

This depends on the language of your distribution, because the word "Version" may be translated in other locales.

xrandr -q | grep -w Screen
file=orig.ps; for i in $(seq `grep "Pages:" $file | sed 's/%%Pages: //g'`); do psselect $i $file $i\_$file; done
2010-09-24 19:44:32
User: damncool
Functions: file sed seq

Splits a postscript file into multiple postscript files, generating one output file for each page of the input. The output files are numbered, for example 1_orig.ps, 2_orig.ps, ...

The psselect command is part of the psutils package.
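The page-count extraction that drives the loop can be checked on its own against a minimal DSC header (the sample file is hypothetical; real files with a "%%Pages: (atend)" comment would need extra handling):

```shell
# The %%Pages: DSC comment supplies the loop's page numbers
printf '%%!PS-Adobe-3.0\n%%%%Pages: 3\n' > orig.ps
grep "Pages:" orig.ps | sed 's/%%Pages: //g'            # prints the page count: 3
seq "$(grep "Pages:" orig.ps | sed 's/%%Pages: //g')"   # 1 2 3 - one psselect run per page
```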