What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions, …).

May 19, 2015 - A Look At The New Commandlinefu
I've put together a short writeup on what kind of newness you can expect from the next iteration of clfu. Check it out here.
March 2, 2015 - New Management
I'm Jon, I'll be maintaining and improving clfu. Thanks to David for building such a great resource!

Terminal - All commands - 12,342 results
for url in `cat urls `; do title=`curl $url 2>&1 | grep -i '<title>.*</title>'` && curl $url > /tmp/u && mail -s "$title" [email protected] < /tmp/u ; done
2010-10-16 19:10:19
Functions: grep mail

Note: you need to replace the email address with your private Instapaper email address.

There are a few possible improvements:

- It shouldn't write a temp file.

- It doesn't strip HTML tags (though Instapaper does, thankfully).

- It shouldn't require two curl calls per URL.
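A sketch that folds in those improvements, assuming the same `urls` file and the Instapaper address from the original command: one fetch per URL, no temp file, and the URL itself as a fallback subject when no title is found.

```shell
# One fetch per URL, no temp file; the @ address is your private Instapaper address.
while read -r url; do
  page=$(curl -s "$url")
  # Portable case-insensitive <title> extraction:
  title=$(printf '%s\n' "$page" | sed -n 's:.*<[Tt][Ii][Tt][Ll][Ee]>\(.*\)</[Tt][Ii][Tt][Ll][Ee]>.*:\1:p' | head -n 1)
  printf '%s\n' "$page" | mail -s "${title:-$url}" [email protected]
done < urls
```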

find / -type f -name IMG_????.JPG -print0 |xargs -0 exiv2 -g Exif.Canon.ModelID '{}' |grep A520 |rev |cut --complement -d " " -f1-40 |rev |xargs -I {} cp --parents {} /where
2012-03-10 03:01:01
User: fladam
Functions: cp cut find grep rev xargs

You must specify the /where destination folder and the / search folder.

If you have another camera, experiment with the Exif key (after -g), the grep pattern, and the filename mask IMG_????.JPG.

I have done this on Knoppix 6.7.0.

You must have exiv2 installed.

echo foo | ncat [ip address] [port]
2012-10-26 10:53:47
User: dragonauta
Functions: echo

You can use a pair of commands to test firewalls.

First, launch this command on the destination machine:

ncat -l [-u] [port] | cat

Then use this command on the source machine to test the remote port:

echo foo | ncat [-u] [ip address] [port]

The first command listens on the specified port, over TCP by default or over UDP with the -u option.

The second command sends "foo" through ncat to the given IP address and port.

perl -ne 'print "$. - $_"' infile.txt
2009-12-08 15:27:39
User: netp
Functions: perl

This command prints every line of a file together with its line number.
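The same numbering can be had without perl; an awk one-liner reproduces the exact "N - line" format, and POSIX nl numbers lines with its own default layout.

```shell
# Same "N - line" format using awk (NR is the current line number):
awk '{print NR " - " $0}' infile.txt

# POSIX nl numbers every line too (-ba = number all lines, blank or not):
nl -ba infile.txt
```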

curl -s http://twitter.com/users/show.xml?screen_name=username | sed -n 's/\<followers_count\>//p' | sed 's/<[^>]*>//g;/</N;//b'
2010-10-17 16:08:46
User: chrismccoy
Functions: sed
Tags: twitter

Replace username with the username you wish to check.

tar cfJ tarfile.tar.xz pathnames
2010-11-18 05:34:17
User: jasonjgw
Functions: tar

The J option is a recent addition to GNU tar. The xz compression utility is required as well.
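On older GNU tars that lack the J option, the same archive can be produced by piping through xz explicitly; the filenames here mirror the example above.

```shell
# Equivalent pipeline for tars without -J: tar to stdout, compress with xz.
tar cf - pathnames | xz > tarfile.tar.xz

# Extraction works the same way in reverse:
xz -dc tarfile.tar.xz | tar xf -
```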

url=`curl http://proxybay.info/ | awk -F'href="|" |">|</' '{for(i=2;i<=NF;i=i+4) print $i,$(i+2)}' | grep follow|sed 's/^.\{19\}//'|shuf -n 1` && firefox $url
2014-10-04 19:08:13
User: dunryc
Functions: awk grep sed

Polls the Pirate Bay mirror list, chooses a random site, and opens it for you in Firefox.

grep -n . datafile ;
wget --load-cookies <cookie-file> -c -i <list-of-urls>
alias foo="!!"
2010-08-12 23:42:15
User: smop
Functions: alias

!! expands to your previous command when the alias is defined, thus creating the alias "foo" (this does not work consistently for commands containing quotation marks).

pbpaste | coffee -bcsp | tail -n +2
2013-09-13 04:50:27
User: roryokane
Functions: tail

This particular combination of flags mimics Try CoffeeScript (on http://coffeescript.org/#try:) as closely as possible. And the `tail` call removes the comment `// Generated by CoffeeScript 1.6.3`.

See `coffee -h` for explanation of `coffee`'s flags.

httpd2 -M
tar xfzO <backup_name>.tar.gz | mysql -u root <database_name>
2011-02-10 22:18:42
User: alecnmk
Functions: tar

`tar xfzO` extracts to stdout, which is piped directly into mysql. Really helpful when your hard drive can't fit two copies of the uncompressed database :)

nmap -sP -PR -oG - `/sbin/ip -4 addr show | awk '/inet/ {print $2}' | sed 1d`
2011-07-21 11:50:26
User: l3k
Functions: awk sed

Many hosts today block traditional ICMP echo requests for "security" reasons, so nmap's fast ARP scan is more useful for seeing all live IPv4 devices around you. You must be root for ARP scanning.

ls -d1a /var/www/*/web | xargs du -hs
2010-10-18 17:16:23
User: DRoBeR
Functions: du ls xargs

Calculates the folder size for each website in an ISPConfig environment. It doesn't add the jail size, just the "public_html".

grep --color -R "text" directory/
rename *.JPG *.jpg
2014-03-05 14:54:33
User: gtoal
Functions: rename
Tags: batch rename

# Limited and very hacky wildcard rename

# works for rename *.ext *.other

# and for rename file.* other.*

# but fails for rename file*ext other*other and many more

# Might be good to merge this technique with mmv command...

mv-helper() {
    argv="`history 1 | perl -pe 's/^ *[0-9]+ +[^ ]+ //'`"
    files="`echo \"$argv\" | sed -e \"s/ .*//\"`"
    str="`history 1 | perl -pe 's/^ *[0-9]+ +[^ ]+ //' | tr -d \*`"
    set -- $str
    for file in $files
    do
        echo mv $file `echo $file | sed -e "s/$1/$2/"`
        mv $file `echo $file | sed -e "s/$1/$2/"`
    done
}

alias rename='mv-helper #'
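For the common *.EXT case, a plain loop with parameter expansion is safer than parsing history; shown here for the *.JPG example above.

```shell
# Rename *.JPG to *.jpg without touching shell history:
for f in *.JPG; do
  [ -e "$f" ] || continue        # skip if the glob matched nothing
  mv -- "$f" "${f%.JPG}.jpg"     # strip the .JPG suffix, append .jpg
done
```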

httpd2 -V
tar jcpf /home/[usuario]/etc-$(hostname)-backup-$(date +%Y%m%d-%H%M%S).tar.bz2 /etc
2011-04-29 22:53:11
User: mack
Functions: date tar

Simple Compressed Backup of the /etc

Linux compatible

ps afx | grep defunct -B 1 | grep -Eo "[0-9]{3,}" | xargs kill -9
2012-04-27 16:16:34
User: pholz
Functions: grep kill ps xargs

Defunct processes (zombies) usually have to be removed by killing their parent processes. This command retrieves such zombies and their immediate parents and kills all of the matching processes.
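A variant that selects parents of zombies from ps's own state field instead of grepping 3-digit numbers out of the tree output, which can match unrelated PIDs. A sketch, not a drop-in replacement:

```shell
# Print the parent PIDs of all zombie (state Z) processes, one per line;
# append "| xargs -r kill" once you've reviewed the list.
ps -eo ppid=,stat= | awk '$2 ~ /^Z/ {print $1}' | sort -u
```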

cat /dev/zero > /dev/sda
2013-09-13 21:47:48
User: fhh
Functions: cat

Exactly the same effect with 3 fewer characters ;-) (Removes all files/filesystems of a hard disk. It erases EVERYTHING on your hard disk, so be careful when selecting a device.)

You can press Ctrl + C after a few seconds.

(No output)
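The longer dd spelling it replaces gives you block-size and count control; demonstrated here on a throwaway file rather than a real device.

```shell
# dd equivalent, writing a scratch file (point of= at a device to wipe it):
dd if=/dev/zero of=scratch.img bs=1M count=4
# GNU dd can also report progress: add status=progress
```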

mtr www.google.com
folder=0;mkdir $folder; while find -maxdepth 1 -type f -exec mv "{}" $folder \; -quit ; do if [ $( ls $folder | wc -l ) -ge 100 ]; then folder=$(( $folder + 1 )); mkdir $folder; fi ; done
2011-02-11 21:28:01
User: Juluan
Functions: find ls mkdir mv wc

If you have a folder with thousands of files and want many folders with only 100 files per folder, run this.

It will create 0/, 1/, etc. and put 100 files inside each one.

Note that find returns true even if it doesn't find anything...
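A variant with an explicit counter avoids looping on find's exit status entirely; same layout (0/, 1/, ..., 100 files each), operating on plain files in the current directory.

```shell
# Move plain files from the current directory into 0/, 1/, ... 100 per folder.
n=0; dir=0
mkdir -p "$dir"
for f in ./*; do
  [ -f "$f" ] || continue          # skip the chunk directories themselves
  mv -- "$f" "$dir"/
  n=$((n + 1))
  if [ "$n" -ge 100 ]; then        # current chunk full: start the next one
    n=0
    dir=$((dir + 1))
    mkdir -p "$dir"
  fi
done
```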

vim -p `ls *.java *.xml *.txt *.bnd 2>/dev/null`
lsof -i
2011-10-03 02:06:30
User: shsingh

This option lists all Internet and x.25 (HP-UX) network files.