What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.

Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are separate Twitter accounts for commands that get a minimum of 3 and 10 votes respectively, so that only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):

May 19, 2015 - A Look At The New Commandlinefu
I've put together a short writeup on what kind of newness you can expect from the next iteration of clfu. Check it out here.
March 2, 2015 - New Management
I'm Jon, I'll be maintaining and improving clfu. Thanks to David for building such a great resource!


Psst. Open beta.

Wow, didn't really expect you to read this far down. The latest iteration of the site is in open beta. It's a gentle open beta, not in prime time just yet. It's being hosted over at UpGuard (link) and you are more than welcome to give it a shot. A couple of things:

  • The open beta is running a copy of the database that will not carry over to the final version. Don't post anything you don't mind losing.
  • If you wish to use your user account, you will probably need to reset your password.
Your feedback is appreciated via the form on the beta page. Thanks! -Jon & CLFU Team

Terminal - All commands - 12,364 results
curl "http://www.house.gov/house/MemberWWW.shtml" 2>/dev/null | sed -e :a -e 's/<[^>]*>//g;/</N;//ba' | perl -nle 's/^\t\t(.*$)/ $1/ and print;'
2009-09-24 23:37:36
User: drewk
Functions: perl sed
Tags: perl sed curl

Uses curl to download the membership page of the US Congress, sed to strip the HTML, and perl to print only the lines that start with two tabs (the lines containing a representative).
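The multi-line tag stripping in that sed expression can be tried on a small made-up sample; whenever an unclosed `<` is left after substitution, the loop pulls in the next line and retries:

```shell
# Strip HTML tags, including a tag broken across two lines (sample input is invented).
printf '<ul>\n<li>Jane Doe</li>\n<li>John <b\n>Roe</b></li>\n</ul>\n' |
  sed -e :a -e 's/<[^>]*>//g;/</N;//ba'
```

The `//` in `//ba` reuses the previous regex (`/</`), a GNU sed idiom, so the branch repeats only while an unclosed tag remains in the pattern space.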

add-apt-repository [REPOSITORY]
awk -F $'\t' '{printf $1 LS $2 LS $3 LS $4 LS $5; for (i = 7; i < NF; i++) printf $i "\t"; printf "\n--\n";}' LS=$'\n' 'Ad report.tsv' | column -t -s $'\t'
2011-02-28 10:52:16
User: zhangweiwu
Functions: awk column printf

In the TSV file exported by Google AdWords, the first five columns are text that really belongs in a single multi-line cell, but since the .tsv format has no reliable way to represent a line break inside a cell, Google splits it into five columns.

The problem is that with five columns of text there is hardly any room left for additional fields while keeping the output printable.

This script collapses the first five columns of each row into a single multi-line text cell, suitable for console output or for sending straight to a printer.
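The same idea can be seen on a toy TSV with just two text columns (file name and contents are invented for illustration):

```shell
# Collapse the first two (text) columns into one multi-line cell,
# then print the remaining fields tab-separated, with a "--" separator row.
printf 'Summer Sale\tShoes ad\t42\t9.99\n' > /tmp/ads_demo.tsv
awk -F '\t' '{printf "%s\n%s\n", $1, $2; for (i = 3; i <= NF; i++) printf "%s\t", $i; printf "\n--\n"}' /tmp/ads_demo.tsv
```

Each input row becomes a small block: the text columns stacked vertically, the numeric fields on one line below them.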

find ~ -name '*.sqlite' -exec sqlite3 '{}' 'VACUUM;' \;
unzip /surce/file.zip -d /dest/
wget -q -O - 'http://wap.weather.gov.hk/' | sed -r 's/<[^>]+>//g;/^UV/q' | grep -v '^$'
find garbage/ -type f -delete
2013-10-21 23:26:51
User: pdxdoughnut
Functions: find

I _think_ you were trying to delete files whether or not they had spaces. This would do that. You should probably be more specific though.

wget -q -O - 'http://wap.weather.gov.hk/' | sed -r 's/<[^>]+>//g;/^UV/q' | tail -n4
2009-09-25 02:36:46
User: dakunesu
Functions: sed tail wget

Get Hong Kong weather information from the Hong Kong Observatory's WAP site.

The other version showed a lot of blank lines for me.

find -maxdepth 1 -type f -name "*.7z" -exec 7zr e '{}' ';'
2010-01-23 19:50:10
User: minnmass
Functions: find

Use find's built-in ability to call programs.


find -maxdepth 1 -type f -name "*.7z" -print0 | xargs -0 -n 1 7zr e

would work, too.
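The two forms can be compared side by side on harmless throwaway files (paths and the `wc -c` stand-in command are invented for the demo):

```shell
# Same find pattern with wc -c in place of 7zr, on scratch files.
mkdir -p /tmp/exec_demo && cd /tmp/exec_demo
touch a.txt b.txt
find . -maxdepth 1 -type f -name "*.txt" -exec wc -c '{}' ';'
find . -maxdepth 1 -type f -name "*.txt" -print0 | xargs -0 -n 1 wc -c
```

`-exec … ';'` runs the program once per file; `-print0 | xargs -0` does the same here because of `-n 1`, and both forms are safe for filenames containing spaces.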

setserial -g /dev/ttyS[0-9]* | grep -v "unknown"
test $((RANDOM%2)) -eq 0
sudo add-apt-repository ppa:PPA_TO_ADD
2014-04-24 20:02:39
User: KlfJoat
Functions: sudo

There is no longer any need to manually add PGP keys for Ubuntu Launchpad PPAs.

The add-apt-repository command creates a new file for the PPA in /etc/apt/sources.list.d/ and then adds the PPA's key to the apt keyring automatically. No muss, no fuss.

unzip -l files.zip
2015-10-15 07:07:42
User: erez83

View files in ZIP archive

perl -pi -e "s/\r/\n/g" <file>
2010-07-29 16:07:36
User: din7
Functions: perl

Replace carriage-return characters (^M) with newlines using a perl in-place edit. Best suited to classic Mac (CR-only) line endings; on DOS (CRLF) files it leaves a blank line after every line.

eog `curl 'http://xkcd.com/' | awk -F "ng): |</h" '/embedding/{print $2}'`
curl ifconfig.me
wget --spider -o wget.log -e robots=off --wait 1 -r -p http://www.example.com
2011-04-05 13:42:14
User: lele
Functions: wget

This will recursively visit every URL linked from the specified starting URL. It won't save anything locally, and it produces a detailed log.

Useful to find broken links in your site. It ignores robots.txt, so just use it on a site you own!

find . -name '*.java' -o -name '*.xml' | grep -v '\.svn' | xargs wc -l
2011-06-30 12:45:40
User: ewilson
Functions: find grep wc xargs
Tags: find grep wc

There's nothing particularly novel about this combination of find, grep, and wc, I'm just putting it here in case I want it again.

sudo port installed | grep -v 'active\|The' | xargs sudo port uninstall
squidclient mgr:info | grep "file desc"
2010-07-29 17:35:20
User: KoRoVaMiLK
Functions: grep

Shows useful information about file descriptors in the Squid web proxy.

makeself <archive_dir> <file_name> <label>
2012-01-10 18:08:50
User: totti

Used by VirtualBox and others to create self-extracting '.run' files.

echo -e 'alias exit='\''pwd > ~/.lastdir;exit'\''\n[ -n "$(cat .lastdir 2>/dev/null)" ] && cd "$(cat .lastdir)"' >> ~/.bash_aliases
2014-01-28 18:02:04
User: ichbins
Functions: cd echo
Tags: exit pwd

This command adds the following two lines to ~/.bash_aliases:

alias exit='pwd > ~/.lastdir;exit'

[ -n "$(cat .lastdir 2>/dev/null)" ] && cd "$(cat .lastdir)"

or redirect them to ~/.bashrc if you prefer.

I find it useful. You can also define an alias for 'cd ~', e.g. alias cdh='cd ~'.

function sshdel { perl -i -n -e "print unless (\$. == $1)" ~/.ssh/known_hosts; }
2009-02-03 16:20:50
User: xsawyerx
Functions: perl

Sometimes you get SSH conflicts (a host changed its IP, or the IP now belongs to a different machine) and you need to remove the offending line from known_hosts. This function makes that much easier: pass it the line number that ssh reports.

Use history -S in your .logout file
2009-02-07 10:53:14
User: vijucat

That's the key part.

I got this from http://www.macosxhints.com/article.php?story=20070715091413640. See that article for other, more basic, tcsh-specific history-related settings.

echo "sed -e"|perl -pe 's/sed -e/perl -pe/'
2009-02-16 18:39:06
User: drossman
Functions: echo

Replace sed regular expressions with perl patterns on the command line.

The sed equivalent is: echo "sed -e"|sed -e 's/sed -e/perl -pe/'