What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes, so that only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):

May 19, 2015 - A Look At The New Commandlinefu
I've put together a short writeup on what kind of newness you can expect from the next iteration of clfu. Check it out here.
March 2, 2015 - New Management
I'm Jon, I'll be maintaining and improving clfu. Thanks to David for building such a great resource!

Terminal - All commands - 12,418 results
jot -b '#' -s '' $COLUMNS
2010-04-13 22:03:39
User: dennisw
Tags: tr tput printf

For BSD-based systems, including OS X, that don't have seq.

This version provides a default using tput in case $COLUMNS is not set:

jot -b '#' -s '' ${COLUMNS:-$(tput cols)}
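On GNU systems that lack jot, a rough equivalent is possible with seq and printf. This is a sketch under the assumption that GNU seq is available; the tput fallback is guarded so it degrades to 80 columns when no terminal is attached:

```shell
# print one '#' per terminal column, then a newline
# ${COLUMNS:-...} falls back to tput when the shell doesn't export COLUMNS
cols=${COLUMNS:-$(tput cols 2>/dev/null || echo 80)}
printf '#%.0s' $(seq 1 "$cols"); echo
```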
sed -i 's/[ \t]\+$//g' file.txt
2011-09-07 01:47:44
User: elder
Functions: sed
Tags: sed regex

Strips trailing spaces and tabs from every line of file.txt, in place. Useful for cleaning up source code before committing, for example.
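A quick way to see the effect on a throwaway file (the temp file is illustrative; note that BSD sed wants `-i ''` instead of bare `-i`):

```shell
f=$(mktemp)
printf 'foo  \t\nbar\n' > "$f"       # two lines, the first with trailing blanks
sed -i 's/[ \t]\+$//g' "$f"
cat -A "$f"                           # line ends ($) now sit right after the text
```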

net user USERNAME /domain
wget -r --wait=5 --quota=5000m --tries=3 --directory-prefix=/home/erin/Documents/erins_webpages --limit-rate=20k --level=1 -k -p -erobots=off -np -N --exclude-domains=del.icio.us,doubleclick.net -F -i ./delicious-20090629.htm
2009-07-02 01:46:21
User: bbelt16ag
Functions: wget

Just an alternative using a saved HTML file of all of my bookmarks. Works well, although it takes a while.

cat /proc/cpuinfo
ash prod<tab>
2012-05-12 19:51:02
User: c3w


. a Ruby SSH helper script

. reads a JSON config file to read host, FQDN, user, port, tunnel options

. changes OSX Terminal profiles based on host 'type'


put 'ash' ruby script in your PATH

modify and copy ashrc-dist to ~/.ashrc

configure OSX Terminal profiles, such as "webserver", "development", etc

run "ash myhostname" and away you go!

v.2 will re-attach to a 'screen' named in your ~/.ashrc

curl -k https://Username:Password@api.del.icio.us/v1/posts/all?red=api | xml2 | \grep '@href' | cut -d\= -f 2- | sort | uniq | linkchecker -r0 --stdin --complete -v -t 50 -F blacklist
2013-05-04 17:43:21
User: bbelt16ag
Functions: cut sort uniq

This command queries the delicious API, runs the XML through xml2, greps out the URL lines, cuts away the first column, passes the result through sort and uniq to remove any duplicates, and then feeds it into linkchecker, which checks the links. Broken links go to the blacklist in ~/.linkchecker/blacklist. See the manual pages for further info. It took me a few days to figure this one out; I hope you enjoy it. Also, don't hit this API more than once every few seconds or you can get banned by delicious; see their site for info. ~updated for no recursion

find . -name "*.php" -exec grep -il searchphrase {} \;
2010-01-16 05:09:30
Functions: find grep

This is very similar to the first example except that it employs the 'exec' argument of the find command rather than piping the result to xargs. The second example is nice and tidy but different *NIXs may not have as capable a grep command.
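For comparison, the xargs variant alluded to could look like this (searchphrase and the search root are placeholders):

```shell
# -print0/-0 keep filenames with spaces intact; -r skips grep entirely
# when nothing matches (GNU xargs extension)
find . -name "*.php" -print0 | xargs -0 -r grep -il searchphrase
```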

du -s * | sort -nr | head
ffmpeg -r 12 -i img%03d.jpg -sameq -s hd720 -vcodec libx264 -crf 25 OUTPUT.MP4
find . -type f -exec grep -qi 'foo' {} \; -print0 | xargs -0 vim
2009-09-03 17:55:26
User: arcege
Functions: find grep xargs
Tags: vim find grep

Make sure that find does not touch anything other than regular files, and handles non-standard characters in filenames while passing to xargs.

pear config-set http_proxy http://myusername:[email protected]:8080
2010-05-13 14:44:03
User: KoRoVaMiLK

Useful since

"export http_proxy=blahblah:8080"

doesn't seem to work with pear

aptitude show $PROGRAM | grep Vers
2009-02-27 23:24:37
User: aabilio
Functions: grep

Output: Version: 3.2-0 (for example, if you type # aptitude show bash | grep Vers).

This depends on the language of your distribution, because the word "Version" may be spelled differently in other languages.

xrandr -q | grep -w Screen
file=orig.ps; for i in $(seq `grep "Pages:" $file | sed 's/%%Pages: //g'`); do psselect $i $file $i\_$file; done
2010-09-24 19:44:32
User: damncool
Functions: file sed seq

Splits a PostScript file into multiple PostScript files, generating one output file per page of the input. The files are numbered, for example 1_orig.ps, 2_orig.ps, ...

The psselect command is part of the psutils package.

watch -n 10 free -m
2014-01-04 10:10:15
User: Darkstar
Functions: free watch

This command shows a high-level overview of system memory and usage, refreshed every 10 seconds. Change -n 10 to your desired refresh interval.

readlink -f /proc/<pid>/exe
2009-05-26 10:09:03
User: naseer
Functions: readlink

Uses the pid to get the full path of the process's binary. Useful when you do not know which command got picked up from $PATH.
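On Linux, /proc/<pid>/exe is the symlink that resolves to the running binary. For example, for the current shell (assuming a Linux /proc):

```shell
# prints the absolute path of the shell executing this line
readlink -f "/proc/$$/exe"
```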

for each in *; do file="$each."; name=${file%%.*}; suffix=${file#*.}; mv "$each" "$(echo $name | rot13)${suffix:+.}${suffix%.}"; done
2010-03-20 16:11:12
User: hfs
Functions: mv

This got a bit complicated, because I had to introduce an additional dot at the end that has to be removed again later.
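Note that rot13 is not a standard utility, so the loop assumes you have defined one. A common definition, plus the trailing-dot name/suffix trick applied to a single (hypothetical) filename:

```shell
# rot13 is assumed to be something like this tr alias
rot13() { tr 'A-Za-z' 'N-ZA-Mn-za-m'; }

# the extra trailing dot makes ${file#*.} safe for names without a suffix
each="hello.txt"
file="$each."; name=${file%%.*}; suffix=${file#*.}
echo "$(echo "$name" | rot13)${suffix:+.}${suffix%.}"   # -> uryyb.txt
```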

ls --color=never -1| grep -E "[0-9]{4}"|sed -re "s/^(.*)([0-9]{4})(.*)$/\2 \1\2\3/" | sort -r
netstat -4tnape
wtzc () { wget "$@"; foo=`echo "$@" | sed 's:.*/::'`; tar xzvf $foo; blah=`echo $foo | sed 's:,*/::'`; bar=`echo $blah | sed -e 's/\(.*\)\..*/\1/' -e 's/\(.*\)\..*/\1/'`; cd $bar; ls; }
2010-01-17 11:25:47
User: oshazard
Functions: cd sed tar wget

Combines a few repetitive tasks when compiling source code. Especially useful when a hyphen in a filename breaks tab completion.

1.) wget source.tar.gz

2.) tar xzvf source.tar.gz

3.) cd source

4.) ls

From there you can run ./configure, make and etc.

split -b4m file.tgz file.tgz. ; for i in file.tgz.*; do SUBJ="Backup Archive"; MSG="Archive File Attached"; echo $MSG | mutt -a $i -s $SUBJ YourEmail@(E)mail.com; done
2010-03-20 16:49:19
User: tboulay
Functions: echo split

This is just a little snippet to split a large file into smaller chunks (4 MB in this example) and then send the chunks off to (e)mail for archival using mutt.

I usually encrypt the file before splitting it using openssl:

openssl des3 -salt -k <password> -in file.tgz -out file.tgz.des3

To restore, simply save attachments and rejoin them using:

cat file.tgz.* > output_name.tgz

and if encrypted, decrypt using:

openssl des3 -d -salt -k <password> -in file.tgz.des3 -out file.tgz

edit: (changed "g" to "e" for political correctness)
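The split/rejoin round trip above can be sanity-checked with tiny sizes (real use would be -b4m; the temp directory is illustrative):

```shell
d=$(mktemp -d)
printf 'abcdefghij' > "$d/archive"
split -b 4 "$d/archive" "$d/archive."   # -> archive.aa archive.ab archive.ac
cat "$d"/archive.* > "$d/rejoined"
cmp "$d/archive" "$d/rejoined" && echo "identical"
```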

if [ -f /etc/*-release ]; then cat /etc/*-release ; else cat /etc/*-version ; fi
rsync -P -e 'ssh -p PORT' SRC DEST
2011-10-13 08:59:07
User: vickio
Functions: rsync
Tags: ssh rsync

Transfer files with rsync over ssh on a non-standard port, showing a progress bar and resuming partial transfers.