What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.

Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):

May 19, 2015 - A Look At The New Commandlinefu
I've put together a short writeup on what kind of newness you can expect from the next iteration of clfu. Check it out here.
March 2, 2015 - New Management
I'm Jon, I'll be maintaining and improving clfu. Thanks to David for building such a great resource!

Psst. Open beta.

Wow, didn't really expect you to read this far down. The latest iteration of the site is in open beta. It's a gentle open beta, not in prime time just yet. It's being hosted over at UpGuard and you are more than welcome to give it a shot. A couple of things:

  • The open beta is running a copy of the database that will not carry over to the final version. Don't post anything you don't mind losing.
  • If you wish to use your user account, you will probably need to reset your password.
Your feedback is appreciated via the form on the beta page. Thanks! -Jon & CLFU Team

Terminal - All commands - 12,364 results
ps auxw | grep -E 'sbin/(apache|httpd)' | awk '{print"-p " $2}' | xargs strace -F
2016-08-04 10:59:58
User: gormux
Functions: awk grep ps strace xargs
Tags: awk grep ps strace

Will open strace on all Apache processes, on systems using sbin/apache (Debian) or sbin/httpd (Red Hat), and will follow newly created threads.
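
A sketch of how the pipeline builds strace's argument list; the ps lines below are fabricated for illustration. (Note: in current strace, -f follows forked children and -F is a deprecated alias for it.)

```shell
# Simulate two matching ps lines (fields: user, pid, ...); the awk stage
# turns each PID into a "-p PID" pair for strace to attach to.
printf 'root 101 0.0 /usr/sbin/apache2\nroot 102 0.0 /usr/sbin/apache2\n' \
  | awk '{print "-p " $2}'
# Output:
# -p 101
# -p 102
# On a real system, pgrep builds the same list more robustly:
#   strace -f $(pgrep -f 'sbin/(apache|httpd)' | sed 's/^/-p /')
```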

gem install `ruby ./isuckat_ruby.rb 2>&1 | sed -e 's/.*find gem .//g' -e 's/ .*//g' | head -n 1`
2016-08-03 19:41:27
User: operat0r
Functions: head install sed

When bundle install sucks... This runs isuckat_ruby.rb and, when stderr matches "find gem '", it will gem install whatever is missing.

sed 'X{N;s/\n//;}' file.txt (where X is the current line)
2016-07-30 14:27:20
User: pibarnas
Functions: sed

N: appends the next input line to the pattern space, separated by a \n (newline); then

s/\n//: removes that newline, joining the current line's end with the start of the next line.

Useful in scripts.
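
For example, joining line 2 with the line after it (same N + s/\n// idiom, on throwaway input):

```shell
# Join line 2 with the following line; other lines pass through untouched.
printf 'one\ntwo\nthree\nfour\n' | sed '2{N;s/\n//;}'
# Output:
# one
# twothree
# four
```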

links `lynx -dump -listonly "http://news.google.com" | grep -Eo "(http|https)://[a-zA-Z0-9./?=_-]*" | grep -v "google.com" | sort -R | uniq | head -n1`
2016-07-26 12:54:53
User: mogoh
Functions: grep head sort uniq

sort -R randomizes the list.

head -n1 takes the first.
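
The same pick-one-at-random idiom on a toy list (each run prints one of the three lines):

```shell
# GNU sort -R shuffles by hashing each line, so identical lines end up
# adjacent (which is why a later uniq still works); head -n1 keeps one.
printf 'alpha\nbeta\ngamma\n' | sort -R | head -n1
```

On GNU systems, `shuf -n1` is a more direct way to draw one random line.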

links $( a=( $( lynx -dump -listonly "http://news.google.com" | grep -Eo "(http|https)://[a-zA-Z0-9./?=_-]*" | grep -v "google.com" | sort | uniq ) ) ; amax=${#a[@]} ; n=$(( `date '+%s'` % $amax )) ; echo ${a[n]} )
2016-07-26 11:52:12
User: pascalv
Functions: echo grep sort uniq

Access a random news web page on the internet.

The Links browser can of course be replaced by Firefox or any modern graphical web browser.
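
The randomness here comes from taking the epoch time modulo the array size; a minimal sketch of that trick on a toy array (the element names are made up):

```shell
# Pick a pseudo-random array element: epoch seconds mod array length.
a=(red green blue)
amax=${#a[@]}
n=$(( $(date '+%s') % amax ))
echo "${a[n]}"
```

This "seed" only changes once per second; `shuf -n1` gives a better shuffle where available.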

!!:s/foo/bar/
2016-07-25 18:52:42
User: malathion

Replaces the first instance of 'foo' with 'bar'. To replace all instances of 'foo' with 'bar': !!:gs/foo/bar/
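
History expansion only works interactively, but bash's parameter expansion makes the same first-vs-all distinction scriptable:

```shell
s='foo baz foo'
echo "${s/foo/bar}"     # first match only: bar baz foo
echo "${s//foo/bar}"    # all matches:      bar baz bar
```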

f=correlation.png; s=400; t=50; a=1.2; convert $f -resize ${s}% -threshold ${t}% bmp:- | potrace -o ${f%.*}.svg -b svg -z black --fillcolor "#FFFFFF" --alphamax ${a}
2016-07-25 10:45:28
User: T4b

Uses ImageMagick and potrace to vectorize the input image, with parameters optimized for xkcd-like pictures.

while true ; do wget --quiet --no-check-certificate --post-data 'login=my_account_number&password=my_password&submit=Valider' 'https://wifi.free.fr/Auth' -O '/dev/null' ; sleep 600; done
2016-07-23 16:34:42
User: pascalv
Functions: sleep true wget
Tags: wifi France

Connects to FreeWifi hotspots and keeps the connection active.

hpacucli controller all show config detail | grep -A 7 Fail | egrep '(Failed|Last|Serial Number|physicaldrive)'
2016-07-20 17:42:40
User: operat0r
Functions: egrep grep

This dumps the serial numbers of all the drives, but the HP warranty check does not recognize them as valid ...

tree -i -L 1
curl -s http://whatismyip.org/ | grep -oP '(\d{1,3}\.){3}\d+'
lsa() { ls -lart; history -s "joe \"$(\ls -apt|grep -v /|head -1)\"" ; }
2016-07-07 21:27:55
User: knoppix5
Functions: ls

Display a recursive file list (newest file displayed at the end) and be free to access the last file in the list simply by pressing the up-arrow key, i.e. open it with the joe editor.

BTW, IMHO a list of files with the newest at the end is often more informative.

Put this 'lsa' function somewhere in your .bashrc and issue

. ~/.bashrc

or

source ~/.bashrc

to have access to the 'lsa' command immediately.

(The function appends the command "joe last_file_in_the_list" to the end of the command history.)

while true; do A=$(stat -c%s FILE); sleep 1; B=$(stat -c%s FILE); echo -en "\r"$(($B-$A))" Bps"; done
2016-06-27 20:39:06
User: Zort
Functions: echo sleep stat

Shows the growth of a file per second.

Replace the text "FILE" with the name of the file to monitor.

STAT command
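
The mechanism boils down to two stat -c%s size samples, subtracted; a self-contained demonstration on a throwaway file:

```shell
# Measure the growth of a file between two size samples.
f=$(mktemp)
A=$(stat -c%s "$f")        # size before: 0 bytes
printf '12345' >> "$f"     # simulate 5 bytes of growth
B=$(stat -c%s "$f")
echo "$((B - A)) bytes grown"
rm -f "$f"
# Output: 5 bytes grown
```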

while true; do A=$(ls -l FILE | awk '{print $5}'); sleep 1; B=$(ls -l FILE | awk '{print $5}'); echo -en "\r"$(($B-$A))" Bps"; done
2016-06-27 20:33:02
User: Zort
Functions: awk echo ls sleep

Shows the growth of a file per second.

Replace the text "FILE" with the name of the file to monitor.

LS + AWK command

netstat -n | grep ESTAB |grep :80 | tee /dev/stderr | wc -l
2016-06-26 11:37:19
User: rubenmoran
Functions: grep netstat tee wc

Summarize established connections after netstat output.

Using tee and /dev/stderr you can send the command's output to the terminal before executing wc, so the summary appears at the bottom of the output.
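
A minimal reproduction of the tee /dev/stderr trick: the lines reach the terminal via stderr while wc counts them on stdout.

```shell
# Prints the two lines (via stderr) and then their count (via stdout).
printf 'conn1\nconn2\n' | tee /dev/stderr | wc -l
# stdout: 2
```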

sed '/INSERT INTO `unwanted_table`/d' mydb.sql > reduced.sql
2016-06-24 20:13:47
User: sudopeople
Functions: sed

Starting with a large MySQL dump file (*.sql), remove any lines that have inserts for the specified table. Sometimes one or two tables are very large and unneeded, e.g. log tables. To exclude multiple tables you can get fancy with sed, or just run the command again on subsequently generated files.
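
"Getting fancy with sed" for several tables can look like this; the table names logs and sessions are made up for the example, and the dump is simulated with printf:

```shell
# Drop INSERT lines for more than one table in a single pass (sed -E
# enables the (a|b) alternation syntax).
printf 'INSERT INTO `logs` VALUES (1);\nINSERT INTO `users` VALUES (2);\nINSERT INTO `sessions` VALUES (3);\n' \
  | sed -E '/INSERT INTO `(logs|sessions)`/d'
# Output: only the `users` line survives.
```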

less -X /var/log/insecure
2016-06-24 13:53:49
User: sonic
Functions: less

From the manpage:

man less

-X or --no-init

Disables sending the termcap initialization and deinitialization strings to the terminal. This is sometimes desirable if the deinitialization string does something unnecessary, like clearing the screen.


If you want to clear the screen after viewing a file this way that had sensitive information, hit Ctrl+L or just type clear. Since Ctrl+L is readily available, I don't know why less bothers to clear automatically. If you're viewing the file at all, chances are you want to see its output after you quit.

echo '"\e\C-i": "\C-awhile true; do ( \C-e ); inotifywait -q -e modify -e close_write *; done\e51\C-b"' >>~/.inputrc
2016-06-10 08:06:50
User: unhammer
Functions: echo

Assuming you've written all of

make -j hfst-tokenize && echo doavtter gr?dakursa|./hfst-tokenize --gtd tokeniser-gramcheck-gt-desc.pmhfst

and want that to execute every time you :w in vim (or C-x C-s in Emacs), just hit Esc TAB (the sequence bound to \e\C-i above) and it'll turn into

while true; do ( make -j hfst-tokenize && echo doavtter gr?dakursa|./hfst-tokenize --gtd tokeniser-gramcheck-gt-desc.pmhfst ); inotifywait -q -e modify -e close_write *; done

with the cursor right before the ')'. Hit enter, and it'll run on each save.

Requires the inotify-tools package to be installed.

echo "https://www.google.com/maps/place/$(exiftool -ee -p '$gpslatitude, $gpslongitude' -c "%d?%d'%.2f"\" image.jpg 2> /dev/null | sed -e "s/ //g")"
rsync -a --delete empty-dir/ target-dir/
2016-06-07 16:56:55
User: malathion
Functions: rsync
Tags: delete rsync

This command works by rsyncing the target directory (containing the files you want to delete) with an empty directory. The '--delete' switch instructs rsync to remove files that are not present in the source directory. Since there are no files there, all the files will be deleted.

I'm not clear on why it's faster than 'find -delete', but it is.

Benchmarks here: https://web.archive.org/web/20130929001850/http://linuxnote.net/jianingy/en/linux/a-fast-way-to-remove-huge-number-of-files.html
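
A self-contained sketch of the technique using temporary directories (so nothing real gets deleted):

```shell
# Create a target with some files, then empty it by rsyncing from an
# empty source with --delete.
src=$(mktemp -d)            # stays empty
tgt=$(mktemp -d)
touch "$tgt"/file1 "$tgt"/file2 "$tgt"/file3
rsync -a --delete "$src"/ "$tgt"/
ls -A "$tgt"                # prints nothing: target is now empty
rmdir "$src" "$tgt"
```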

sudo mount -o remount,rw / && sudo cp /etc/hosts /etc/hosts.old && wget http://winhelp2002.mvps.org/hosts.txt && cp /etc/hosts ~/ && cat hosts.txt >> hosts && sudo cp hosts /etc/hosts
2016-06-06 15:01:19
User: bugmenot
Functions: cat cp mount sudo wget

Will append lines to the hosts file to do some basic ad blocking.

curl --silent --head "${url}" | grep 'Last-Modified:' | cut -c 16- | date -f - +'%s'
2016-06-02 22:20:55
User: odoepner
Functions: cut date grep

This command line assumes that "${url}" is the URL of the web resource.

It can be useful to check the "freshness" of a download URL before a GET request.
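
The date -f - step reads a date from stdin and reformats it; with GNU date, an RFC 1123 HTTP date like the one in a Last-Modified header converts to epoch seconds like so:

```shell
# Parse an HTTP date into epoch seconds (GNU date; -u keeps it in UTC).
echo 'Tue, 15 Nov 1994 08:12:31 GMT' | date -u -f - +'%s'
# → 784887151
```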

curl http://url/rss | grep -o '<enclosure url="[^"]*' | grep -o '[^"]*$' | xargs wget -c
tree -isafF /var|grep -v "/$"|tr '[]' ' '|sort -k1nr|head
du -a /var | sort -n -r | head -n 10