What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


May 19, 2015 - A Look At The New Commandlinefu
I've put together a short writeup on what kind of newness you can expect from the next iteration of clfu. Check it out here.
March 2, 2015 - New Management
I'm Jon, I'll be maintaining and improving clfu. Thanks to David for building such a great resource!


Terminal - All commands - 12,379 results
echo "nohup command rm -rf /phpsessions 1>&2 &>/dev/null 1>&2 &>/dev/null&" | at now + 3 hours 1>&2 &>/dev/null
2009-08-18 07:31:17
User: AskApache
Functions: at echo

This is helpful in shell scripts. I use it in my custom PHP install script to schedule deletion of the build files three hours later, since the install script is completely automated and made to run slowly.

It does require at, which some environments without crontab still have.

You can add as many commands to the at job as you want. Here's how I delete the queued jobs in case the script gets killed (trapped):

atq |awk '{print $1}'|xargs -iJ atrm J &>/dev/null
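
The queue-clearing pattern can be tried safely without a real at queue: the same xargs substitution works with echo standing in for atrm (modern xargs spells the flag -I rather than the deprecated -i used above). A sketch with made-up job IDs:

```shell
# Dry run of the atq|awk|xargs|atrm pattern: echo stands in for atrm so
# no at daemon is needed; -I J substitutes each job ID for J in turn.
printf '12\n13\n14\n' | awk '{print $1}' | xargs -I J echo atrm J
```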

nohup /bin/sh myscript.sh 1>&2 &>/dev/null 1>&2 &>/dev/null&
2009-08-18 07:24:52
User: AskApache
Functions: nohup

This command runs your shell script in the background with no output of any kind, and it will remain running even after you logout.

eval $(sed -n "s/^d[^D]*DB_\([NUPH]\)[ASO].*',[^']*'\([^']*\)'.*/_\1='\2'/p" wp-config.php) && mysqldump --opt --add-drop-table -u$_U -p$_P -h$_H $_N | gpg -er AskApache >`date +%m%d%y-%H%M.$_N.sqls`
2009-08-18 07:03:08
User: AskApache
Functions: eval gpg sed

The coolest way I've found to back up a WordPress MySQL database with encryption: local variables are created directly from the wp-config.php file so that you never type the credentials, which would otherwise let someone sniffing your terminal or reading your shell history see them.

I use a variation of this for my servers that host hundreds of WordPress installs and databases, by running a find for each wp-config.php file and passing the results through xargs to my function.
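
The credential-extracting sed can be seen in isolation on a throwaway wp-config.php-style file (sample values, not a real install): it turns each define('DB_*', ...) line into a shell assignment for _N, _U, _P, or _H.

```shell
# Build a fake wp-config.php with sample credentials.
cat > /tmp/wp-config.php <<'EOF'
define('DB_NAME', 'exampledb');
define('DB_USER', 'exampleuser');
define('DB_PASSWORD', 'examplepass');
define('DB_HOST', 'localhost');
EOF
# The sed prints _N='exampledb' etc.; eval makes them shell variables.
eval $(sed -n "s/^d[^D]*DB_\([NUPH]\)[ASO].*',[^']*'\([^']*\)'.*/_\1='\2'/p" /tmp/wp-config.php)
echo "$_N $_U $_H"
```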

find / -name "*.pdf" -exec cp -t ~/Documents/PDF {} +
2009-08-18 06:11:35
Functions: cp find
Tags: find cp for

I used this to copy all PDFs recursively into a single destination directory.
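
A self-contained sketch of the same command on a throwaway tree (the /tmp paths are illustrative): cp -t (GNU coreutils) takes the target directory first, which lets find append many matches per cp invocation via '+'.

```shell
# Flatten every .pdf under a source tree into one destination directory.
mkdir -p /tmp/pdfdemo/src/a/b /tmp/pdfdemo/dest
touch /tmp/pdfdemo/src/a/one.pdf /tmp/pdfdemo/src/a/b/two.pdf
find /tmp/pdfdemo/src -name '*.pdf' -exec cp -t /tmp/pdfdemo/dest {} +
ls /tmp/pdfdemo/dest
```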

sudo du -ks $(ls -d */) | sort -nr | cut -f2 | xargs -d '\n' du -sh 2> /dev/null
2009-08-17 22:21:09
User: Code_Bleu
Functions: cut du ls sort sudo xargs
Tags: disk usage

This allows the output to be sorted from largest to smallest in human readable format.
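
On systems with GNU sort, the -h flag compares human-readable sizes directly, so a single du pass gives the same largest-first listing. A sketch on throwaway directories (no sudo needed since the demo paths are world-readable):

```shell
# One-pass variant: du -sh prints human-readable sizes, sort -rh orders
# them largest first (-h is a GNU sort extension, not POSIX).
mkdir -p /tmp/dudemo/big /tmp/dudemo/small
head -c 200000 /dev/zero > /tmp/dudemo/big/file
head -c 1000   /dev/zero > /tmp/dudemo/small/file
du -sh /tmp/dudemo/*/ | sort -rh
```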

convmv -r -f ISO-8859-1 -t UTF-8 --notest *
2009-08-17 18:23:19
User: jsiei97

Nothing advanced, it just finds filenames that are stored with ISO-8859-1 characters and converts those into UTF-8. Recommended to run without the --notest flag first so you can see what will be changed.

urls=('www.ubuntu.com' 'google.com'); for i in ${urls[@]}; do http_code=$(curl -I -s $i -w %{http_code}); echo $i status: ${http_code:9:3}; done
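
The loop above relies on bash's ${var:offset:length} substring expansion to cut the status code out of curl's combined header/-w output; the extraction can be seen offline on a canned header line:

```shell
# "HTTP/1.1 200 OK": offset 9, length 3 picks out the status code.
status_line='HTTP/1.1 200 OK'
echo "${status_line:9:3}"
```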
wget -q -O- PAGE_URL | grep -o 'WORD_OR_STRING' | wc -w
mysql -u<user> -p<password> -s -e 'DESCRIBE <table>' <database> | tail -n +1 | awk '{ printf($1",")}' | head -c -1
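
The tail of that pipeline, joining column names with commas and trimming the trailing one, can be exercised without a MySQL server by feeding awk some canned names (head -c -1 is a GNU extension):

```shell
# awk appends a comma after each input line's first field, then
# head -c -1 (GNU) chops the final trailing comma.
printf 'id\nname\nemail\n' | awk '{ printf($1",") }' | head -c -1
```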
(cd SRC; find . -type d -exec mkdir TARGET/{} ";"; find . -type f -exec mv {} TARGET/{} ";")
2009-08-17 12:35:48
User: karel1980
Functions: cd find mkdir mv

Using a GUI file manager you can merge directories (cut and paste). This command does roughly the same: it doesn't ask for confirmation (no problem for me), and it doesn't clean up the now-empty SRC directories (also no problem, that part is trivial).

This probably does the same:

cp -rl SRC TARGET && rm -rf SRC
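
The merge can be rehearsed on throwaway directories; mkdir -p is used in this sketch so the pre-existing '.' directory doesn't trigger an error:

```shell
# Recreate SRC's directory tree under TARGET, then move the files over.
mkdir -p /tmp/mergedemo/SRC/sub /tmp/mergedemo/TARGET
echo hi > /tmp/mergedemo/SRC/sub/f.txt
(cd /tmp/mergedemo/SRC; \
 find . -type d -exec mkdir -p /tmp/mergedemo/TARGET/{} \; ; \
 find . -type f -exec mv {} /tmp/mergedemo/TARGET/{} \;)
ls /tmp/mergedemo/TARGET/sub
```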
ps aux | grep [c]ommandname
SELECT relname, reltuples, pg_relation_size(relname) FROM pg_class r JOIN pg_namespace n ON (relnamespace = n.oid) WHERE relkind = 'r' AND n.nspname = 'public' ORDER BY relname;
2009-08-17 11:46:34
User: alvinx

PostgreSQL-specific SQL

- shows the row count (reltuples) of ALL public tables along with each relation's size (pg_relation_size = space used on the filesystem)

- may need a VACUUM ANALYZE first for the counts to be accurate!

$ vim ... :help 42
2009-08-17 11:37:02
User: alvinx
Functions: vim

inside vim try:

:help 42

to get the meaning of life, the universe and everything !

awk < file.name '{ system("resolveip -s " $1) }'
2009-08-17 08:09:39
Functions: awk

Given a file of FQDNs, this simple command resolves the IP addresses of those names. Useful for log files or anything else that outputs domain names.

alien -r -c file.deb
2009-08-17 05:53:15

Converts between Red Hat rpm, Debian deb, Stampede slp, Slackware tgz, and Solaris pkg file formats. It also supports LSB packages.

rpm -ivh 'http://www.website.com/path/to/desired_software_package.rpm'
2009-08-17 03:56:05
User: matthewbauer
Tags: wildcard

This is exactly the same as a wildcard - good for times when wildcards are disabled, or when you want a wildcard over a directory that is not your current one ({`ls /path/to/dir`}). Does not work on older versions of Bash, though.
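
The braces referred to are bash brace expansion, which the shell expands before the command runs and independently of globbing; a harmless demonstration:

```shell
# Brace expansion spells out each alternative; unlike a glob it does not
# consult the filesystem at all.
echo pkg-{one,two,three}.rpm
```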

for f in *;do case "$(echo $f|sed "s/.*\.\([a-z\.]*\)/\1/g")" in zip)unzip -qqo $f&&rm $f;;tar.gz|tar.bz2)tar xf $f&&rm $f;;rar)unrar e -o+ -r -y $f&&rm $f;;7z)7z e -qqo $f;;esac;done
2009-08-17 03:50:50
User: matthewbauer

This will unarchive every archive in the working directory. Good for torrents (I don't know why they put each file into a separate archive).

diff <(cd /path-1; find . -type f -print | egrep -i '\.m4a$|\.mp3$') <(cd /path-2; find . -type f -print | egrep -i '\.m4a$|\.mp3$')
2009-08-17 00:49:31
User: drewk
Functions: cd diff egrep find

diff is designed to compare two files. You can also compare directories. In this form, bash uses 'process substitution' in place of a file as an input to diff. Each input to diff can be filtered as you choose. I use find and egrep to select the files to compare.
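
A process-substitution demo on throwaway directories: diff sees each find's output as if it were a file (bash-specific syntax), and exits nonzero when the lists differ, hence the || true.

```shell
# One file exists only on side a, so diff reports it.
mkdir -p /tmp/psdemo/a /tmp/psdemo/b
touch /tmp/psdemo/a/song.mp3 /tmp/psdemo/a/only-here.m4a /tmp/psdemo/b/song.mp3
diff <(cd /tmp/psdemo/a; find . -type f | sort) \
     <(cd /tmp/psdemo/b; find . -type f | sort) || true
```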

file -i * | grep -c 'text/plain'
file -i * | grep 'text/plain' | wc -l
2009-08-16 21:22:46
User: voyeg3r
Functions: file grep wc

Counts files whose content is detected as plain text; files without extensions, and both ASCII and UTF-8 text, show up as "text/plain".
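
Both variants count matching lines: grep -c folds the count into grep itself, while the wc -l form counts grep's output lines. The counting can be seen on canned file -i-style output, so this sketch doesn't depend on file(1) being installed:

```shell
# Two of the three lines carry the text/plain MIME type, so the count is 2.
printf 'a.txt: text/plain\nb.png: image/png\nc.md: text/plain\n' | grep -c 'text/plain'
```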

date -j -v +1000000000S -f %m%d%Y mmddYYYY
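
The -j and -v flags above are BSD/macOS date; GNU date expresses the same "one billion seconds later" arithmetic with -d. A sanity check from the epoch (assuming GNU date):

```shell
# The epoch plus 10^9 seconds lands on 2001-09-09 (the "billennium").
date -u -d '1970-01-01 00:00:00 UTC + 1000000000 seconds' +%Y-%m-%d
```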
:for i in range(1,255) | .put='192.168.0.'.i | endfor
:%s/^/\=line('.').' '
2009-08-16 17:48:03
User: voyeg3r

Use this command to insert line numbers in source files; the .' ' part controls how many spaces are inserted after the number.
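
For comparison, the same numbering is available outside vim; awk's NR is the current line number:

```shell
# Prefix every input line with its line number and a space, mirroring
# the vim substitution above.
printf 'alpha\nbeta\n' | awk '{ print NR" "$0 }'
```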