What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

Get involved!

You can sign-in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.

Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are separate Twitter accounts for commands that get at least 3 votes and at least 10 votes, so that only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).

May 19, 2015 - A Look At The New Commandlinefu
I've put together a short writeup on what kind of newness you can expect from the next iteration of clfu. Check it out here.
March 2, 2015 - New Management
I'm Jon, I'll be maintaining and improving clfu. Thanks to David for building such a great resource!

Psst. Open beta.

Wow, didn't really expect you to read this far down. The latest iteration of the site is in open beta. It's a gentle open beta; it's not in prime time just yet. It's being hosted over at UpGuard (link) and you are more than welcome to give it a shot. A couple of things:

  • » The open beta is running a copy of the database that will not carry over to the final version. Don't post anything you don't mind losing.
  • » If you wish to use your user account, you will probably need to reset your password.
Your feedback is appreciated via the form on the beta page. Thanks! -Jon & CLFU Team

Terminal - All commands - 12,362 results
cat <( command1 arg arg ) <( command2 arg ) ...
2009-03-07 04:33:12
User: Pistos
Functions: cat

Concatenate the stdout of multiple commands.
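
For example, a minimal sketch (date and uname are just placeholder commands here; any programs that write to stdout will do):

cat <( date ) <( uname -a )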

ls -1 *.part1.rar | xargs -d '\n' -L 1 unrar e
find /usr/lib -maxdepth 1 -type l -print0 | xargs -r0 du -Lh
2009-03-07 00:17:45
User: starchox
Functions: du find xargs

You can also sum the disk usage of all the files:

find /usr/lib -maxdepth 1 -type l -print0 | xargs -r0 du -Lch
INFILE=/path/to/your/backup.img; MOUNTPT=/mnt/foo; PARTITION=1; mount "$INFILE" "$MOUNTPT" -o loop,offset=$[ `/sbin/sfdisk -d "$INFILE" | grep "start=" | head -n $PARTITION | tail -n1 | sed 's/.*start=[ ]*//' | sed 's/,.*//'` * 512 ]

Suppose you made a backup of your hard disk with dd:

dd if=/dev/sda of=/mnt/disk/backup.img

This command enables you to mount a partition from inside this image, so you can access your files directly.

Set PARTITION=1 to the number of the partition you want to mount (as listed by sfdisk -d yourfile.img).
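
As a worked sketch (the start sector 2048 is just an assumed example), sfdisk -d backup.img prints a line such as "start= 2048" for the first partition, so the offset passed to mount is 2048 * 512 = 1048576 bytes:

mount /path/to/your/backup.img /mnt/foo -o loop,offset=1048576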

ttmkfdir; mkfontdir; fc-cache /usr/share/fonts/miscttf
2009-03-06 21:28:17
User: starchox
Functions: fc-cache

First, create a directory on your system where the fonts will be stored, and copy the fonts into it.

sudo mkdir /usr/share/fonts/miscttf; sudo cp *.ttf /usr/share/fonts/miscttf

Afterwards, rebuild the font cache with the command shown at the top of this entry.
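
Putting it all together, a minimal sketch of the whole workflow (assuming the .ttf files sit in the current directory and /usr/share/fonts/miscttf is the target; ttmkfdir and mkfontdir are run inside the font directory so they write fonts.scale and fonts.dir there):

sudo mkdir -p /usr/share/fonts/miscttf; sudo cp *.ttf /usr/share/fonts/miscttf; cd /usr/share/fonts/miscttf && sudo ttmkfdir && sudo mkfontdir && sudo fc-cache /usr/share/fonts/miscttf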

ps -o %mem= -C firefox-bin | sed -s 's/\..*/%/'
for x in `find /path/ -type d | cut -b bytesoffoldername-`; do mkdir -p newpath/$x; done
ls -ltcrh
tcptraceroute www.google.com
some_cronjobed_script.sh 2>&1 | tee -a output.log | grep -C 1000 ERROR
2009-03-06 17:51:13
User: DEinspanjer
Functions: grep tee
Tags: Linux

The large context number (-C 1000) is a bit of a hack, but in most of my use cases, it makes sure I'll see the whole log output.

cat $(ls -tr | tail -1) | awk '{ a[$1] += 1; } END { for(i in a) printf("%d, %s\n", a[i], i ); }' | sort -n | tail -25
2009-03-06 17:50:29
User: oremj
Functions: awk cat ls sort tail

This command is much quicker than the alternative of "sort | uniq -c | sort -n".
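
For comparison, the slower pipeline mentioned above would look roughly like this (it produces the same counts, just formatted by uniq -c rather than printf):

awk '{ print $1 }' "$(ls -tr | tail -1)" | sort | uniq -c | sort -n | tail -25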

du -sh *
tail -f *[!.1][!.gz]
2009-03-06 16:24:44
User: piscue
Functions: tail

Using negated character classes in bash globs, you can tail the current log files (skipping rotated ".1" and compressed ".gz" copies) to see what is happening: any errors, info, warnings, and so on.

hdiutil makehybrid -udf -udf-volume-name DVD_NAME -o MY_DVD.iso /path/
2009-03-06 15:45:59
User: occam
Tags: Os X macosx

/path/ is the root folder of the DVD, not the VIDEO_TS folder.
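
For instance, with a hypothetical layout where ~/dvd_root/VIDEO_TS/ holds the .VOB and .IFO files, the command is pointed at the parent directory:

hdiutil makehybrid -udf -udf-volume-name MY_DVD -o MY_DVD.iso ~/dvd_root/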

echo 1 2 3 > FILE; while read -a line; do echo ${line[2]}; done < FILE
2009-03-06 15:32:40
User: occam
Functions: echo read
Tags: bash

This will print out the third column of every line in FILE. Useful for many files in /proc or for CSV-style data.
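
For comma-separated data you also need to adjust IFS; a minimal sketch, assuming a hypothetical data.csv, that prints the third column of each row:

while IFS=, read -a fields; do echo "${fields[2]}"; done < data.csv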

vifind() { vi `find . -name "$1"`; }
cd() { builtin cd "${@:-$HOME}" && ls; }
2009-03-05 22:37:35
User: haivu
Functions: cd

Often, the very next command after cd is 'ls', so why not combine them? Tested on a Red Hat derivative and Mac OS X Leopard.

Update: changed ${1:-$HOME} to "${@:-$HOME}" to accommodate directories with spaces in their names.
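
Usage sketch ("My Documents" is just a hypothetical directory name containing a space):

cd "My Documents"

The function changes into the directory and immediately lists it; calling cd with no argument drops back to $HOME and lists that instead.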

find . -type d \( -name DIR1 -o -name DIR2 \) -prune -o -type f -print0 | xargs -r0 md5sum
2009-03-05 21:26:24
User: starchox
Functions: find xargs
Tags: bash

Useful if you want the md5sum of all files but want to exclude some directories. If your list of files is short, you can do it in one command as follows:

find . -type d \( -name DIR1 -o -name DIR2 \) -prune -o -type f -exec md5sum {} \;

Alternatively you can specify a different command to be executed on the resulting files.
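
For example, swapping in sha1sum (just one possible choice) for md5sum:

find . -type d \( -name DIR1 -o -name DIR2 \) -prune -o -type f -print0 | xargs -r0 sha1sum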

alias lh='ls -a | egrep "^\."'
/usr/proc/bin/pfiles $PID
2009-03-05 17:26:57
User: axelabs

Report fstat(2) and fcntl(2) information for all open files in each process.

SUM=0; for FILESIZE in `find /tmp -type f -iname \*pdf -exec du -b {} \; 2>/dev/null | cut -f1` ; do (( SUM += $FILESIZE )) ; done ; echo "sum=$SUM"
2009-03-05 17:16:52
User: alcik
Functions: cut du echo
Tags: find du

This example sums the sizes of all PDF files in the /tmp directory and its subdirectories (in bytes).

Replace "/tmp" with directory path of your choice and "\*pdf" or even "-iname \*pdf" with your own pattern to match specific type of files. You can replace also parameter for du to count kilo or megabytes, but because of du rounding the sum will not be correct (especially with lot of small files and megabytes counting).

In some cases you could probably use something like this:

du -cb `find /tmp -type f -iname \*pdf`|tail -n 1

But be aware that this second command CANNOT handle files with spaces in their names, and it will mislead you if some files matching the pattern are not readable by you. The first one-liner is resistant to such problems (it will not count the sizes of files you cannot read, but it will give you the correct sum of the rest).
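
If you have GNU du, a sketch that avoids the word-splitting problem by feeding NUL-separated names on stdin (this assumes GNU coreutils; --files0-from is not portable):

find /tmp -type f -iname '*.pdf' -print0 | du -cb --files0-from=- | tail -n 1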

touch /tmp/$$;for N in `seq -w 0 7777|grep -v [89]`; do chmod $N /tmp/$$; P=`ls -l /tmp/$$ | awk '{print $1}'`; echo $N $P; done;rm /tmp/$$
grep 'HOME.*' data.txt | awk '{print $2}' | awk '{FS="/"}{print $NF}' OR USE ALTERNATE WAY awk '/HOME/ {print $2}' data.txt | awk -F'/' '{print $NF}'
2009-03-05 07:28:26
User: rommelsharma
Functions: awk grep

grep 'HOME.*' data.txt | awk '{print $2}' | awk '{FS="/"}{print $NF}'


awk '/HOME/ {print $2}' data.txt | awk -F'/' '{print $NF}'

In this example, we have a text file containing several entries like:


c1 c2 c3 c4

this is some data

HOME /dir1/dir2/.../dirN/somefile1.xml

HOME /dir1/dir2/somefile2.xml

some more data


For lines starting with HOME, we extract the second field, which is a file path including the file name, and from that we keep only the file name, ignoring the slash-delimited path.

The output would be:

somefile1.xml

somefile2.xml

(In case you vote this down, please give your reasons as well and enlighten the souls :-) )
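
A single awk invocation can also do the same job; a minimal sketch, assuming the same data.txt layout as above:

awk -F'/' '/^HOME/ {print $NF}' data.txt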

mogrify -resize 800\> *