What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are separate Twitter accounts for commands that reach a minimum of 3 and 10 votes respectively - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try to put an end to the spamming, new commands now require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.




Terminal - All commands - 11,591 results
(mountpoint -q "/media/mpdr1" && df /media/mpdr1/* > /dev/null 2>&1) || ((sudo umount "/media/mpdr1" > /dev/null 2>&1 || true) && (sudo mkdir "/media/mpdr1" > /dev/null 2>&1 || true) && sudo mount "/dev/sdd1" "/media/mpdr1")
2014-04-12 11:23:21
User: tweet78
Functions: df mkdir mount sudo umount

In my example, the mount point is /media/mpdr1 and the filesystem is /dev/sdd1.


Why this command?

With some external devices I used to face an issue: during data transfer from the device to the internal drive, errors would occur and the device would be unmounted, then remounted again in a different folder.

In that situation the mountpoint command still gave a positive result even though the filesystem wasn't properly mounted; that's why I added the df part.

If the device is not properly mounted, the command tries to unmount it, to create the folder (the || true means it still works if the folder already exists), and finally to mount the filesystem on the given mount point.
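The one-liner reads more easily as a function. This is only a sketch: the name ensure_mounted and the argument order are my own choices, not part of the original command.

```shell
# Sketch of the one-liner above; ensure_mounted is a hypothetical name.
ensure_mounted() {   # usage: ensure_mounted DEVICE MOUNTPOINT
    dev=$1 mp=$2
    if mountpoint -q "$mp" && df "$mp"/* >/dev/null 2>&1; then
        return 0                                  # already properly mounted
    fi
    sudo umount "$mp" >/dev/null 2>&1 || true     # drop a half-dead mount, if any
    sudo mkdir "$mp" >/dev/null 2>&1 || true      # ignore "already exists"
    sudo mount "$dev" "$mp"
}
```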

parallel convert {} {.}.png ::: *.svg
2014-04-12 06:39:02

Use GNU Parallel: short, easy to read, and will run one job per core.

sudo tcpdump -i wlan0 -n ip | awk '{ print gensub(/(.*)\..*/,"\\1","g",$3), $4, gensub(/(.*)\..*/,"\\1","g",$5) }' | awk -F " > " '{print $1"\n"$2}'
for w in [WORT1] [WORTn]; do wget -O $w.mp3 $(wget -O - "http://www.duden.de/rechtschreibung/$w" | grep -o "http://www.duden.de/_media_/audio/[^\.]*\.mp3"); done
find . -name \*.svg -print0 | xargs -0 -n1 -P4 -I{} bash -c 'X={}; convert "$X" "${X%.svg}.png"'
2014-04-11 14:30:30
User: flatcap
Functions: bash find xargs

Convert some SVG files into PNG using ImageMagick's convert command.

Run the conversions in parallel to save time.

This is safer than robinro's forkbomb approach :-)

xargs runs four processes at a time (-P4).

echo thisIsATest | sed -r 's/([A-Z])/_\L\1/g'
2014-04-11 13:36:08
User: flatcap
Functions: echo sed
Tags: sed

Convert a camelCase string into snake_case.

To complement senorpedro's command.
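The two directions make a handy round trip; both rely on GNU sed's \L and \U case-conversion extensions:

```shell
# camelCase -> snake_case, and back (GNU sed \L / \U extensions)
echo thisIsATest    | sed -r 's/([A-Z])/_\L\1/g'    # this_is_a_test
echo this_is_a_test | sed -r 's/_([a-z])/\U\1/g'    # thisIsATest
```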

jkhgkjh; until [[ $? -eq 0 ]]; do YOURCOMMAND; done
2014-04-11 08:19:15
User: moiefu

You want bash to keep running a command until it succeeds (until the exit code is 0). Start with a dummy command to set the exit code to 1, then the loop keeps running your command until it exits cleanly.
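Note that `until` can test the command directly, so the dummy command isn't strictly needed: `until YOURCOMMAND; do :; done`. A self-contained illustration, where mycmd is a stand-in command that fails twice before succeeding:

```shell
tries=0
mycmd() { tries=$((tries+1)); [ "$tries" -ge 3 ]; }   # stand-in: fails twice, then succeeds
until mycmd; do :; done                               # loops until mycmd exits 0
echo "succeeded after $tries tries"                   # succeeded after 3 tries
```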

< /dev/urandom tr -dc _A-Z-a-z-0-9 | head -c${1:-16};echo;
2014-04-07 10:07:22
User: opexxx
Functions: head tr

This snippet produces a 16-character password from letters, digits, underscore, and hyphen. The ${1:-16} lets a surrounding function or script take the length as its first argument, defaulting to 16.
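Wrapped in a function (genpass is an illustrative name, not from the original), the length becomes an optional argument:

```shell
# genpass is a hypothetical wrapper; the character set matches the snippet above.
genpass() { < /dev/urandom tr -dc '_A-Za-z0-9-' | head -c "${1:-16}"; echo; }
genpass       # 16-character password
genpass 32    # 32-character password
```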

find -type f -exec grep -q "regexp" {} \; -delete
2014-04-06 19:06:50
User: gumnos
Functions: find grep
Tags: find grep

Deletes files in the current directory or its subdirectories that match "regexp", but handles directories, newlines, spaces, and other funky characters better than the original #13315. It also uses grep's "-q" to be quiet and quit at the first match, making this much faster. No need for awk either.

grep -Rl "pattern" files_or_dir
2014-04-06 18:18:07
User: N1nsun
Functions: grep
Tags: awk find grep

Grep can search files and directories recursively. Using the -Z option and xargs -0 you can get all results on one line with escaped spaces, suitable for other commands like rm.
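For example (the /tmp/demo paths are made up for the demonstration):

```shell
# Delete every file under /tmp/demo that contains "pattern"
mkdir -p /tmp/demo
printf 'contains pattern\n' > /tmp/demo/hit.txt
printf 'clean\n'            > /tmp/demo/miss.txt
grep -RlZ "pattern" /tmp/demo | xargs -0 rm --    # removes hit.txt, keeps miss.txt
```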

find . | xargs grep -l "FOOBAR" | awk '{print "rm -f "$1}' > doit.sh
2014-04-06 15:48:41
User: sergeylukin
Functions: awk find grep xargs
Tags: awk find grep

After this command you can review doit.sh file before executing it.

If it looks good, execute: `. doit.sh`

<CTRL+Z>; fg
2014-04-06 14:21:08
User: kbrotheridge

Saves opening another console terminal (e.g. CTRL+ALT+F[n]) or another remote terminal.

Ctrl+Z pauses the current task and pushes it to the background, leaving you with a command prompt for those "Oh crap, I forgot to change xyz before I ran that, and it'll take forever if I Ctrl+C and start again..." situations. Typing 'fg' (shorthand for foreground; that's how I remember it) resumes the paused task.

echo $(sudo lshw -businfo | grep -B 1 -m 1 $(df "/path/to/file" | tail -1 | awk '{print $1}' | cut -c 6-8) | head -n 1 | awk '{print $1}' | cut -c 5- | tr ":" "-") | sudo tee /sys/bus/usb/drivers/usb/unbind
2014-04-06 12:06:29
User: tweet78
Functions: awk cut df echo grep head sudo tail tee tr

You have an external USB drive or key.

Apply this command (using the path of any file on your device) and it will simulate unplugging that device.

If you just want the port, type:

echo $(sudo lshw -businfo | grep -B 1 -m 1 $(df "/path/to/file" | tail -1 | awk '{print $1}' | cut -c 6-8) | head -n 1 | awk '{print $1}' | cut -c 5- | tr ":" "-")

VAR=$(head -5)
2014-04-05 13:45:18
User: rodolfoap
Functions: head
Tags: read stdin head,

Reads n lines from stdin and puts the contents in a variable. Yes, I know the read command and its options, but I find this more logical, even for a single line.
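For example (note that assigning inside a pipeline would happen in a subshell, where the value is lost):

```shell
VAR=$(printf 'a\nb\nc\nd\ne\nf\n' | head -5)   # first five lines of stdin
echo "$VAR"                                    # a b c d e, one per line; "f" is discarded
```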

command_line 2>&1 | tee -a output_file
for i in {1..256};do p=" $i";echo -e "${p: -3} \\0$(($i/64*100+$i%64/8*10+$i%8))";done|cat -t|column -c120
2014-04-04 16:54:53
User: AskApache
Functions: cat column echo

Prints out an ASCII chart using builtin bash! Then formats it using cat -t and column.

The best part is:

echo -e "${p: -3} \\0$(( $i/64*100 + $i%64/8*10 + $i%8 ))";

From: http://www.askapache.com/linux/ascii-codes-and-reference.html
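The arithmetic simply spells out the octal digits of $i (hundreds = i/64, tens = (i%64)/8, ones = i%8), which the \0NNN escape then prints as a character. Checking one value:

```shell
i=65                                      # decimal 65 is "A"
oct=$(( i/64*100 + i%64/8*10 + i%8 ))     # -> 101, the octal representation of 65
printf "\\$oct\n"                         # prints: A
```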

echo "this_is_a_test" | sed -r 's/_([a-z])/\U\1/g'
C:\> shutdown /f /r /t 0
2014-04-02 22:35:00
User: mpb
Functions: shutdown

Today, I needed to reboot a Windoze machine on another continent which had no shutdown or restart options via "Start" in the remote desktop (the only options available were: "Logoff, Disconnect, or Lock").

Fortunately, I found how to shut down and restart from the command line: /f forces running applications to close, /r restarts after shutdown, and /t 0 sets the delay to zero seconds.

hgrep() { ... } longer than 255 characters, see below
2014-04-02 16:40:36
User: Xk2c



hgrep() {
    if [[ ${#} -eq 0 ]]
    then
        printf "usage:\nhgrep [--nonum | -N | -n | --all-nonum | -an | -na] STRING\n"
        return 1
    fi

    while [[ ${#} -gt 0 ]]
    do
        case ${1} in
            --nonum | -N | -n | --all-nonum | -an | -na)
                # strip the leading history numbers before grepping
                builtin history | sed 's/^[[:blank:]]\+[[:digit:]]\{1,5\}[[:blank:]]\{2\}//' | grep -iE "(${*:2})"
                return
                ;;
            *)
                builtin history | grep -iE "(${*})"
                return
                ;;
        esac
    done
}


'hgrep -n' gives you full grep support, e.g. searching for the _beginning_ of specific commands; see the example output.

rename 's/\.sh//' ./*
2014-04-02 16:33:25
User: abhikeny
Functions: rename

The 'rename' command with 's/\.sh//' as its first argument removes the .sh extension from the matched filenames; adapt the expression for other extensions.

function hg () { history | grep $* ; } # define a function combining history and grep to save typing :-)
2014-04-02 15:17:31
User: mpb
Functions: grep

By defining a function "hg" as shown here, it saves me typing "history | grep" every time I need to search my shell history because now I only have to type "hg".

A nifty time saver :-)

You can also add the "hg" function definition to your .bashrc so it is defined each time you login.

(prefix="10.59.21" && for i in `seq 254`; do (sleep 0.5 && ping -c1 -w1 $prefix.$i &> /dev/null && arp -n | awk ' /'$prefix'.'$i' / { print $1 " " $3 } ') & done; wait)
2014-04-02 11:20:57
User: smoky
Functions: arp awk ping sleep
Tags: ping

Waits for all pings to complete, then prints each responding IP with its MAC address.

diff -qr /dirA /dirB
rsync -avz --dry-run /somewhere/source_directory /somewhereelse/target_directory
2014-04-01 20:55:59
User: tsener
Functions: rsync
Tags: diff rsync

--dry-run only shows which files would be synced with rsync, without transferring anything.

-z compresses file data during transfer

-v verbose output

-a archive mode: preserves permissions, ownership, timestamps, etc.

for output in $(find . ! -name movie.nfo -name "*.nfo") ; do rm $output ; done
2014-04-01 17:41:50
User: analbeard
Functions: find rm

Finds all .nfo files except movie.nfo and deletes them.
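The same thing can be done without the shell loop (which would mangle names containing spaces), assuming a find that supports -delete:

```shell
# Demo in a throwaway directory; the /tmp/nfo path is made up.
mkdir -p /tmp/nfo
touch /tmp/nfo/movie.nfo "/tmp/nfo/some film.nfo"
find /tmp/nfo ! -name movie.nfo -name "*.nfo" -delete   # keeps only movie.nfo
```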