What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are separate Twitter accounts carrying only commands that receive at least 3 or 10 votes, so only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the three Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try to put an end to the spamming, new commands now require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.




Terminal - All commands - 11,585 results
alias oath='temp=$(pbpaste) && oathtool --base32 --totp "YOUR SEED HERE" | pbcopy && sleep 3 && echo -n $temp | pbcopy'
2014-03-14 19:21:18
Functions: alias echo sleep

Typing a word in terminal is easier than digging your phone out, opening your two-factor authentication app and typing the code in manually.

This alias copies the one-time code to your clipboard for 3 seconds (long enough to paste it into a web form), then restores whatever was on the clipboard beforehand.

This command works on macOS. On other systems, replace pbpaste/pbcopy with your platform's clipboard tools (e.g. xclip or xsel on Linux).

curl btc.cm/last
for gz in `find . -type f -name '*.gz' -print`; do f=`basename $gz .gz` && d=`dirname $gz` && echo -n `ls -s $gz` "... " && gunzip -c $gz | bzip2 - -c > $d/$f.bz2 && rm -f $gz && echo `ls -s $d/$f.bz2`; done
2014-03-13 08:36:24
User: pdwalker
Functions: bzip2 echo gunzip rm

- recompresses all gz files to bz2 files from this point and below in the directory tree

- output shows the size of the original file, and the size of the new file. Useful.

- conceptually easier to understand than playing tricks with awk and sed.

- Don't want the output? Use the following line instead:

for gz in `find . -type f -name '*.gz' -print`; do f=`basename $gz .gz` && d=`dirname $gz` && gunzip -c $gz | bzip2 - -c > $d/$f.bz2 && rm -f $gz ; done
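The backtick loop above breaks on paths containing spaces. A whitespace-safe sketch of the same recompression, using NUL-delimited find output (assumes bash and GNU find):

```shell
# Recompress every .gz under the current directory to .bz2,
# handling filenames with spaces via find -print0 / read -d ''.
find . -type f -name '*.gz' -print0 | while IFS= read -r -d '' gz; do
    f=${gz%.gz}                                   # strip the .gz suffix
    gunzip -c "$gz" | bzip2 -c > "$f.bz2" && rm -f "$gz"
done
```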
tar -cJf myarchive.tar.xz /path/to/archive/
2014-03-13 03:34:18
User: Sepero
Functions: tar

Compress files or a directory to the xz format. xz usually achieves better and faster compression than bzip2. A tar.xz archive also preserves file permissions and other metadata, which the plain 7zip format does not (tar records the metadata; xz just compresses the tarball).
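A quick round trip on throwaway paths (hypothetical /tmp locations, requires the xz tool) to show the archive is built and readable:

```shell
# Build a small tree, archive it with xz compression, then list
# the archive to confirm the file made it in.
mkdir -p /tmp/xz_demo && echo hello > /tmp/xz_demo/file.txt
tar -cJf /tmp/xz_demo.tar.xz -C /tmp xz_demo
tar -tJf /tmp/xz_demo.tar.xz      # lists xz_demo/ and xz_demo/file.txt
```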

2014-03-12 18:00:21
User: pdxdoughnut
Functions: ls xargs

xargs will automatically determine how many args are too many and only pass a reasonable number of them at a time. In the example, 500,002 file names were split across 26 instantiations of the command "echo".

ls | grep ".txt$" | xargs -i WHATEVER_COMMAND {}
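The batching behaviour is easy to see on a smaller scale; here -n caps the batch size explicitly so the count is predictable:

```shell
# 100 arguments, batched 10 per invocation: echo runs 10 times,
# producing 10 output lines.
seq 1 100 | xargs -n 10 echo | wc -l
```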
/usr/bin/lynx -dump -width 500 | grep GET | awk '{print $12 $14}' | sort | uniq -c | sort -rn | head
2014-03-12 13:24:40
User: copocaneta
Functions: awk grep sort uniq

List the busiest scripts/files running on a cPanel server with domain showing (column $12).

netstat -tn 2>/dev/null | grep ':80 ' | awk '{print $5}' |sed -e 's/::ffff://' | cut -f1 -d: | sort | uniq -c | sort -rn | head
2014-03-12 12:43:07
User: copocaneta
Functions: awk cut grep netstat sed sort uniq

IP addresses and number of connections connected to port 80.
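The counting idiom at the heart of this pipeline is sort | uniq -c | sort -rn; shown here on canned addresses rather than live netstat output:

```shell
# Count duplicate lines and list the most frequent first:
# 1.2.3.4 appears 3 times, 5.6.7.8 twice.
printf '1.2.3.4\n5.6.7.8\n1.2.3.4\n1.2.3.4\n5.6.7.8\n' |
    sort | uniq -c | sort -rn
```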

/usr/bin/lynx -dump -width 500 | awk 'BEGIN { FS = " " } ; { print $12 }' | sed '/^$/d' | sort | uniq -c | sort -n
/usr/bin/lynx -dump -width 500 | grep GET | awk '{print $12}' | sort | uniq -c | sort -rn | head
2014-03-12 12:31:34
User: copocaneta
Functions: awk grep sort uniq

Easiest way to obtain the busiest website list (sorted by the number of running processes).

find . -name "*.txt" -exec WHATEVER_COMMAND {} \;
num_errs=`grep ERROR /var/log/syslog | tee >(cat >&2) | wc -l`
2014-03-12 00:04:24
Functions: cat tee wc

Many circumstances call for creating a variable from a summary result while still printing the original pipe. Inserting "tee >(cat >&2)" lets the command output still be printed while the same output is processed into a variable.
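A minimal, reproducible sketch of the same trick on canned input (process substitution requires bash):

```shell
# Count lines into a variable while still echoing them to stderr.
num_lines=$(printf 'one\ntwo\nthree\n' | tee >(cat >&2) | wc -l)
echo "lines: $num_lines"
```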

lsof|gawk '$4~/txt/{next};/REG.*\(deleted\)$/{printf ">/proc/%s/fd/%d\n", $2,$4}'
2014-03-11 10:40:32
User: wejn
Functions: gawk
Tags: awk lsof gawk

While the posted solution works, I'm a bit uneasy about the "%d" part. This would be hyper-correct approach:

lsof|gawk '$4~/txt/{next};/REG.*\(deleted\)$/{sub(/.$/,"",$4);printf ">/proc/%s/fd/%s\n", $2,$4}'

Oh, and you gotta pipe the result to sh if you want it to actually trim the files. ;)

Btw, this approach also removes false negatives (OP's command skips any deleted files with "txt" in their name).
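What the sub(/.$/,"",$4) actually does, shown on a single canned lsof-style line (hypothetical fields; plain awk suffices here):

```shell
# The fd field ("4u") carries a trailing mode letter; sub() strips it
# so printf can build a valid /proc/<pid>/fd/<fd> path.
echo 'proc 1234 user 4u REG 8,1 0 12 /tmp/x (deleted)' |
    awk '/REG.*\(deleted\)$/{sub(/.$/,"",$4); printf ">/proc/%s/fd/%s\n", $2, $4}'
# prints >/proc/1234/fd/4
```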

lsof | grep -i deleted | grep REG | grep -v txt | ruby -r 'pp' -e 'STDIN.each do |v| a = v.split(/ +/); puts `:> /proc/#{a[1]}/fd/#{a[3].chop}`; end'
2014-03-11 06:02:09
User: jim80net
Functions: grep

Be careful, first run:

lsof | grep -i deleted | grep REG | grep -v txt

Then, give it the boot!

killall conky
find . \( -iname "*.doc" -o -iname "*.docx" \) -type f -exec ls -l --full-time {} +|sort -k 6,7
ffmpeg -i $video -c:v prores -profile:v 2 -c:a copy ${video}.mov
cat skype_log | sed -s 's/\(\[.*\]\) \(.*\): \(.*\)/<\2> \3/'
read -p "Please enter the 4chan url: "|egrep '//i.4cdn.org/[a-z0-9]+/src/([0-9]*).(jpg|png|gif)' - -o|nl -s https:|cut -c7-|uniq|wget -nc -i - --random-wait
trash-put junkfolder
2014-03-09 00:24:09
User: Sepero
Tags: trash

apt-get install trash-cli

Command-line program that allows you to put folders or files in the standard KDE/Unity desktop trash.

inotifywait -mr -e CREATE $HOME/bin/ | while read i; do chmod +x $(echo "$i" | sed 's/ \S* //'); done
/opt/homebrew-cask/Caskroom/vlc/2.1.0/VLC.app/Contents/MacOS/VLC --sout-avcodec-strict=-2 -I dummy $video :sout="#transcode{vcodec=h264,vb=1024,acodec=mpga,ab=256,scale=1,channels=2,audio-sync}:std{access=file,mux=mp4,dst=${video}.m4v}" vlc://quit
2014-03-08 13:53:13
User: brainstorm

Transcodes mpeg2 files that cannot be read by iMovie into m4v files.

sudo lshw -html > /tmp/hardware.html && xdg-open /tmp/hardware.html
2014-03-08 10:40:21
User: Sadi
Functions: sudo

Entering this command as root may give more complete results, and writing to a temporary file that is opened immediately makes the report easy to view in a browser.

grep -r --include=*.php "something" /foo/bar
2014-03-07 13:26:12
User: avpod2
Functions: grep

Actually, grep can do a recursive search filtered by file extension.
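A quick check of the --include filter with throwaway files (GNU grep, hypothetical /tmp paths):

```shell
# Only the .php file should match; b.txt is skipped because
# --include filters filenames before grep reads them.
mkdir -p /tmp/grep_demo
echo 'something here' > /tmp/grep_demo/a.php
echo 'something here' > /tmp/grep_demo/b.txt
grep -r --include='*.php' "something" /tmp/grep_demo
# prints /tmp/grep_demo/a.php:something here
```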

ffmpeg -re -i localFile.mp4 -c copy -f flv rtmp://server/live/streamName
2014-03-07 05:52:11
User: unni9946

This command streams a local video file as live video to a streaming server such as Wowza or Red5.