What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts carrying only commands that get a minimum of 3 or 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …):

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try to put an end to the spamming, new commands now require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.




Terminal - All commands - 11,927 results
head -c20 /dev/urandom | xxd -ps
2013-07-16 10:14:21
User: opexxx
Functions: head
Tags: HEAD urandom

A 20-byte random value from /dev/urandom, printed as a 40-character hexadecimal string: a quick throwaway "password".
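If you want strictly alphanumeric characters rather than hex digits, a common variant (a sketch, not part of the original entry) filters the random stream through tr:

```shell
# 20-character password built only from letters and digits;
# LC_ALL=C keeps tr byte-oriented regardless of locale
LC_ALL=C tr -dc 'A-Za-z0-9' < /dev/urandom | head -c 20; echo
```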

tar -cvf bind9-config-`date +%s`.tar *
2014-10-29 05:15:15
User: Fuonum
Functions: tar

Backs up the current directory's files into a tar archive whose name includes the Unix timestamp of the backup.
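The same idea can be exercised in a scratch directory to confirm what went into the archive (the file name below is hypothetical):

```shell
# Create a timestamped archive and list its contents to verify it
cd "$(mktemp -d)"
printf 'zone data\n' > named.conf          # stand-in for real config files
tar -cvf "bind9-config-$(date +%s).tar" *
tar -tf bind9-config-*.tar                 # lists named.conf
```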

ssh -l <username> <server>
while [ 1 ]; do echo -e "220 ProFTPD 1.3.3c Server [ProFTPD] \nFAILED FTP ATTEMPT - PORT 21" | nc -vvv -l 21 >> /var/log/honeylog.log 2>> /var/log/honeylog.log; done
2013-07-16 19:05:37
User: xmuda
Functions: echo

[root@dhcppc1 windows]# cat /var/log/honeylog.log

Connection from port 21 [tcp/ftp] accepted

Connection from port 21 [tcp/ftp] accepted

[root@dhcppc1 windows]# nc 21

220 ProFTPD 1.3.3c Server [ProFTPD]


*Note: you cannot run this if an FTP server is already listening on port 21.

cleartool co -nc `cleartool ls -recurse | grep "hijacked" | sed s/\@\@.*// | xargs`
curl http://www.spam.la/?f=sender | grep secs| awk '{print; exit}' | osd_cat -i 40 -d 30 -l 2
2009-11-12 21:33:06
User: m33600
Functions: awk grep

I have a customer's Geovision DVR installed behind a closed proxy (only LogMeIn reaches it).

I have to check it for reliability, but LogMeIn hangs and is too slow a process.

I made the Geovision software send an e-mail every minute to the www.spam.la site.

All this script does is retrieve the e-mail header from spam.la (no login required!), filtering by sender and stopping at the first occurrence of the word "secs" (the age of the last e-mail).

The result is the age of the sender's last e-mail, displayed in a small overlay at the top of my screen once a minute.

I could refresh www.spam.la in a web browser, but I have other things to do.

I run it inside KAlarm (the KDE task scheduler) set to repeat every minute.

It can also be done without KAlarm, using watch outside the script.

Try it out now using my account, geo1 (replace "sender" with "geo1" in this script).

Needs curl and osd-bin.

echo $?
2011-07-27 15:34:20
User: lucasrangit
Functions: echo

I often find it useful to know what the exit status for a program was. This can be helpful when looking up errors by exit status or when scripting frequent commands.

Taken from http://www.faqs.org/docs/abs/HTML/exit-status.html
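A short sketch of branching on the exit status (the pattern and messages here are illustrative, not from the original entry):

```shell
# $? holds the exit status of the most recently executed command:
# 0 means success, non-zero means failure.
printf 'hello\n' | grep -q hello
echo "match: exit status $?"        # grep found the pattern, so status is 0

rc=0
grep -q hello /dev/null || rc=$?    # no match in an empty file
echo "no match: exit status $rc"    # grep reports 1 when nothing matches
```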

zenity --list --width 500 --height 500 --column 'Wallpapers' $(ls) | xargs xsetbg -center -smooth -fullscreen
2011-11-15 02:44:48
User: TheShadowFog
Functions: xargs

Assuming you have zenity installed, and assuming that you keep your backgrounds in ~/backgrounds, then this should work for you! :)

find . -maxdepth 2 -type d -name '.git' -print0 | while read -d ''; do (cd "$REPLY"; git gc); done
2012-11-07 08:38:33
User: unhammer
Functions: cd find read
Tags: git drivespace

Assumes you've cd'd to the folder in which all your git repos reside; you could run it from ~ without -maxdepth, although that might make find take quite a while longer.

If you have several processor cores, but not that much ram, you might want to run

git config --global pack.threads 1

first, since gc-ing can eat lots of ram.

ct mkelem -nc `find ./ -name "*" | xargs`
export PS1="C:\\>"; clear
2011-06-18 17:52:42
User: ThePCKid
Functions: export

Is there somebody who uses Windows a lot and keeps messing up your Linux machine? Press Ctrl+Alt+F1-F6 and run this command after logging into a text shell!

hl-nonprinting () { local C=$(printf '\033[0;36m') R=$(printf '\033[0m'); sed -e "s/\t/${C}▹&$R/g" -e "s/$/${C}⁋$R/";}
2012-11-07 09:55:48
User: unhammer
Functions: printf sed

I don't think it's possible to give a (background) colour to the tab itself, since a tab is, IIUC, simply a command to the terminal to move to the right. Nevertheless, this "highlighting" can be helpful when working with tab-separated files.
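A quick usage sketch of the same highlighter, with two portability tweaks (both are my assumptions, not part of the original): an underscore in the name, since hyphenated function names are not POSIX, and a literal tab captured via printf, since \t in sed patterns is GNU-specific:

```shell
# Highlight tabs with a cyan ▹ marker and line ends with a cyan ⁋ marker
hl_nonprinting () {
  local C=$(printf '\033[0;36m') R=$(printf '\033[0m') TAB=$(printf '\t')
  sed -e "s/${TAB}/${C}▹&${R}/g" -e "s/\$/${C}⁋${R}/"
}

# The tab between the two fields and the end of the line get marked
printf 'name\tvalue\n' | hl_nonprinting
```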

dmesg | grep -Po 'csum failed ino\S* \d+' | sort | uniq | xargs -n 3 find / -inum 2> /dev/null
2014-03-20 06:27:15
User: Sepero
Functions: dmesg find grep sort uniq xargs
Tags: find inode btrfs

Btrfs reports the inode numbers of files with failed checksums. Use `find` to look up the file names of those inodes.
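The find-by-inode half works on any filesystem, so it can be tried without btrfs (the temporary file name is generated on the fly):

```shell
# Look up a file's inode number, then locate the file by that number
f=$(mktemp)
ino=$(ls -i "$f" | awk '{print $1}')
find "$(dirname "$f")" -maxdepth 1 -inum "$ino" 2>/dev/null
```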

gst-launch v4l2src ! aasink
2009-02-18 22:17:53
User: rubenrua

Uses GStreamer to capture video from v4l2:///dev/video0 and display it as ASCII art.

find | egrep "\.(ade|adp|bat|chm|cmd|com|cpl|dll|exe|hta|ins|isp|jse|lib|mde|msc|msp|mst|pif|scr|sct|shb|sys|vb|vbe|vbs|vxd|wsc|wsf|wsh)$"
2010-11-23 16:53:55
User: poulter7
Functions: egrep find

Returns any file under the current folder which would be rejected by Gmail if you were to send a zipped version.

(Yes, you could just zip it and knock the extension off and put it back on the other side, but for some people this just isn't a solution)

echo -e "Berlin Date/Time is" `TZ=GMT-2 /bin/date \+%c`
hl-nonprinting () { local C=$(printf '\033[0;36m') B=$(printf '\033[0;46m') R=$(printf '\033[0m') np=$(env printf "\u00A0\uFEFF"); sed -e "s/\t/${C}▹&$R/g" -e "s/$/${C}⁋$R/" -e "s/[$np]/${B}& $R/g";}
2012-11-07 10:09:40
User: unhammer
Functions: env printf sed

Can't see it here, but the non-breaking space is highlighted :)

Of course,

cat -t -e

achieves something similar, but less colourful.

Could add more code points from https://en.wikipedia.org/wiki/Space_%28punctuation%29#Spaces_in_Unicode

perl -i -ne 'print uc $_' $1
find * ! -name abc | xargs rm
cat $HISTFILE | grep command
find -type f -exec ffmpeg -i "{}" "{}".mp3 \;
svcadm disable cde-login
lpr file
2011-05-05 21:44:43
User: hutch
Functions: lpr

Saves you an

open file

and CTRL+P

So simple and time-saving.

svn merge -r 1337:1336 PATH PATH
2009-02-06 00:48:17
User: troelskn
Functions: merge

Reverts the changes that were made in a particular revision, in the local working copy. You must commit the local copy to the repository to make it permanent.

This is very useful for undoing a change.

You can revert multiple changes by specifying revision numbers further apart; just remember to put the higher number first.

S='<iframe src=\"http:\/\/\/bad\/index.php\" width=\"1\" height=\"1\" frameborder=\"0\"><\/iframe>' && R=''; find . -name "*.html" -exec grep -l "$S" {} \; | xargs sed -i -e "s/$S/$R/g"
2010-04-12 21:45:16
User: rexington
Functions: find grep sed xargs

Removes the given string from all files under the given path - in this case the path given is "." This demonstrates the characters that must be escaped for the grep and sed commands to do their work correctly. Very handy for fixing hacked html files.