What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try and put an end to the spamming, new commands require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.




Terminal - All commands - 11,926 results
apropos keyword
egrep -o '[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}' file.txt
find <directory path> -mtime +365 -and -not -type d -delete
git grep -l "your grep string" | xargs gedit
find ./ -type f -exec sed -i 's/\t/ /g' {} \;
grep -Pl "\t" -r . | grep -v ".svn" | xargs sed -i 's/\t/ /g'
2009-05-28 08:52:14
User: root
Functions: grep sed xargs

Note that this assumes the application is an SVN checkout and so we have to throw away all the .svn files before making the substitution.
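A scratch run of the substitution (the /tmp paths are just for illustration):

```shell
# Scratch demo of the tab-to-spaces substitution on a throwaway directory.
mkdir -p /tmp/tabdemo && cd /tmp/tabdemo
printf 'a\tb\n' > file.txt
# grep -Pl lists files that contain a tab; sed then replaces each tab with a space.
grep -Pl "\t" -r . | xargs sed -i 's/\t/ /g'
cat file.txt   # a b
```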

echo string | tr '[:lower:]' '[:upper:]'
sort -n <( for i in $(find . -maxdepth 1 -mindepth 1 -type d); do echo $(find $i | wc -l) ": $i"; done;)
sudo dd if=/dev/zero of=/swapfile bs=1024 count=1024000;sudo mkswap /swapfile; sudo swapon /swapfile
2009-05-27 21:10:50
User: dcabanis
Functions: dd mkswap sudo swapon

Create a temporary file that acts as swap space. In this example it's a 1GB file at the root of the file system. This additional capacity is added to the existing swap space.
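A scaled-down, unprivileged sketch of the same dd recipe (mkswap and swapon still need root, so this only shows how bs times count fixes the file size):

```shell
# Same dd invocation scaled to 1MB: bs=1024 bytes times count=1024 blocks.
dd if=/dev/zero of=/tmp/swapdemo bs=1024 count=1024 2>/dev/null
stat -c %s /tmp/swapdemo   # 1048576
```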

ruby -e "puts (1..20).map {rand(10 ** 10).to_s.rjust(10,'0')}"
2009-05-27 19:52:53
User: sil

There have been a few times I've needed to create random numbers. Although I've done so in Perl, I've found Ruby is actually faster. This one-liner generates 20 random 10-digit numbers (zero-padded, so each is exactly 10 digits), not a single random number. Replace 20 in (1..20) with the amount of random numbers you need generated.
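A rough bash equivalent for anyone without Ruby to hand (combining three $RANDOM draws is an assumption made here to cover 10 digits, since a single draw only reaches 32767):

```shell
# 20 zero-padded 10-digit random numbers in plain bash.
for i in $(seq 1 20); do
  # Each $RANDOM is 0-32767; three draws combined span well past 10^10.
  printf '%010d\n' $(( (RANDOM*32768*32768 + RANDOM*32768 + RANDOM) % 10000000000 ))
done
```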

find . -uid 0 -print0 | xargs -0 chown foo:foo
2009-05-27 19:52:13
User: abcde
Functions: chown find xargs

In the example, uid 0 is root. foo:foo are the user:group you want to make owner and group. '.' is the "current directory and below." -print0 and -0 indicate that filenames and directories "are terminated by a null character instead of by whitespace."
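A quick illustration of why -print0/-0 matters, with chown swapped for echo so no foo user or root-owned files are needed:

```shell
# Filenames with spaces survive the pipeline intact thanks to the null separator.
mkdir -p /tmp/nulldemo && cd /tmp/nulldemo
touch "file one" "file two"
find . -type f -print0 | xargs -0 -n1 echo found
```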

rar a -m5 -v5M -R myarchive.rar /home/
2009-05-27 15:53:18
User: piovisqui

a - add files to archive

-m5 - compression level, 0 = lowest compression ... 5 = maximum compression

-v5M - split the output into 5-megabyte volumes; change to 700 for a CD, or 4200 for a DVD

-R - recurse into directories; do not use it for single files

It's better to have the output of a compression already split than to use the 'split' command after compression, which would consume double the amount of disk space. Found at http://www.ubuntu-unleashed.com/2008/05/howto-create-split-rar-files-in-ubuntu.html
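If rar isn't installed, a hedged equivalent with stock tools gets the same benefit: pipe tar straight into split, so the unsplit archive never touches the disk (the 1M volume size and /tmp paths are illustrative):

```shell
# Compress and split in one pass; reassemble with cat to read the volumes back.
mkdir -p /tmp/splitdemo/data && cd /tmp/splitdemo
dd if=/dev/urandom of=data/blob bs=1024 count=2048 2>/dev/null
tar czf - data | split -b 1M - archive.tar.gz.part-
cat archive.tar.gz.part-* | tar tzf -   # data/  data/blob
```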

history -c
touch -amct [[CC]YY]MMDDhhmm[.ss] FILE
2009-05-27 14:33:22
User: sharfah
Functions: touch

-a for access time, -m for modification time, -c do not create any files, -t timestamp
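A quick check of the -t format (the /tmp path is illustrative; -c is dropped here because the file is created on purpose):

```shell
# Set an explicit timestamp (CCYYMMDDhhmm.ss), then read the date back.
touch -t 200905271433.22 /tmp/touchdemo
date -r /tmp/touchdemo +%F   # 2009-05-27
```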

(($RANDOM%6)) || echo 'hello world!'
2009-05-27 08:11:08
User: luishka
Functions: echo

Randomize the execution of the command echo 'hello world!' (it runs only when $RANDOM is divisible by 6, i.e. about one time in six).
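The mechanism, shown deterministically: (( expr )) exits non-zero only when expr evaluates to 0, so || runs the echo exactly when $RANDOM % 6 == 0:

```shell
(( 0 )) || echo 'hello world!'   # arithmetic is 0: echo fires
(( 5 )) || echo 'hello world!'   # non-zero: silent
```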

mysql -s -e "show processlist" |awk '{print $1}'
season=1; for file in $(ls) ; do dir=$(echo $file | sed "s/.*S0$season\(E[0-9]\{2\}\).*/\1/"); mkdir $dir ; mv $file $dir; done
2009-05-27 03:30:58
User: lonecat
Functions: echo file mkdir mv sed

It happened to me that I got a season of a TV show which had all the files under the same folder, like /home/blah/tv_show/season1/file{1,2,3,4,5,...}.avi

But I like to have them like this:


So I can have both the srt and the avi on one folder without cluttering much. This command organizes everything assuming that the filename contains Exx where xx is the number of the episode.

You may need to set:


if your filenames have spaces.
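A hypothetical safer variant that globs instead of parsing ls, so filenames with spaces need no IFS tweaking (the sample names and /tmp path are made up):

```shell
# Quoted expansions and a glob handle spaces; assumes names contain SxxEyy.
mkdir -p /tmp/seasondemo && cd /tmp/seasondemo
touch "My Show S01E01.avi" "My Show S01E01.srt" "My Show S01E02.avi"
for file in *; do
  dir=$(echo "$file" | sed 's/.*S[0-9]\{1,2\}\(E[0-9]\{2\}\).*/\1/')
  mkdir -p "$dir" && mv "$file" "$dir"/
done
ls   # E01  E02
```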

type -a command
svn log fileName|cut -d" " -f 1|grep -e "^r[0-9]\{1,\}$"|awk {'sub(/^r/,"",$1);print "svn cat fileName@"$1" > /tmp/fileName.r"$1'}|sh
2009-05-27 02:11:58
User: fizz
Functions: awk cut grep
Tags: bash svn awk grep

exported files will get a .r23 extension (where 23 is the revision number)
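svn isn't needed to see the parsing step; feeding a fabricated two-line log header through the same cut/grep/awk chain shows the commands it generates:

```shell
# Sample 'svn log' revision headers stand in for real repository output.
printf 'r23 | alice | 2009-05-27\nr22 | bob | 2009-05-26\n' |
  cut -d" " -f1 | grep -e "^r[0-9]\{1,\}$" |
  awk '{sub(/^r/,"",$1); print "svn cat fileName@"$1" > /tmp/fileName.r"$1}'
# svn cat fileName@23 > /tmp/fileName.r23
# svn cat fileName@22 > /tmp/fileName.r22
```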

last -n 20
2009-05-26 22:09:09
User: jipipayo
Functions: last

Change 20 to the number of sessions you want to see (20 is usually enough).

skill -KILL -t ttyS0
2009-05-26 21:47:33
User: jipipayo
Functions: skill

When you run a "w" or "who" command and see an orphan console session from some time ago, you can kill it with this command,

where ttyS0 is the console to kill.

lsof -nP +p 24073 | grep -i listen | awk '{print $1,$2,$7,$8,$9}'
kill_daemon() { echo "Daemon?"; read dm; kill -15 $(netstat -atulpe | grep "$dm" | cut -d '/' -f1 | awk '{print $9}'); }; alias kd='kill_daemon'
2009-05-26 20:39:56
User: P17

Just find out the daemon with $ netstat -atulpe. Then type in its name and it gets the SIGTERM.
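A safe way to watch the SIGTERM step is to stand up a throwaway background process instead of a real daemon:

```shell
# A background sleep stands in for the daemon, so nothing real gets killed.
sleep 300 &
pid=$!
kill -15 "$pid"
wait "$pid" 2>/dev/null || true   # reap it; exit status 143 (SIGTERM) is expected
kill -0 "$pid" 2>/dev/null || echo "process $pid is gone"
```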

cal -y | tr '\n' '|' | sed "s/^/ /;s/$/ /;s/ $(date +%e) / $(date +%e | sed 's/./#/g') /$(date +%m | sed s/^0//)" | tr '|' '\n'
ip route show dev eth0 | awk '{print $7}'
2009-05-26 20:29:54
User: P17
Functions: awk route
Tags: IP
ip address show | grep eth0 | sed '1d' | awk '{print $2}'

Does the same, but shows the network prefix.
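Since the live output differs per machine, a captured sample line of ip route show dev eth0 (made up here) shows which field the awk picks out:

```shell
# With 'dev eth0' given, the route line ends '... scope link src <address>',
# so the address sits in field 7.
echo '192.168.1.0/24 proto kernel scope link src 192.168.1.10' | awk '{print $7}'
# 192.168.1.10
```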