What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the three Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).

May 19, 2015 - A Look At The New Commandlinefu
I've put together a short writeup on what kind of newness you can expect from the next iteration of clfu. Check it out here.
March 2, 2015 - New Management
I'm Jon, I'll be maintaining and improving clfu. Thanks to David for building such a great resource!

Terminal - Commands tagged Linux - 237 results
( ( while [ 2000 -ge "$(free -m | awk '/buffers.cache:/ {print $4}')" ] || [ $(echo "$(uptime | awk '{print $10}' | sed -e 's/,$//' -e 's/,/./') >= $(grep -c ^processor /proc/cpuinfo)" | bc) -eq 1 ]; do sleep 10; done; my-command > output.txt ) & )
2010-07-13 09:12:11
User: michelsberg
Functions: echo sleep

[ 2000 -ge "$(free -m | awk '/buffers.cache:/ {print $4}')" ] returns true if less than 2000 MB of RAM are available, so adjust this number to your needs.

[ $(echo "$(uptime | awk '{print $10}' | sed -e 's/,$//' -e 's/,/./') >= $(grep -c ^processor /proc/cpuinfo)" | bc) -eq 1 ] returns true if the current machine load is at least equal to the number of CPUs.

If either test returns true, we wait 10 seconds and check again. If both tests return false, i.e. 2 GB are available and the machine load has fallen below the number of CPUs, we start our command and save its output in a text file.

The ( ( ... ) & ) construct lets the command run in background even if we log out. See http://www.commandlinefu.com/commands/view/3115/ .
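For readability, the same wait-then-run logic can be expanded into a few lines. This is only a sketch of the one-liner above: it makes the same assumptions about free's output, reads the 1-minute load from /proc/loadavg instead of parsing uptime, and keeps the 2000 MB threshold, my-command and output.txt as placeholders.

( (
  ncpu=$(grep -c ^processor /proc/cpuinfo)
  until [ "$(free -m | awk '/buffers.cache:/ {print $4}')" -gt 2000 ] &&
        [ "$(echo "$(cut -d' ' -f1 /proc/loadavg) < $ncpu" | bc)" -eq 1 ]; do
    sleep 10   # keep waiting: either less than ~2 GB is free or the load is still >= number of CPUs
  done
  my-command > output.txt
) & )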

objdump -d ./PROGRAM|grep '[0-9a-f]:'|grep -v 'file'|cut -f2 -d:|cut -f1-6 -d' '|tr -s ' '|tr '\t' ' '|sed 's/ $//g'|sed 's/ /\\x/g'|paste -d '' -s |sed 's/^/"/'|sed 's/$/"/g'
net rpc -I ADDRESS -U USERNAME%PASSWORD service {stop|start} SVCNAME
2010-06-29 17:12:36
User: f0rk

Control (stop, start, restart) a Windows Service from a Linux machine which has the `net` command (provided by samba).
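For example, stopping the print spooler on a remote Windows box might look like this (the address, credentials and service name are made up for illustration):

net rpc -I 192.168.0.10 -U Administrator%s3cret service stop Spooler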

tar -czf ../header.tar.gz $(find . -name *.h)
2010-06-27 23:44:48
Functions: find tar
Tags: Linux tar

This is a shortcut to tar up all files matching a wildcard, since tar doesn't have an --include option (apparently).
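Note that the unquoted *.h may be expanded by the shell before find sees it, and the command substitution splits on whitespace. A slightly more robust sketch, assuming GNU tar:

find . -name '*.h' -print0 | tar -czf ../header.tar.gz --null -T -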

aptitude remove ?and(~i~nlinux-(im|he) ?not(~n`uname -r`))
2010-06-11 22:57:09
User: dbbolton

A little aptitude magic. Note: this will remove images AND headers. If you just want to remove images: aptitude remove ?and(~i~nlinux-im ?not(~n`uname -r`))

I used this in zsh without any problems. I'm not sure how other shells will interpret some of the special characters used in the aptitude search terms. Use -s to simulate.
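In bash the ? and parentheses in the search term need quoting; a quoted sketch of the same pattern (use -s to simulate first):

aptitude -s remove '?and(~i~nlinux-(im|he) ?not(~n'"$(uname -r)"'))'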

aptitude purge linux-image | grep ^i | grep -v $(uname -r)
perl -e 'chomp($k=`uname -r`); for (</boot/vm*>) {s/^.*vmlinuz-($k)?//; $l.="linux-image-$_ ";} system "aptitude remove $l";'
aptitude remove $(dpkg -l|egrep '^ii linux-(im|he)'|awk '{print $2}'|grep -v `uname -r`)
2010-06-10 21:23:00
User: dbbolton
Functions: awk egrep grep

This should do the same thing and is about 70 chars shorter.

while inotifywait -r -e MODIFY dir/; do make; done;
while true; do inotifywait -r -e MODIFY dir/ && make; done;
2010-06-04 17:07:03
User: fain182

Uses inotifywait from inotify-tools ( http://wiki.github.com/rvoicilas/inotify-tools/ ), which is Linux-only.

Useful when you work with files that have to be compiled: LaTeX, Haml, C, ...
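A variant sketch that only triggers on completed writes and ignores vim swap files (both are standard inotifywait options):

while inotifywait -r -e close_write --exclude '\.swp$' dir/; do make; done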

cowsay `fortune` | toilet --metal -f term
2010-06-03 21:48:54

Get colorful fortunes dictated by an ASCII cow. For full enjoyment you'll need to have color setup enabled for your terminal.

alias a=" killall rapidly_spawning_process"; a; a; a;
2010-05-20 02:33:28
User: raj77_in
Functions: alias
Tags: Linux unix kill

If you don't want to create an alias, you can instead do

killall rapidly_spawning_process ; !! ; !! ; !!

killall rapidly_spawning_process ; killall rapidly_spawning_process ; killall rapidly_spawning_process
2010-05-20 00:26:10
Functions: killall
Tags: Linux unix kill

Use this if you can't type repeated killall commands fast enough to kill rapidly spawning processes.

If a process keeps spawning copies of itself too rapidly, it can do so faster than a single killall can catch them and kill them. Retyping the command at the prompt can be too slow too, even with command history retrieval.

Chaining a few killalls on a single command line starts the next killall more quickly. The first killall will get most of the processes, except for some that were starting up in the meantime; the second will get most of the rest, and the third mops up.
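A looping sketch of the same idea, which keeps killing until nothing matches (pgrep -x matches the exact process name):

while pgrep -x rapidly_spawning_process >/dev/null; do killall rapidly_spawning_process; done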

utime(){ python -c "import time; print(time.strftime('%a %b %d %H:%M:%S %Y', time.localtime($1)))"; }
utime(){ awk -v d=$1 'BEGIN{print strftime("%a %b %d %H:%M:%S %Y", d)}'; }
utime(){ date -d "1970-01-01 GMT $1 seconds"; }
utime(){ date -d @$1; }
2010-05-12 12:21:15
User: deltaray
Functions: date

More recent versions of the date command finally have the ability to decode unix epoch time into a human-readable date. This function makes it simple to use that feature quickly.
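Usage sketch (GNU date; the timestamp is just an illustration):

utime(){ date -d @"$1"; }
utime 1000000000   # prints the local-time equivalent of 2001-09-09 01:46:40 UTC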

find /tmp -type f -atime +1 -delete
2010-05-11 17:08:49
User: mattoufoutu
Functions: find

Deletes all files in /tmp that were last accessed at least two days ago.
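To preview what would be removed, run the same find without -delete; the default action just prints the matches:

find /tmp -type f -atime +1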

(for i in `find . -maxdepth 2 -name .svn | sed 's/.svn$//'`; do echo $i; svn info $i; done ) | egrep '^.\/|^URL'
2010-05-09 11:54:37
User: jespere
Functions: echo egrep info sed

If you have lots of subversion working copies in one directory and want to see in which repositories they are stored, this will do the trick. Can be convenient if you need to move to a new subversion server.
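For a single working copy, you can check the repository URL directly; a quick sketch:

svn info . | grep '^URL'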

tr -d "\n\r" | grep -ioEm1 "<title[^>]*>[^<]*</title" | cut -f2 -d\> | cut -f1 -d\<
awk 'BEGIN{IGNORECASE=1;FS="<title>|</title>";RS=EOF} {print $2}' file.html | sed '/^$/d'
2010-04-20 13:27:47
User: tamouse
Functions: awk sed
Tags: Linux awk html

The previous version leaves lots of blank lines; piping through sed removes them.

awk 'BEGIN{IGNORECASE=1;FS="<title>|</title>";RS=EOF} {print $2}' file.html
2010-04-20 10:54:03
User: sata
Functions: awk
Tags: Linux awk html

Case insensitive! And it works even if the "<title>...</title>" spans multiple lines.

Simple! :-)
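Note that IGNORECASE is a gawk extension. A usage sketch, fetching the page first (curl and the URL are only illustrative):

curl -s https://www.example.com/ > page.html
awk 'BEGIN{IGNORECASE=1;FS="<title>|</title>";RS=EOF} {print $2}' page.html | sed '/^$/d'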

cat file1 ... fileN > combinedFile;
2010-04-17 01:00:04
User: GinoMan2440
Functions: cat
Tags: cat bash Linux

The last person who posted used the most roundabout way to concatenate files; there's a reason the command is called "conCATenate". Using this method you also get to choose the order of the files. Below, another person just did *.txt > combined.txt, which is fine, but then the order depends on how the shell expands the wildcard, which is usually alphabetical order of filenames.
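A quick illustration of choosing the order explicitly (file names are hypothetical):

cat header.txt body.txt footer.txt > combined.txt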

cat *.txt >output.txt
perl -i -pe 's/\r/\n/g' file