What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).

May 19, 2015 - A Look At The New Commandlinefu
I've put together a short writeup on what kind of newness you can expect from the next iteration of clfu. Check it out here.
March 2, 2015 - New Management
I'm Jon, and I'll be maintaining and improving clfu. Thanks to David for building such a great resource!

Terminal - All commands - 12,418 results
for a in `ls`; do echo $a && convert $a -resize <Width>x<Height> $a; done
2009-08-02 22:35:24
User: leavittx
Functions: echo

Resizes all images in the current directory to <Width>x<Height> resolution.

It is better than `mogrify -resize *.jpg` because it is independent of the image extension (e.g. it catches both .jpg and .JPG) (:
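A self-contained dry run of the same idea (the 800x600 size and demo file names are made up); a glob instead of `ls` output is also safer when file names contain spaces:

```shell
# Dry-run sketch: iterate over a glob rather than `ls` output.
mkdir -p /tmp/resize-demo && cd /tmp/resize-demo
touch "photo 1.jpg" photo2.JPG
for a in *.jpg *.JPG; do
  [ -e "$a" ] || continue                    # skip patterns that matched nothing
  echo convert "$a" -resize 800x600 "$a"     # drop `echo` to actually resize
done
```

Remove the leading `echo` once the printed commands look right.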

echo $((`eix --only-names -I | wc -l` * 100 / `eix --only-names | wc -l`))%
ssh -t user@host screen -x <screen name>
2009-08-02 15:39:24
User: Dark006
Functions: screen ssh

If you know the benefits of screen, then this might come in handy for you. Instead of ssh'ing into a machine and then running a screen command, this can all be done in one line. Just have the person on the machine you're ssh'ing into run something like

screen -S debug

Then you would run

ssh -t user@host screen -x debug

and be attached to the same screen session.

wget `lynx -dump http://www.ebow.com/ebowtube.php | grep .flv$ | sed 's/[[:blank:]]\+[[:digit:]]\+\. //g'`
2009-08-02 14:09:53
User: spaceyjase
Functions: grep sed wget

I wanted all the 'hidden' .flv files from the http link on the command line; wget seemed appropriate, fed with output from lynx, grep for the .flv files, then normalised via sed (to remove the numeric bullet). Similar to the 'Grab mp3 files' fu. Replace the link with your own, and the grep arg with something more interesting ;) See here for something along the same lines...


Hope you find it useful! Improvements welcome, naturally.

debootstrap --arch i386 lenny /opt/debian ftp://debian.das.ufsc.br/pub/debian/
yum clean all ; rpm -Uvh http://download.fedora.redhat.com/pub/fedora/linux/releases/11/Fedora/i386/os/Packages/fedora-release-11-1.noarch.rpm ; yum -y upgrade ; reboot
du -aB1m|awk '$1 >= 100'
net=DTAG-DIAL ; for (( i=1; i<30; i++ )); do whois -h whois.ripe.net $net$i | grep '^inetnum:' | sed "s;^.*:;$net$i;" ; done
2009-08-01 05:28:19
User: drizzt
Functions: grep sed whois

Useful if you, for instance, want to block or allow all connections from a certain provider which uses successive netnames for its IP blocks. In this example I used the German Deutsche Telekom, which has DTAG-DIAL followed by a number as the netname for its dial-in pools.

There are - as always ;) - different ways to do this. If you have seq available you can use

net=DTAG-DIAL ; for i in `seq 1 30`; do whois -h whois.ripe.net $net$i | grep '^inetnum:' | sed "s;^.*:;$net$i;" ; done

or without seq you can use bash brace expansion

net=DTAG-DIAL ; for i in {1..30}; do whois -h whois.ripe.net $net$i | grep '^inetnum:' | sed "s;^.*:;$net$i;" ; done

or if you like while better than for use something like

net=DTAG-DIAL ; i=1 ; while true ; do whois -h whois.ripe.net $net$i | grep '^inetnum:' | sed "s;^.*:;$net$i;" ; test $i = 30 && break ; i=$(expr $i + 1) ; done

and so on.

find . -maxdepth 2 -name "*somepattern" -print0 | xargs -0 -I "{}" echo mv "{}" /destination/path
2009-08-01 01:55:47
User: jonasrullo
Functions: echo find mv xargs

Only tested on Ubuntu Hardy (Linux). Works when file names contain spaces.

The "-maxdepth 2" limits the find search to the current directory and one level deeper in this example; without it, find searches every directory below the current one, so this was faster on my system - almost as fast as locate when used as above.

You must use double quotes around the pattern to handle spaces in file names. -print0 is used in combination with xargs -0 (those are zeros, not "O"s). For xargs, -I replaces the following "{}" with the incoming file-list items from find. echo just prints to the command line what mv would do; mv needs "{}" again so it knows what you are moving, followed by the destination. Some other versions may only require one "{}" after -I and not in the mv command, but this is what worked for me on Ubuntu 8.04. Some like to add -type f to the find command to limit matches to regular files.
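A minimal rehearsal of the pattern with throwaway files (all paths under /tmp/mvdemo are invented for the demo); the echo shows exactly what would be moved:

```shell
# Rehearse the find | xargs mv pipeline on disposable files.
mkdir -p /tmp/mvdemo/sub /tmp/mvdemo/dest
touch "/tmp/mvdemo/file one.txt" /tmp/mvdemo/sub/two.txt
cd /tmp/mvdemo
find . -maxdepth 2 -name "*.txt" -print0 |
  xargs -0 -I "{}" echo mv "{}" /tmp/mvdemo/dest   # drop `echo` to really move
```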

seq 1 12 | sed 1,5d ; seq 1 12 | head --lines=-5
2009-08-01 00:41:52
User: flux
Functions: head sed seq
Tags: sed tail HEAD fun

Strangely enough, tail has no negative --lines=N option like head's, so we have to use sed, which is short and clear, as you can see.

Stranger still, skipping lines at the bottom with sed is neither short nor clear. From the sed one-liners:

# delete the last 10 lines of a file

$ sed -e :a -e '$d;N;2,10ba' -e 'P;D' # method 1

$ sed -n -e :a -e '1,10!{P;N;D;};N;ba' # method 2
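Another route, assuming GNU tac is available: reverse the stream, drop the first five lines, then reverse back:

```shell
# Delete the last 5 lines by reversing twice (GNU coreutils tac).
seq 1 12 | tac | sed 1,5d | tac    # prints 1 through 7
```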

xinit -- :1
2009-07-31 23:42:28
User: flux
Tags: ssh X X11 xinit

This starts a very basic X session, with just a simple xterm. You can use this xterm to launch your preferred remote session.

ssh -X john@otherbox gnome-session

Try also startkde or fluxbox or xfce4-session.

To switch between your two X servers, use CTRL+ALT+F7 and CTRL+ALT+F8.

echo $( du -sm /var/log/* | cut -f 1 ) | sed 's/ /+/g'
2009-07-31 21:42:53
User: flux
Functions: cut du echo sed
Tags: echo bc

When you've got a list of numbers, one per line, echo puts them on a single line, separated by spaces. You can then substitute the spaces with an operator. Finally, pipe the result to the bc program.
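Putting the whole pipeline together, with fixed demo values standing in for du's output:

```shell
# Build a "3+14+7"-style expression from a column of numbers, then sum it.
expr_str=$(echo $(printf '3\n14\n7\n') | sed 's/ /+/g')
echo "$expr_str"                                   # prints 3+14+7
command -v bc >/dev/null && echo "$expr_str" | bc  # prints 24 where bc exists
```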

curl -sI http://slashdot.org/ | sed -nr 's/X-(Bender|Fry)(.*)/\1\2/p'
2009-07-31 19:55:17
Functions: sed

I'm pretty sure everyone has curl and sed, but not everyone has lynx.

sudo apt-get -o Acquire::http::Dl-Limit=25 install <package>
2009-07-31 19:43:45
User: dunnix
Functions: install sudo
Tags: apt-get

apt-get is pretty aggressive when it downloads, potentially hogging the bandwidth of your network. The 25 is in KB/s; change it to suit your needs.
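To make the limit permanent rather than passing -o every time, the same option can live in an apt.conf.d snippet (the file name below is just a suggestion; any file in that directory is read):

```
// /etc/apt/apt.conf.d/75download-limit  (assumed name)
Acquire::http::Dl-Limit "25";
```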

echo 1 > /proc/sys/kernel/sysrq; echo b > /proc/sysrq-trigger
2009-07-31 19:07:40
User: tiagocruz
Functions: echo
Tags: proc

Useful when something has gone wrong on a server (NFS freeze, immortal process) and it needs a hard reboot.

netstat -ant | grep :80 | grep ESTABLISHED | awk '{print $5}' | awk -F: '{print $1}' | sort | uniq -c | sort -n
no_of_files=10; counter=1; while [[ $counter -le $no_of_files ]]; do echo Creating file no $counter; dd bs=1024 count=$RANDOM skip=$RANDOM if=/dev/sda of=random-file.$counter; let "counter += 1"; done
2009-07-31 16:34:47
User: rajaseelan
Functions: dd echo file
Tags: bash dd

Create a bunch of random files with random binary content. Basically dd dumps randomly from your hard disk to files random-file*.
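A variant sketch that reads /dev/urandom instead of a raw disk, so it needs no root access and the content is genuinely random (the file count and size range here are arbitrary):

```shell
# Create 3 files of random content without touching /dev/sda (no root needed).
for counter in 1 2 3; do
  dd if=/dev/urandom of=/tmp/random-file.$counter bs=1024 \
     count=$(( (RANDOM % 64) + 1 )) 2>/dev/null   # 1-64 KiB each
done
ls -l /tmp/random-file.*
```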

dig +short txt <keyword>.wp.dg.cx
2009-07-31 16:08:59
User: drizzt
Functions: dig

Query Wikipedia by issuing a DNS query for a TXT record. The TXT record will also include a short URL to the complete corresponding Wikipedia entry. You can also write a little shell script like:

$ cat wikisole.sh


dig +short txt ${1}.wp.dg.cx

and run it like

./wikisole.sh unix

where your first option ($1) will be used as the search term.

ps -o rss -C httpd | tail -n +2 | (sed 's/^/x+=/'; echo x) | bc
2009-07-31 15:15:08
Functions: echo ps sed tail

Display the amount of memory used by all the httpd processes. Great in case you are being Slashdotted!
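The sed-to-bc trick can also be done in a single awk step; here with fixed RSS values standing in for the ps output:

```shell
# Sum a column with awk (demo input replaces `ps -o rss -C httpd | tail -n +2`).
printf '10240\n20480\n' | awk '{s+=$1} END {print s}'   # prints 30720
# Real usage sketch, assuming procps ps:
#   ps -o rss= -C httpd | awk '{s+=$1} END {print s+0}'
```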

find . -maxdepth 1 -type f | wc -l
2009-07-31 14:53:29
User: guckes
Functions: find wc
Tags: wc

A simple "ls" lists files *and* directories. So we need to "find" the files (type 'f') only.

As "find" is recursive by default we must restrict it to the current directory by adding a maximum depth of "1".

If you are using zsh, then you can use the dot (.) as a globbing qualifier to denote plain files:

zsh> ls *(.) | wc -l

for more info see the zsh manual on expansion and substitution - "man zshexpn".

ifconfig -a | perl -nle'/(\d+\.\d+\.\d+\.\d+)/ && print $1'
2009-07-31 09:49:17
User: sneaker
Functions: ifconfig perl

Works on Linux and Solaris. I think it will work on nearly all *nixes.

ifconfig | awk '/ddr:[0-9]/ {sub(/addr:/, ""); print $2}'
2009-07-31 09:30:54
User: danny_b85
Functions: awk ifconfig
Tags: Linux ifconfig

The initial version of this command also output extra empty lines.

This happened on Ubuntu; I haven't tested it on anything else.

pacman -Q|wc -l
echo -e "HEAD / HTTP/1.1\nHost: slashdot.org\n\n" | nc slashdot.org 80 | head -n5 | tail -1 | cut -f2 -d-
lynx -head -dump http://slashdot.org|egrep 'Bender|Fry'|sed 's/X-//'