
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are separate Twitter accounts for commands that reach at least 3 and at least 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):



News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try to put an end to the spamming, new commands now require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.
Terminal - All commands - 11,926 results
nmap -sT -p 80 -oG - 192.168.1.* | grep open
2009-02-11 17:47:27
User: bendavis78
Functions: grep
18

Change the -p argument for the port number. See "man nmap" for different ways to specify address ranges.

cat .ssh/id_rsa.pub | ssh hostname 'cat >> .ssh/authorized_keys'
2009-02-11 17:40:12
User: bendavis78
Functions: cat hostname ssh
14

Just run the command, type your password, and that's the last time you need to enter your password for that server.

This assumes that the server supports public-key authentication, that the permissions on your home directory are 755, and that the permissions on your .ssh directory are 700 (both locally and remotely).
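On systems that ship OpenSSH's ssh-copy-id, `ssh-copy-id hostname` does the same job in one step. The permission requirements can be set up like this (a sketch on a scratch directory standing in for the home dir, so nothing real is touched):

```shell
#!/bin/bash
# Scratch directory standing in for a (remote) home dir -- $HOME is untouched.
home=$(mktemp -d)
mkdir -p "$home/.ssh"
chmod 755 "$home"                      # home: no group/world write
chmod 700 "$home/.ssh"                 # .ssh: owner only
touch "$home/.ssh/authorized_keys"
chmod 600 "$home/.ssh/authorized_keys" # key file: owner read/write only
```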

alias dir="ls -al"
2009-02-11 16:51:01
User: katylava
Functions: alias
0

If you come from a DOS background and accidentally use DOS commands often, this and others like it can be helpful. Add to your .bash_profile, or wherever you keep such things.
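A minimal sketch of how such an alias behaves; note that scripts (unlike interactive shells) need expand_aliases switched on before aliases take effect:

```shell
#!/bin/bash
shopt -s expand_aliases   # aliases are ignored in scripts without this
alias dir="ls -al"
alias copy="cp"           # another DOS habit, shown as an illustration
dir /tmp                  # runs: ls -al /tmp
```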

svn st | grep /main/java | awk '{print $2}' | xargs echo | xargs svn ci -m "my comment here"
alt + .
2009-02-11 15:26:34
User: vbs100
7

Insert the last argument of the previous command.

for ip in $(seq 1 25); do ping -c 1 192.168.0.$ip>/dev/null; [ $? -eq 0 ] && echo "192.168.0.$ip UP" || : ; done
2009-02-11 14:57:34
Functions: echo ping seq
0

Handy when you are on an unfamiliar network and want to determine which hosts in the range are up.
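A slightly tidier sketch of the same sweep: brace expansion builds the addresses, -W 1 caps each wait at one second, and the exit status of ping drives the && directly. The loopback range is used here only so the example is self-contained; substitute your own subnet.

```shell
#!/bin/bash
for ip in 127.0.0.{1..3}; do
  ping -c 1 -W 1 "$ip" >/dev/null 2>&1 && echo "$ip UP"
done
```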

xrandr --output LVDS --auto --output VGA --auto --right-of LVDS
2009-02-11 13:20:17
User: root
7

You'll need to make sure your xorg.conf permits a virtual screen size this big. If it doesn't then xrandr should return a suitable error message that tells you the required size.
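For reference, the virtual screen size lives in the Display subsection of xorg.conf; the identifier and dimensions below are placeholders - use the size xrandr's error message asks for:

```
Section "Screen"
    Identifier "Screen0"
    SubSection "Display"
        Virtual 2960 1050
    EndSubSection
EndSection
```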

history | awk '{a[$2]++}END{for(i in a){print a[i] " " i}}' | sort -rn | head
rar a -m0 "${PWD##*/}.rar" *
2009-02-11 13:06:34
1

This command creates a rar archive from all files in the current folder and names the archive after the folder name.
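The interesting part is ${PWD##*/}, which strips everything up to the last slash and leaves the basename of the current directory. A quick demonstration of just the expansion (rar itself is not needed):

```shell
#!/bin/bash
cd "$(mktemp -d)"             # scratch location
mkdir -p myproject && cd myproject
echo "${PWD##*/}.rar"         # prints: myproject.rar
```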

dd if=/dev/zero bs=1M | ssh somesite dd of=/dev/null
du -sh
!!:gs/foo/bar
2009-02-11 10:20:15
User: Tronks
188

Very useful for rerunning a long command changing some arguments globally.

As opposed to ^foo^bar, which only replaces the first occurrence of foo, this one changes every occurrence.
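History expansion like this only works at an interactive prompt, but the same first-vs-global distinction exists in bash parameter expansion, which also works in scripts (the command string below is just an example):

```shell
#!/bin/bash
cmd="cp foo.txt foo.bak"
echo "${cmd/foo/bar}"    # first occurrence only: cp bar.txt foo.bak
echo "${cmd//foo/bar}"   # every occurrence:      cp bar.txt bar.bak
```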

diff <(head -500 product-feed.xml) <(head -500 product-feed.xml.old)
2009-02-11 09:24:38
User: root
Functions: diff head
3

Useful for massive files where a full diff would take too long: this runs diff on just the first 500 lines of each. The use of process substitution (the <(...) construct) to feed diff is quite handy.
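A self-contained demo of the construct on two scratch files (the file names are arbitrary):

```shell
#!/bin/bash
printf 'a\nb\nc\n' > old.txt
printf 'a\nB\nc\n' > new.txt
# Compare only the first two lines of each file; diff exits non-zero
# when the inputs differ, hence the || true.
diff <(head -n 2 old.txt) <(head -n 2 new.txt) || true
```

The output starts with 2c2, reporting that line 2 differs between the two inputs.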

echo $0
2009-02-11 08:58:01
User: yogsototh
Functions: echo
7

Return the current shell. It is more reliable than echo $SHELL, which only reports your login shell and can be wrong after you have switched shells.
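A quick way to see the difference: $0 reflects the shell actually running, while $SHELL is just an environment variable recording your login shell.

```shell
bash -c 'echo $0'    # prints: bash
sh -c 'echo $0'      # prints: sh
```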

while read line; do echo -e "$line@mail.com"; done < list.txt
pon dsl-provider
while true; do eject /dev/cdrom && eject -t /dev/cdrom; done
2009-02-11 04:41:19
User: fmdlc
Functions: eject
-18

This opens and closes the CD-ROM tray in a loop.

FAIL2BAN=`ps ax | grep fail2ban | grep -v grep | awk {'print $1'}` && if [ -n "$FAIL2BAN" ]; then printf "\n[INFO] Fail2Ban is running and the PID is %s\n\n" $FAIL2BAN; else printf "\n [INFO] Fail2Ban is not running\n\n"; fi
2009-02-11 04:36:49
User: fmdlc
Functions: awk grep printf
-6

Check whether Fail2Ban is running on the system and print a status message in the terminal.
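Where pgrep is available, it avoids the grep -v grep dance entirely. A sketch of the same check (the process name fail2ban-server is an assumption - verify yours with ps):

```shell
#!/bin/bash
# Report whether a process with the given exact name is running.
check_running() {
  local pids
  if pids=$(pgrep -x "$1"); then
    printf '\n[INFO] %s is running, PID(s): %s\n\n' "$1" "$pids"
  else
    printf '\n[INFO] %s is not running\n\n' "$1"
  fi
}
check_running fail2ban-server   # hypothetical daemon name
```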

i=0; f=$(find . -type f -iregex ".*jpg");c=$(echo $f|sed "s/ /\n/g"| wc -l);for x in $f;do i=$(($i + 1));echo "$x $i of $c"; mogrify -strip $x;done
rsync -e "/usr/bin/ssh -p22" -a --progress --stats --delete -l -z -v -r -p /root/files/ user@remote_server:/root/files/
2009-02-11 02:44:00
User: storm
Functions: rsync
7

Create an exact mirror of the local folder /root/files on the remote server 'remote_server' over SSH (listening on port 22).

(Any files and folders on the destination that no longer exist in the source will be deleted.)

hostinfo.sh
2009-02-11 01:15:14
User: kongxx
-4

#!/bin/bash

# HOSTTYPE, MACHTYPE, OSTYPE and VENDOR are shell variables, so this
# needs bash (or another shell that sets them) rather than plain sh.
_HOSTNAME=$(hostname)
_HOSTTYPE=$HOSTTYPE
_MACHINETYPE=$MACHTYPE
_OSTYPE=$OSTYPE
_VENDOR=$VENDOR
_KERNEL=$(uname -r | awk -F- '{print $1}')
_GLIBC=$(ls /lib/libc-*.so | awk -F- '/lib/ {print $2}' | awk -F. '{print $1"."$2}')
_MEM=$(awk '/MemTotal/ {print $2 $3}' /proc/meminfo)
_CPU=$(awk '/cpu MHz/ {print $4}' /proc/cpuinfo)

echo '=============================='
echo 'HOSTNAME    ' $_HOSTNAME
echo 'HOSTTYPE    ' $_HOSTTYPE
echo 'MACHINETYPE ' $_MACHINETYPE
echo 'OSTYPE      ' $_OSTYPE
echo 'VENDOR      ' $_VENDOR
echo 'KERNEL      ' $_KERNEL
echo 'GLIBC       ' $_GLIBC
echo 'MEM INFO    ' $_MEM
echo 'CPU INFO    ' $_CPU
echo '=============================='

find ./* -name 'CVS' | awk '{print "dos2unix " $1 "/*"}' | awk '{system($0)}'
mogrify -resize 1024 *.jpg
2009-02-11 00:01:29
User: mitzip
1

This command requires the ImageMagick tools and resizes all files with the .jpg extension in the current directory to a width of 1024 pixels, keeping the same proportions as the original image.

alias apt-get='sudo apt-get'
2009-02-10 22:45:49
User: mogsie
Functions: alias
-4

apt-get must be run as root, and it is useless to run it as your own user. So just run it as root. Saves you the "sudo !!" every time you're adding a package.

watch -n 15 curl -s --connect-timeout 10 http://www.google.com/
2009-02-10 21:48:45
User: dltj
Functions: watch
6

If your web server is down, this command will periodically attempt to connect to it. If the output is blank, your server is not yet up. If you see HTML, your server is up. Obviously, you need to replace the Google URL with your web server URL...

* 'watch' -- a command for re-executing a command and displaying the output

* '-n 15' -- tells watch to redo the command every 15 seconds

* 'curl' -- a handy utility for getting the source of a web page

* '-s' -- tells curl to be silent about failing

* '--connect-timeout 10' -- try to connect for 10 seconds