
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes respectively - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…).


News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try to put an end to the spamming, new commands now require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.
Terminal - All commands - 11,611 results
for url in `cat urls `; do title=`curl $url 2>&1 | grep -i '<title>.*</title>'` && curl $url > /tmp/u && mail -s "$title" your-private-instapaper-address@instapaper.com < /tmp/u ; done
2010-10-16 19:10:19
Functions: grep mail
-1

Note: you need to replace the email address with your private Instapaper email address.

There are a bunch of possible improvements (see the sketch below), such as:

- Not writing a temp file

- Doesn't strip tags (though Instapaper does, thankfully)

- Shouldn't require 2 curls
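
A rough sketch addressing the temp file and the double fetch, holding the page in a shell variable instead (the Instapaper address is still a placeholder you must replace):

for url in `cat urls`; do
  body=`curl -s "$url"` &&
  title=`echo "$body" | grep -io '<title>.*</title>'` &&
  echo "$body" | mail -s "$title" your-private-instapaper-address@instapaper.com
done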

find / -type f -name IMG_????.JPG -print0 |xargs -0 exiv2 -g Exif.Canon.ModelID '{}' |grep A520 |rev |cut --complement -d " " -f1-40 |rev |xargs -I {} cp --parents {} /where
2012-03-10 03:01:01
User: fladam
Functions: cp cut find grep rev xargs
-1

You must specify the /where folder and the / folder.

If you have another camera, you must experiment with the Exif data (after -g and after grep) and with the mask of your photo files, IMG_????.JPG.

I did this on Knoppix 6.7.0.

You must have exiv2 installed.
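
To find the right Exif key and value for your camera, it may help to inspect a sample photo first (IMG_0001.JPG is a placeholder file name):

# print a summary of the Exif data to see which key identifies your camera:
exiv2 pr IMG_0001.JPG
# or grep a specific key, as the command above does:
exiv2 -g Exif.Image.Model IMG_0001.JPG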

echo foo | ncat [ip address] [port]
2012-10-26 10:53:47
User: dragonauta
Functions: echo
-1

You can use a pair of commands to test firewalls.

First, launch this command on the destination machine:

ncat -l [-u] [port] | cat

Then use this command on the source machine to test the remote port:

echo foo | ncat [-u] [ip address] [port]

The first command listens on the specified port. It listens on TCP; with the -u option it listens on UDP instead.

The second command sends "foo" through ncat to the given IP address and port.
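
A concrete run with made-up values (192.0.2.10 and port 4444 are examples):

# on the destination machine (192.0.2.10):
ncat -l 4444 | cat
# on the source machine:
echo foo | ncat 192.0.2.10 4444
# if "foo" appears on the destination, the port is open through the firewall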

perl -ne 'print "$. - $_"' infile.txt
2009-12-08 15:27:39
User: netp
Functions: perl
-1

This command prints all lines of a file, each together with its line number.
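
An illustrative run (the file contents are made up):

$ printf 'alpha\nbeta\ngamma\n' > infile.txt
$ perl -ne 'print "$. - $_"' infile.txt
1 - alpha
2 - beta
3 - gamma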

curl -s http://twitter.com/users/show.xml?screen_name=username | sed -n 's/\<followers_count\>//p' | sed 's/<[^>]*>//g;/</N;//b'
2010-10-17 16:08:46
User: chrismccoy
Functions: sed
Tags: twitter
-1

Replace username with the username you wish to check.

tar cfJ tarfile.tar.xz pathnames
2010-11-18 05:34:17
User: jasonjgw
Functions: tar
-1

The J option is a recent addition to GNU tar. The xz compression utility is required as well.
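
The same key letter works for extraction, and on older tars without J you can pipe through xz yourself:

tar xfJ tarfile.tar.xz
# equivalent without the J option:
xz -dc tarfile.tar.xz | tar xf -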

while killall -USR1 dd; do sleep 5; done
2009-11-09 00:27:33
User: Mikachu
Functions: killall sleep
-1

Stops when the (last) dd process exits.
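
GNU dd prints its I/O statistics to stderr whenever it receives SIGUSR1, so running this loop in a second terminal turns it into a crude progress meter. A made-up session:

# terminal 1: a long-running copy (paths are examples)
dd if=/dev/zero of=/tmp/big bs=1M count=4096
# terminal 2: poke every dd process every 5 seconds
while killall -USR1 dd; do sleep 5; done
# terminal 1 now prints "records in/out" and throughput lines every 5 seconds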

grep -n . datafile ;
wget --load-cookies <cookie-file> -c -i <list-of-urls>
alias foo="!!"
2010-08-12 23:42:15
User: smop
Functions: alias
-1

!! will expand to your previous command, thus creating the alias "foo" (does not work consistently for commands with quotation marks)
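
An illustrative session (history expansion must be enabled, as it is by default in interactive bash):

$ echo hello
hello
$ alias foo="!!"    # expands immediately to: alias foo="echo hello"
$ foo
hello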

pbpaste | coffee -bcsp | tail -n +2
2013-09-13 04:50:27
User: roryokane
Functions: tail
-1

This particular combination of flags mimics Try CoffeeScript (on http://coffeescript.org/#try:) as closely as possible. And the `tail` call removes the comment `// Generated by CoffeeScript 1.6.3`.

See `coffee -h` for an explanation of `coffee`'s flags.
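
For example, assuming the one-liner alert "hi" is on the clipboard (pbpaste is macOS-specific):

$ pbpaste | coffee -bcsp | tail -n +2
alert("hi");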

httpd2 -M
tar xfzO <backup_name>.tar.gz | mysql -u root <database_name>
2011-02-10 22:18:42
User: alecnmk
Functions: tar
-1

`tar xfzO` extracts to STDOUT, which is piped directly into mysql. Really helpful when your hard drive can't fit two copies of a non-compressed database :)
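
For reference, a sketch of how the matching archive might be created in the first place (dump.sql is an arbitrary name; the placeholders match the command above):

mysqldump -u root <database_name> > dump.sql && tar cfz <backup_name>.tar.gz dump.sql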

nmap -sP -PR -oG - `/sbin/ip -4 addr show | awk '/inet/ {print $2}' | sed 1d`
2011-07-21 11:50:26
User: l3k
Functions: awk sed
-1

Today many hosts block traditional ICMP echo replies for "security" reasons, so nmap's fast ARP scan is more useful for seeing all live IPv4 devices around you. You must be root for ARP scanning.
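
Note that recent nmap versions spell -sP as -sn. The backticks merely supply your local subnets, so an explicit target works just as well (192.168.1.0/24 is an example):

sudo nmap -sn -PR -oG - 192.168.1.0/24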

ls -d1a /var/www/*/web | xargs du -hs
2010-10-18 17:16:23
User: DRoBeR
Functions: du ls xargs
-1

Calculates the folder size for each website in an ISPConfig environment. It doesn't add the jail size, just the "public_html".

grep --color -R "text" directory/
rename *.JPG *.jpg
2014-03-05 14:54:33
User: gtoal
Functions: rename
Tags: batch rename
-1

# Limited and very hacky wildcard rename
# works for: rename *.ext *.other
# and for: rename file.* other.*
# but fails for: rename file*ext other*other and many more
# Might be good to merge this technique with the mmv command...

mv-helper() {
  argv="`history 1 | perl -pe 's/^ *[0-9]+ +[^ ]+ //'`"
  files="`echo \"$argv\"|sed -e \"s/ .*//\"`"
  str="`history 1 | perl -pe 's/^ *[0-9]+ +[^ ]+ //' | tr -d \*`"
  set -- $str
  for file in $files
  do
    echo mv $file `echo $file|sed -e "s/$1/$2/"`
    mv $file `echo $file|sed -e "s/$1/$2/"`
  done
}

alias rename='mv-helper #'
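
An illustrative session (file names are made up; the function echoes each mv before running it):

$ ls
IMG_0001.JPG  IMG_0002.JPG
$ rename *.JPG *.jpg
mv IMG_0001.JPG IMG_0001.jpg
mv IMG_0002.JPG IMG_0002.jpg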

httpd2 -V
tar jcpf /home/[user]/etc-$(hostname)-backup-$(date +%Y%m%d-%H%M%S).tar.bz2 /etc
2011-04-29 22:53:11
User: mack
Functions: date tar
-1

Simple compressed backup of /etc.

Linux compatible.
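
To restore such an archive later (the file name here is an example following the pattern above; GNU tar strips the leading / on creation, so extraction lands under the directory given to -C):

mkdir /tmp/etc-restore && tar xjpf etc-myhost-20110429-225311.tar.bz2 -C /tmp/etc-restore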

cat /dev/zero > /dev/sda
2013-09-13 21:47:48
User: fhh
Functions: cat
-1

Exactly the same effect in 3 fewer characters ;-) (presumably compared with dd if=/dev/zero of=/dev/sda). It removes all files and filesystems on the hard disk - EVERYTHING on it. Be careful when selecting the device.

You can press Ctrl + C after a few seconds.

(No output)

folder=0;mkdir $folder; while find -maxdepth 1 -type f -exec mv "{}" $folder \; -quit ; do if [ $( ls $folder | wc -l ) -ge 100 ]; then folder=$(( $folder + 1 )); mkdir $folder; fi ; done
2011-02-11 21:28:01
User: Juluan
Functions: find ls mkdir mv wc
-1

If you have a folder with thousands of files and want many folders with only 100 files per folder, run this.

It will create 0/, 1/, etc. and put 100 files inside each one.

But find will return true even if it doesn't find anything, so the loop as written never terminates on its own... (see the sketch below).
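
A sketch of one way to make the loop terminate cleanly instead of relying on find's exit status (not the original author's code):

folder=0; mkdir $folder
while f=$(find . -maxdepth 1 -type f -print -quit); [ -n "$f" ]; do
  mv "$f" $folder/
  if [ $(ls $folder | wc -l) -ge 100 ]; then
    folder=$((folder + 1)); mkdir $folder
  fi
done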

vim -p `ls *.java *.xml *.txt *.bnd 2>/dev/null`
lsof -i
2011-10-03 02:06:30
User: shsingh
-1

This option lists all Internet and X.25 (HP-UX) network files.
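
The -i option also accepts a filter to narrow the listing, for example (the address is made up):

lsof -i tcp:80        # only processes with TCP port 80 open
lsof -i @192.168.1.5  # only connections involving that host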

sshostnew () { sed -i "$1d" $HOME/.ssh/known_hosts ; }
2011-11-07 10:33:04
User: _john
Tags: ssh sed
-1

If you work in an environment where some ssh host keys change regularly, this might be handy...
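
Typical use: when ssh refuses to connect and reports something like "Offending key in ~/.ssh/known_hosts:42", remove the stale entry with:

sshostnew 42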

sed -re '/^#/d ; s/#.*$//'
2012-02-01 20:39:23
User: Zulu
Functions: sed
Tags: sed
-1

Deletes all comments (#) from text.

It deletes whole comment lines entirely and strips trailing comments from the ends of other lines.
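
An illustrative run (note that it also eats # characters inside quoted strings, and leaves any whitespace that preceded a trailing comment):

$ printf '# full-line comment\nvalue=1 # trailing comment\n' | sed -re '/^#/d ; s/#.*$//'
value=1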