What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.

Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).

News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try and put an end to the spamming, new commands now require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.
Terminal - All commands - 11,927 results
for /L %%x in (1,1,16) do mkdir %%x & curl -R -e http://www.kirtu.com -o %%x/#1.jpg http://www.kirtu.com/toon/content/sb%%x/english/sb%%x_en_[001-070].jpg
2009-12-08 15:01:16
User: MyTechieself
Functions: mkdir
-1

Bulk downloads the comic strip JPG files for the adult cartoon Savitabhabhi, storing each set in its own folder. Requires manual removal of "non-image" files that may be created, because each series may differ in length. The command can be easily adapted for UNIX flavours. You need to have cURL in your path.

security unlock-keychain; security find-generic-password -ga "/Users/mruser/.ssh/id_dsa" 2>&1 > /dev/null
2010-02-02 21:14:57
-1

This must be run the first time while logged into your Mac desktop, as it will graphically prompt for access permissions. Subsequent uses will not prompt, assuming you select "Always allow".

ps -C apache o pid= | sed 's/^/-p /' | xargs strace
mysql -uUser -pPassword -N -s -r -e 'SHOW PROCESSLIST' | grep -cv "SHOW PROCESSLIST"
2011-06-10 13:00:53
Functions: grep
-1

Counts MySQL threads (an alternative without grep is sketched after the list of switches below).

mysql switches

-N skip column names (remove headers)

-s silent mode (removes separator chars)

-r raw output

-e execute

grep switches

-c count lines

-v invert match (match all except)
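
A rough alternative sketch without grep, assuming MySQL 5.1+ where information_schema.PROCESSLIST is available (the -1 excludes the connection running the query itself):

mysql -uUser -pPassword -N -s -e 'SELECT COUNT(*)-1 FROM information_schema.PROCESSLIST'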

find . -name unit-test -o -name '*.c' -o -name '*.cpp' | egrep -v "unit-test|android"
scrot -e 'mv $f \$HOME/shots/; sitecopy -u shots; echo "\$BASE/$f" | xsel -i; feh `xsel -o`'
2009-03-26 12:08:39
User: penpen
Functions: echo
-1

Here $HOME/shots must exist and have appropriate access rights and sitecopy must be correctly set up to upload new screen shots to the remote site.

Example .sitecopyrc (for illustration purposes only)

site shots
server ftp.example.com
username user
password antabakadesuka
local /home/penpen/shots
remote public_html/shots
permissions ignore

The command uses scrot to create a screen shot, moves it to the screen shot directory, uploads it using sitecopy, uses xsel to copy the URL to the paste buffer (so that you can paste it with a middle click) and finally uses feh to display a preview of the screen shot.

Note that $BASE stands for the base URL for the screen shots on the remote server, replace it by the actual location; in the example http://www.example.com/~user/shots would be fitting.

Assign this command to a key combination or an icon in whatever panel you use.
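
A hypothetical step-by-step equivalent, assuming the same .sitecopyrc as above (BASE and the file name scheme are placeholders to adapt):

#!/bin/sh
BASE="http://www.example.com/~user/shots"   # public URL of the remote shots directory
f="shot-$(date +%Y%m%d-%H%M%S).png"
scrot "$HOME/shots/$f"                      # take the screen shot straight into the shots directory
sitecopy -u shots                           # upload new files for the "shots" site
echo "$BASE/$f" | xsel -i                   # put the public URL into the X selection
feh "$HOME/shots/$f"                        # preview the shot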

awk '{if (NR % 3 == 1) print $0}' foo > foo_every3_position1; awk '{if (NR % 3 == 2) print $0}' foo > foo_every3_position2; awk '{if (NR % 3 == 0) print $0}' foo > foo_every3_position3
2010-01-08 04:20:06
User: oshazard
Functions: awk
-1

Extract every nth line with awk.

The generic form is:

awk '{if (NR % LINE == POSITION) print $0}' foo

where LINE is the cycle length, POSITION is the offset within the cycle, and the "last" position is always 0 (zero).

for url in `cat urls `; do title=`curl $url 2>&1 | grep -i '<title>.*</title>'` && curl $url > /tmp/u && mail -s "$title" your-private-instapaper-address@instapaper.com < /tmp/u ; done
2010-10-16 19:10:19
Functions: grep mail
-1

Note, you need to replace the email address with your private Instapaper email address.

There are a bunch of possible improvements, such as (see the sketch after this list):

- Not writing a temp file

- Doesn't strip tags (though Instapaper does, thankfully)

- Shouldn't require 2 curls
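
A minimal sketch addressing the first and last points, with a single curl and no temp file (the Instapaper address is still a placeholder; the subject falls back to the URL if no title is found):

while read -r url; do
  page=$(curl -s "$url")                                   # fetch the page once
  title=$(printf '%s\n' "$page" | grep -io '<title>.*</title>')
  printf '%s\n' "$page" | mail -s "${title:-$url}" your-private-instapaper-address@instapaper.com
done < urls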

find / -type f -name IMG_????.JPG -print0 |xargs -0 exiv2 -g Exif.Canon.ModelID '{}' |grep A520 |rev |cut --complement -d " " -f1-40 |rev |xargs -I {} cp --parents {} /where
2012-03-10 03:01:01
User: fladam
Functions: cp cut find grep rev xargs
-1

You must specify the /where destination folder and the / search folder.

If you have another camera, you will have to experiment with the Exif data (the key after -g and the pattern after grep) and with the filename mask IMG_????.JPG.

I have done this on Knoppix 6.7.0.

You must have exiv2 installed.
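
To find a suitable Exif key and value for another camera, it can help to inspect one of its photos first; a sketch reusing the -g (grep) option, where IMG_0001.JPG is just an example filename:

exiv2 -g Model IMG_0001.JPG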

echo foo | ncat [ip address] [port]
2012-10-26 10:53:47
User: dragonauta
Functions: echo
-1

You can use a pair of commands to test firewalls.

First, launch this command on the destination machine:

ncat -l [-u] [port] | cat

Then use this command on the source machine to test the remote port:

echo foo | ncat [-u] [ip address] [port]

The first command listens on the specified port. It listens on TCP by default; with the -u option it listens on UDP.

The second command sends "foo" through ncat to the given IP address and port.
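
A concrete TCP example (192.0.2.10 and port 4444 are placeholder values; add -u on both ends for UDP):

ncat -l 4444 | cat

echo foo | ncat 192.0.2.10 4444

If "foo" shows up on the destination machine, the port is reachable through the firewall.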

perl -ne 'print "$. - $_"' infile.txt
2009-12-08 15:27:39
User: netp
Functions: perl
-1

This command prints every line of a file together with its line number.
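
Roughly equivalent alternatives, if you prefer awk or nl:

awk '{print NR " - " $0}' infile.txt

nl -ba infile.txt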

curl -s http://twitter.com/users/show.xml?screen_name=username | sed -n 's/\<followers_count\>//p' | sed 's/<[^>]*>//g;/</N;//b'
2010-10-17 16:08:46
User: chrismccoy
Functions: sed
Tags: twitter
-1

Replace username with the username you wish to check.

tar cfJ tarfile.tar.xz pathnames
2010-11-18 05:34:17
User: jasonjgw
Functions: tar
-1

The J option is a recent addition to GNU tar. The xz compression utility is required as well.
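
To list or extract such an archive again with GNU tar:

tar tfJ tarfile.tar.xz

tar xfJ tarfile.tar.xz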

url=`curl http://proxybay.info/ | awk -F'href="|" |">|</' '{for(i=2;i<=NF;i=i+4) print $i,$(i+2)}' | grep follow|sed 's/^.\{19\}//'|shuf -n 1` && firefox $url
2014-10-04 19:08:13
User: dunryc
Functions: awk grep sed
-1

Polls the Pirate Bay mirror list, chooses a random site and opens it for you in Firefox.

while killall -USR1 dd; do sleep 5; done
2009-11-09 00:27:33
User: Mikachu
Functions: killall sleep
-1

Stops when the (last) dd process exits.
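
GNU dd prints its transfer statistics to stderr when it receives SIGUSR1 (BSD and macOS dd use SIGINFO instead), so this loop acts as a crude progress monitor. For example, with a made-up transfer:

dd if=/dev/zero of=/tmp/test.img bs=1M count=1024 &

while killall -USR1 dd; do sleep 5; done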

grep -n . datafile ;
wget --load-cookies <cookie-file> -c -i <list-of-urls>
alias foo="!!"
2010-08-12 23:42:15
User: smop
Functions: alias
-1

!! will expand to your previous command, thus creating the alias "foo" (does not work consistently for commands with quotation marks)
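
For example, in an interactive bash session with history expansion enabled:

echo hello

alias foo="!!"

foo

The second line expands to alias foo="echo hello", so running foo prints "hello" again.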

pbpaste | coffee -bcsp | tail -n +2
2013-09-13 04:50:27
User: roryokane
Functions: tail
-1

This particular combination of flags mimics Try CoffeeScript (on http://coffeescript.org/#try:) as closely as possible. And the `tail` call removes the comment `// Generated by CoffeeScript 1.6.3`.

See `coffee -h` for explanation of `coffee`'s flags.
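
pbpaste is macOS-specific; to try the same flags anywhere, any CoffeeScript snippet can be piped in instead, e.g.:

echo 'alert "hi"' | coffee -bcsp | tail -n +2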

httpd2 -M
tar xfzO <backup_name>.tar.gz | mysql -u root <database_name>
2011-02-10 22:18:42
User: alecnmk
Functions: tar
-1

`tar xfzO` extracts to STDOUT, which is piped directly into mysql. Really helpful when your hard drive can't fit two copies of the non-compressed database :)
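
This assumes the archive contains a single uncompressed SQL dump; such a backup could be created along these lines (a sketch using the same placeholders):

mysqldump -u root <database_name> > <database_name>.sql

tar cfz <backup_name>.tar.gz <database_name>.sql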

nmap -sP -PR -oG - `/sbin/ip -4 addr show | awk '/inet/ {print $2}' | sed 1d`
2011-07-21 11:50:26
User: l3k
Functions: awk sed
-1

Today many hosts block traditional ICMP echo probes for "security" reasons, so nmap's fast ARP scan is more useful for finding all live IPv4 devices around you. You must be root for ARP scanning.
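
The backticks derive the target network from your interface address; newer nmap versions prefer -sn over the deprecated -sP, so an equivalent sketch with an explicit example range would be:

nmap -sn -PR -oG - 192.168.1.0/24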

ls -d1a /var/www/*/web | xargs du -hs
2010-10-18 17:16:23
User: DRoBeR
Functions: du ls xargs
-1

Calculates the folder size for each website in an ISPConfig environment. It doesn't add the jail size, just the "public_html".

grep --color -R "text" directory/
rename *.JPG *.jpg
2014-03-05 14:54:33
User: gtoal
Functions: rename
Tags: batch rename
-1

# Limited and very hacky wildcard rename

# works for rename *.ext *.other

# and for rename file.* other.*

# but fails for rename file*ext other*other and many more

# Might be good to merge this technique with mmv command...

mv-helper() {
  # recover the raw command line from history, minus the history number and the command name
  argv="`history 1 | perl -pe 's/^ *[0-9]+ +[^ ]+ //'`"
  # first word is the source pattern; the shell expands it into the file list below
  files="`echo \"$argv\"|sed -e \"s/ .*//\"`"
  # same arguments with the asterisks stripped: $1 = source fragment, $2 = target fragment
  str="`history 1 | perl -pe 's/^ *[0-9]+ +[^ ]+ //' | tr -d \*`"
  set -- $str
  for file in $files
  do
    echo mv $file `echo $file|sed -e "s/$1/$2/"`
    mv $file `echo $file|sed -e "s/$1/$2/"`
  done
}

alias rename='mv-helper #'