
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign-in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):



News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try to put an end to the spamming, new commands now require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.
Terminal - All commands - 11,621 results
ls -s | sort -nr | more
dtach -c /tmp/wires-mc mc
2009-06-17 22:18:25
User: wires
5

Starts Midnight Commander and allows you to detach the console; use Ctrl-\ to detach.

Then at a later time you can reconnect using

dtach -a /tmp/wires-mc

In my experience dtach works much better for programs like irssi, mutt, mc, aptitude than screen does.

cp `ls -x1tr *.jpg | tail -n 1` newest.jpg
2009-06-17 20:32:04
User: Psychodad
Functions: cp tail
1

Searches for the newest *.jpg in the directory and makes a copy of it as newest.jpg. Just change the extension to search for other file types. This is useful e.g. if your webcam saves all pictures to a folder and you'd like to put the latest one on your homepage. It works even in a directory with 10,000 pictures.

grep -2 -iIr "err\|warn\|fail\|crit" /var/log/*
2009-06-17 19:41:04
User: miketheman
Functions: grep
6

Using the grep command, retrieve all lines from any log files in /var/log/ that have one of the problem states

dd if=/dev/zero of=testfile.txt bs=1M count=10
2009-06-17 17:06:16
User: mstoecker
Functions: dd
Tags: dd size test file
1

This will create a 10 MB file named testfile.txt. Change the count parameter to change the size of the file.

As one commenter pointed out, yes /dev/random can be used, but the content doesn't matter if you just need a file of a specific size for testing purposes, which is why I used /dev/zero. The file size is what matters, not the content. It's 10 MB either way. "Random" just referred to "any file - content not specific"
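For cases where the content does matter (e.g. testing compression, where zeros compress trivially), a sketch of the same idea using random bytes; the output filename is arbitrary, `/dev/urandom` is used because it never blocks, and `bs=1M` assumes GNU dd:

```shell
# Create a 10 MB file filled with pseudo-random data (GNU dd syntax)
dd if=/dev/urandom of=testfile.bin bs=1M count=10
```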

qlook() { qlmanage -p "$@" >& /dev/null & }
pbpaste > newfile.txt
find . -type f ! -perm /g=r -exec chmod g+r {} +
2009-06-17 13:39:59
User: sanmiguel
Functions: chmod find
Tags: find chmod
3

Makes any files in the current directory (and any sub-directories) group-readable.

Using the "! -perm /g=r" limits the number of files to only those that do not already have this property

Using "+" at the end of the -exec body tells find to build the entire command by appending all matching files before execution, so chmod is invoked only once, not once per file.
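The batching behaviour of "+" versus "\;" is easy to see with echo in place of chmod (the demo directory and file names below are made up):

```shell
# Set up a throwaway directory with three empty files
mkdir -p demo && touch demo/a demo/b demo/c

# With '+', find appends all matches to a single invocation: echo runs once,
# printing all three paths on one line
find demo -type f -exec echo {} +

# With '\;', the command runs once per matching file: echo runs three times
find demo -type f -exec echo {} \;
```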

alias launchpadkey="sudo apt-key adv --keyserver keyserver.ubuntu.com --recv-keys"
2009-06-17 12:02:27
User: azeey
Functions: alias
Tags: alias apt-key
8

Makes it easy to add keys to new ppa sources entries in apt sources.list

Now to add the key for the chromium-daily ppa:

launchpadkey 4E5E17B5
find . -type f -name '*.ext' -exec cat {} > file.txt \;
2009-06-17 11:33:14
User: realgt
Functions: cat find
2

Useful if you have to put together multiple files into one and they are scattered across subdirectories. For example: You need to combine all .sql files into one .sql file that would be sent to DBAs as a batch script.

You do get a warning if you create a file with the same extension as the ones you're searching for.

find . -type f -name '*.sql' -exec cat {} > BatchFile.txt \;
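One way around that warning, sketched with made-up file names: quote the glob (so the shell doesn't expand it) and exclude the output file from the search so it is never read back into itself:

```shell
# Demo input files (hypothetical content)
printf 'CREATE TABLE a;\n' > a.sql
printf 'CREATE TABLE b;\n' > b.sql

# Exclude the output file by name; '+' batches all matches into one cat
find . -type f -name '*.sql' ! -name 'BatchFile.sql' -exec cat {} + > BatchFile.sql
```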

<<<"k=1024; m=k*k; g=k*m; g" bc
2009-06-17 10:35:10
User: mpb
8

There are two ways to use "here documents" with bash to fill stdin:

The following examples show use with the "bc" command.

a) Using a delimiter at the end of the data:

<<eeooff bc

> k=1024

> m=k*k

> g=k*m

> g

> eeooff

1073741824

b) Using the "inline" version with three less-than symbols (a "here string"):

<<<"k=1024; m=k*k; g=k*m; g" bc

1073741824

One nice advantage of using the triple less-than version is that the command can easily be recalled from command line history and re-executed.

for x in `seq -w 1 30`; do sar -b -f /var/log/sa/sa$x | gawk '/Average/ {print $2}'; done
grep -h -o '<[^/!?][^ >]*' * | sort -u | cut -c2-
2009-06-17 00:22:18
User: thebodzio
Functions: cut grep sort
Tags: sort grep cut
2

This set of commands was very convenient for me when I was preparing some XML files for typesetting a book. I wanted to check what styles I had to prepare but couldn't remember all the tags I had used. This one saved me from error-prone browsing of all my files. It should also be useful if one tries to process XML files with XSL, or when using one's own XML application.

paste -d ',:' file1 file2 file3
2009-06-17 00:11:04
User: thebodzio
Functions: paste
Tags: paste
7

In the above example all files have 4 lines. In "file1" the consecutive lines are "num, 1, 2, 3", in "file2" "name, Jack, Jim, Frank", and in "file3" "scores, 1300, 1100, 980". This one-liner can save a considerable amount of time when you're processing serious portions of data. The "-d" option lets you set a series of characters to be used as separators between the data originating from the given files.
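Recreating the sample files from the description shows how the "-d" list cycles: ',' joins file1 to file2, then ':' joins file2 to file3, restarting for each row:

```shell
# Recreate the three 4-line sample files described above
printf 'num\n1\n2\n3\n'            > file1
printf 'name\nJack\nJim\nFrank\n'  > file2
printf 'scores\n1300\n1100\n980\n' > file3

# The -d characters are used in rotation between the input files
paste -d ',:' file1 file2 file3
```

The first output line is "num,name:scores", and each following row joins the corresponding lines the same way.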

grep -Eio '([[:alnum:]_.]+@[[:alnum:]_]+?\.[[:alpha:].]{2,6})' file.html
2009-06-16 20:19:47
User: wires
Functions: grep
3

Find all email addresses in a file, printing each match. Addresses do not have to be alone on a line; for example, you can grab them from HTML-formatted emails or CSV files. Pipe the output through

... | sort | uniq

to filter out duplicates.
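A quick self-contained run against a made-up sample file, assuming GNU grep (the `+?` in the pattern is tolerated as a GNU extension, not standard ERE):

```shell
# Hypothetical input containing a duplicate address
printf 'Contact: alice@example.com or bob@test.org or alice@example.com\n' > file.html

# Extract every match, then de-duplicate with sort -u
grep -Eio '([[:alnum:]_.]+@[[:alnum:]_]+?\.[[:alpha:].]{2,6})' file.html | sort -u
```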

ruby -i.bkp -pe "gsub(/search/, 'replace')" *.php
2009-06-16 12:35:40
User: gustavgans
4

Searches for the string "search" and replaces it with the string "replace" in all files with the extension .php in the current folder. It also makes a backup of each file with the extension .bkp.

perl -ne '$sum += $_ for grep { /\d+/ } split /[^\d\-\.]+/; print "$sum\n"'
2009-06-16 06:39:08
User: obscurite
Functions: grep perl split
3

Good for summing the numbers embedded in text - a food journal entry, for example, with calories listed per food where you want the total calories. Use this to keep a running total on anything that outputs numbers.

opendiff <file1> <file2>
2009-06-16 03:22:52
User: claytron
-3

This command will open up the two files in FileMerge on OS X. You can also compare two directories.

opendiff directory1 directory2

NOTE: FileMerge is a part of the OS X Developer Tools, available on the install disc.

sort -k1.x
2009-06-16 00:04:21
User: leper421
Functions: sort
3

Tells sort to ignore all characters before the Xth position in the first field per line. If you have a list of items one per line and want to ignore the first two characters for sorting purposes, you would type "sort -k1.3". Change the "1" to change the field being sorted. The decimal value is the offset in the specified field to sort by.
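A quick illustration with made-up data: each line starts with a two-character code that should not affect the ordering, so the sort key starts at character 3:

```shell
# Sort by everything from the 3rd character onward, ignoring the 2-char prefix
printf 'ab3\ncd1\nee2\n' | sort -k1.3
```

The lines come out ordered by their trailing digit: cd1, ee2, ab3.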

(uuencode foo.txt foo.txt; uuencode /etc/passwd passwd.txt)|mailx -s "Pandaren!" someone@cmdfu.com
2009-06-15 11:34:51
User: LrdShaper
Functions: mailx uuencode
1

If your users have ever asked your script to email their reports as separate attachments instead of tar'ring them into one file, you can use this. You'll need the mailx package, of course. On Unix you'd want to add the additional parameter "-m":

(uuencode foo.txt foo.txt; uuencode /etc/passwd passwd.txt)|mailx -m -s "Hooosa!" someone@cmdfu.com

killall -2 mpg321
2009-06-15 03:04:00
User: dattaway
Functions: killall
1

Sends SIGINT (signal 2) to mpg321, which makes it skip to the next selection.

ESC *
2009-06-14 21:17:40
User: Josay
Tags: completion ESC
59

Pressing ESC then * inserts into the command line the results of the autocompletion.

It's hard to explain, but if you look at the sample output or run

echo ESC *

you will understand quickly.

By the way, a few reminders about ESC:

- Holding ESC does the same thing as Tab Tab

- 'ESC .' inserts the last argument of the last command (and can be pressed repeatedly to step back through the last arguments of earlier commands)

(Command too long..See sample Output..)
2009-06-14 20:34:37
User: mohan43u
Tags: bash sed echo tr od
-3

curl doesn't provide URL encoding for 'GET' data; it has the option '--data-urlencode', but that's only for 'POST' data. That's why I needed to write this command line. With Perl, PHP or Python this would be a one-liner, but I wrote it in shell for fun. Works in Ubuntu, and should work on all Linux variants (I hope it will work on Unix variants also).
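The original command isn't reproduced here, but given the entry's tags (od, tr), a minimal sketch of percent-encoding in pure shell might look like the following. It over-encodes safe characters, which is still a valid URL encoding, and `urlencode` is a made-up helper name, not part of the original:

```shell
# Percent-encode every byte of stdin: hex-dump the bytes with od,
# join the lines, then turn the separating spaces into '%' signs
urlencode() {
  od -An -tx1 | tr -d '\n' | tr -s ' ' '%'
}

printf '%s' 'a b' | urlencode    # -> %61%20%62
```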

sed -e "s/\[{/\n/g" -e "s/}, {/\n/g" sessionstore.js | grep url | awk -F"," '{ print $1 }'| sed -e "s/url:\"\([^\"]*\)\"/\1/g" -e "/^about:blank/d" > session_urls.txt
2009-06-14 15:08:31
User: birnam
Functions: awk grep sed
2

This will extract all of the urls from a firefox session (including urls in a tab's history). The sessionstore.js file is in ~/.mozilla/firefox/{firefox profile}

curl -u <user>:<password> -d status="Amarok, now playing: $(dcop amarok default nowPlaying)" http://twitter.com/statuses/update.json
2009-06-14 02:42:34
User: caiosba
6

Share your "now playing" Amarok song in twitter!