
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):


News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try and put an end to the spamming, new commands now require moderation before they will appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.
Terminal - All commands - 11,621 results
sed "s:/old/direcory/:/new/directory/:" <file>
2009-08-06 00:37:45
Functions: sed
Tags: sed
8

Having to escape forward slashes when using sed can be a pain. However, instead of using / as the separator, it's possible to use another character such as : .

I found this by trying to substitute $PWD into my pattern, like so

sed "s/~.*/$PWD/" file.txt

Of course, $PWD will expand to a character string that begins with a / , which will make sed spit out an error such as "sed: -e expression #1, char 8: unknown option to `s'".

So simply changing it to

sed "s:~.*:$PWD:" file.txt

did the trick.

find . -type f -print0 | xargs -0 -P 4 -n 40 grep -i foobar
2009-08-05 23:18:44
User: ketil
Functions: find grep xargs
4

xargs -P N spawns up to N worker processes. -n 40 means each grep command gets up to 40 file names on its command line.
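As a sketch (assuming GNU xargs and coreutils nproc), the worker count can be scaled to the machine instead of hard-coding 4:

```shell
# Spawn one grep worker per CPU core; -print0/-0 keeps odd filenames safe.
find . -type f -print0 | xargs -0 -P "$(nproc)" -n 40 grep -i foobar
```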

echo "Body goes here" | mutt -s "A subject" -a /path/to/file.tar.gz recipient@example.com
2009-08-05 23:06:25
User: ketil
Functions: echo
9

This command uses mutt to send the mail. You must pipe in a body, otherwise mutt will prompt you for some stuff. If you don't have mutt, it should be dead easy to install.

make -j 4
2009-08-05 22:50:57
User: kovan
Functions: make
16

Force make to create as many compile processes as specified (4 in the example), so that each one runs on its own core or CPU and compilation happens in parallel. This can cut the time required to compile a program roughly in half on a CPU with 2 cores, to roughly a quarter on a quad-core, and so on.
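A common sketch (assuming coreutils nproc is available) picks the job count automatically instead of hard-coding it:

```shell
# One make job per CPU core; fall back to 2 jobs if nproc is missing.
JOBS="$(nproc 2>/dev/null || echo 2)"
make -j"$JOBS"
```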

colormake, colorgcc, colordiff
2009-08-05 22:40:11
User: kovan
3

Colorize output of make, gcc/g++ or diff, making it easier to read at a glance.

They are not distributed with make, diff or gcc, but are usually available in the repositories.

grep -Eho '<[a-zA-Z_][a-zA-Z0-9_:-]*' * | sort -u | cut -c2-
2009-08-05 21:54:29
User: inkel
Functions: cut grep sort
Tags: sort grep cut xml
0

This one will work a little better. The regular expression is not 100% accurate for XML parsing, but it will suffice for any valid XML document.
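A quick check of the pattern against a hypothetical sample document (a heuristic, not a real XML parser; closing tags are skipped because / doesn't match the first character class):

```shell
# Hypothetical sample document for illustration only.
printf '<note><to>Tove</to><from>Jani</from></note>\n' \
  | grep -Eho '<[a-zA-Z_][a-zA-Z0-9_:-]*' | sort -u | cut -c2-
# → from, note, to (one tag name per line)
```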

<ctrl+j>stty sane<ctrl+j>
2009-08-05 21:50:07
User: CharlieInCO
10

This is more or less the same as 'reset', but with two advantages: the initial LF character makes sure you're starting a new line to the tty driver, the final one is more reliably a line-end as CR is often unset; and second, 'stty sane' is reliable on older UNIX systems, especially Berkeley-based ones.

echo <ctrl-v><esc>c<enter>
2009-08-05 18:32:28
User: kcm
Functions: echo
9

This works in some situations where 'reset' and the other alternatives don't.

for file in *.mp3;do mkdir -p "$(mp3info -p "%a/%l" "$file")" && ln -s "$file" "$(mp3info -p "%a/%l/%t.mp3" "$file")";done
2009-08-05 17:04:34
User: matthewbauer
Functions: file ln mkdir
5

This will symlink all the mp3 files in the current directory into $ARTIST/$ALBUM/$TITLE.mp3

Make sure not to use sudo - as some weird things can happen if an mp3 file doesn't have id3 tags.

cat /etc/debian_version
2009-08-05 14:47:05
User: caiosba
Functions: cat
-5

Easy way to find out what Debian version your machine is running

find . -name "*.gz" | xargs -n 1 -I {} bash -c "gunzip -c {} | sort | gzip -c --best > {}.new ; rm {} ; mv {}.new {}"
2009-08-05 14:16:15
User: kennethjor
Functions: bash find xargs
-2

I used this because I needed to sort the contents of a bunch of gzipped log files. Replace sort with something else, or simply remove it to just recompress everything.

find . -depth -type d -empty -exec rmdir -v {} +
2009-08-05 13:48:13
User: syssyphus
Functions: find rmdir
Tags: find
7

This will show the names of the deleted directories, and it deletes only directories that contain no files - including directories whose only contents are other empty directories (thanks to -depth).

cat file-that-failed-to-download.zip | curl -C - http://www.somewhere.com/file-I-want-to-download.zip >successfully-downloaded.zip
2009-08-05 13:33:06
Functions: cat
-1

If you are downloading a big file (or even a small one) and the connection breaks or times out, use this command in order to RESUME the download where it failed, instead of having to start downloading from the beginning. This is a real win for downloading debian ISO images over a buggy DSL modem.

Take the partially downloaded file and cat it into the STDIN of curl, as shown. Then use the "-C -" option followed by the URL of the file you were originally downloading.
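To see what resuming actually does, here is a local illustration with made-up file names (no network needed); appending only the bytes past the current size is exactly what curl's "-C -" asks the server for:

```shell
printf 'ABCDEFGHIJ' > /tmp/full.bin          # stands in for the complete remote file
head -c 4 /tmp/full.bin > /tmp/partial.bin   # a download that died early
# Append only the missing bytes, starting right after the partial file's last byte:
size=$(wc -c < /tmp/partial.bin)
tail -c +"$((size + 1))" /tmp/full.bin >> /tmp/partial.bin
cmp /tmp/full.bin /tmp/partial.bin && echo resumed
```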

grep -r . /sys/class/net/eth0/statistics
2009-08-05 08:20:39
User: olorin
Functions: grep
Tags: Linux
4

Within /proc and /sys there are a lot of subdirectories, which carry pseudofiles with only one value as content. Instead of cat-ing all the single files (which takes quite some time) or doing a "cat *" (which makes it hard to see the filename/content relation), just grep recursively for . or use "grep . /blabla/*" (star instead of the -r flag).

For better readability you might also want to pipe the output to "column -t -s : ".

indent -linux helloworld.c
2009-08-05 07:53:08
User: freestyler
Tags: indent
6

Put the "-linux" option into $HOME/.indent.pro to make it the default.

curl -u user -d status="Tweeting from the shell" http://twitter.com/statuses/update.xml
wget --spider -v http://www.server.com/path/file.ext
typeset -f <function-name>
2009-08-04 17:07:21
User: log0
1

Display the code of a previously defined shell function.
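For example (the function name here is made up; this works in bash, ksh and zsh):

```shell
# Define a throwaway function, then print its source back.
greet() { echo "hello $1"; }
typeset -f greet
```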

find . -empty -type d -exec rmdir {} +
2009-08-04 16:55:34
User: jsiei97
Functions: find rmdir
14

A quick way to find and delete empty dirs, it starts in the current working directory.

If you run find . -empty -type d on its own first, you will see what would be removed - a handy test run.

tar cf - dir_to_cp/ | (cd path_to_put/ && tar xvf -)
2009-08-04 16:51:31
User: jsiei97
Functions: cd tar
1

Just a copy of a big dir when you want things like ownership, dates etc. to be left untouched.

Note: Updated with the ideas from "mpb".

echo linux|rev
2009-08-04 12:28:44
Functions: echo
Tags: rev
1

NAME

rev - reverse lines of a file or files

SYNOPSIS

rev [file ...]

DESCRIPTION

The rev utility copies the specified files to the standard output, reversing the order of characters in every line. If no files are specified, the standard input is read.

AVAILABILITY

The rev command is part of the util-linux-ng package and is available from ftp://ftp.kernel.org/pub/linux/utils/util-linux-ng/.

exiftool '-Directory<DateTimeOriginal' -d %Y/%m/%d dir
2009-08-04 11:47:34
User: karel1980
Tags: exiftool
15

This command would move the file "dir/image.jpg" with a "DateTimeOriginal" of "2005:10:12 16:05:56" to "2005/10/12/image.jpg".

This is a literal example from the exiftool man page, very useful for classifying photos. The possibilities are endless.

stat -f '%Sp %p %N' * | rev | sed -E 's/^([^[:space:]]+)[[:space:]]([[:digit:]]{4})[^[:space:]]*[[:space:]]([^[:space:]]+)/\1 \2 \3/' | rev
2009-08-04 08:45:20
User: vwal
Functions: rev sed stat
2

Since the original command (#1873) didn't work on FreeBSD, whose stat lacks the "-c" switch, I wrote an alternative that does. This command also shows the fourth digit of the octal permissions, which carries the sticky bit information.

rsync -a --delete --link-dest=../lastbackup $folder $dname/
2009-08-04 07:08:54
User: pamirian
Functions: rsync
6

dname is a directory named something like 20090803 for Aug 3, 2009. lastbackup is a soft link to the last backup made - say 20090802. $folder is the folder being backed up. Because this uses hard linking, files that already exist and haven't changed take up almost no space yet each date directory has a kind of "snapshot" of that day's files. Naturally, lastbackup needs to be updated after this operation. I must say that I can't take credit for this gem; I picked it up from somewhere on the net so long ago I don't remember where from anymore. Ah, well...

Systems that are only somewhat slicker than this cost hundreds or even thousands of dollars - but we're HACKERS! We don't need no steenkin' commercial software... :)
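A sketch of the daily wrapper around this command, with the rotation step included (all paths here are assumptions based on the description, not part of the original):

```shell
backup_root=/tmp/backups           # hypothetical backup destination
folder=/tmp/data/                  # hypothetical directory being backed up
dname="$backup_root/$(date +%Y%m%d)"
mkdir -p "$dname"
# Unchanged files become hard links into yesterday's snapshot:
rsync -a --delete --link-dest=../lastbackup "$folder" "$dname"/
# Update the pointer so tomorrow's run links against today's snapshot:
ln -sfn "$(basename "$dname")" "$backup_root/lastbackup"
```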

find -type f -exec md5sum '{}' ';' | sort | uniq --all-repeated=separate -w 33 | cut -c 35-
2009-08-04 07:05:12
User: infinull
Functions: cut find md5sum sort uniq
18

Calculates the md5 sum of each file, sorts the output (required for uniq to work), runs uniq on only the hash, and uses cut to remove the hash from the result.
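A variant that is safe for filenames containing spaces (assuming GNU findutils and coreutils), passing names null-delimited and matching on the 32 hex digits of the hash:

```shell
# Duplicate groups are separated by blank lines; only the filenames are printed.
find . -type f -print0 | xargs -0 md5sum | sort | uniq --all-repeated=separate -w 32 | cut -c 35-
```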