What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.

Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are separate Twitter accounts for commands that get a minimum of 3 or 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions, …).

News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try and put an end to the spamming, new commands require moderation before they will appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.
Terminal - All commands - 11,582 results
tar czv file1 file2 folder1 | ssh user@server tar zxv -C /destination
2009-01-29 10:38:26
User: xsawyerx
Functions: ssh tar
13

It compresses the files and folders to stdout, securely copies them to the server's stdin, and runs tar there to extract the input to the given destination via -C. If you omit "-C /destination", it will extract to the remote user's home directory, much like `scp file user@server:`.

the "v" in the tar command can be removed for no verbosity.

svn status |grep '\?' |awk '{print $2}'| xargs svn add
2009-01-29 10:33:22
User: xsawyerx
Functions: xargs
13

Checks which files are not under version control, fetches their names and runs them through "svn add". WARNING: doesn't work with filenames containing whitespace; a safer variant is sketched below.
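
A whitespace-safe sketch reads the paths line by line instead of word-splitting them (it assumes plain `svn status` output with the path following the leading '?'):

svn status | grep '^?' | sed 's/^?[[:space:]]*//' | while IFS= read -r f; do svn add "$f"; done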

ping google.com | tee ping-output.txt
2009-01-29 10:26:59
User: root
Functions: ping tee
2

The tee (as in "T" junction) command is very useful for redirecting output to two places.
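
To append to the file rather than overwrite it on each run, tee takes -a:

ping google.com | tee -a ping-output.txt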

perl -pi.bk -e's/foo/bar/g' file1 file2 fileN
2009-01-29 09:51:11
User: xsawyerx
Functions: perl
10

The addition of ".bk" to the regular "pie" idiom makes Perl create a backup of every file with the extension ".bk", in case it b0rks something and you want it back.
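
To roll back, a small loop restores each backup over the modified file (a sketch assuming the .bk files sit in the current directory):

for f in *.bk; do mv "$f" "${f%.bk}"; done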

perl -pi -e's/foo/bar/g' file1 file2 fileN
2009-01-29 09:47:01
User: xsawyerx
Functions: perl
0

The "g" at the end is for global, meaning replace all occurrences and not just the first one.

reset
2009-01-28 22:22:01
User: root
Functions: reset
256

If you bork your terminal by sending binary data to STDOUT or similar, you can get your terminal back using this command rather than killing and restarting the session. Note that you often won't be able to see the characters as you type them.
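
If reset alone doesn't help, `stty sane` is a common companion; since a borked terminal may mangle the newline, you may need to finish the command with Ctrl-J rather than Enter:

stty sane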

git log master | awk '/commit/ {id=$2} /\s+\w+/ {print id, $0}'
2009-01-28 13:32:08
User: root
Functions: awk
3

Useful when quickly looking for a commit id from a branch to use with git cherry-pick.
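
If an abbreviated commit id next to each subject line is enough, git's built-in shorthand is a simpler alternative:

git log --oneline master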

URL=www.example.com && wget -rq --spider --force-html "http://$URL" && find $URL -type d > url-list.txt && rm -rf $URL
2009-01-27 17:59:08
User: root
Functions: find rm wget
1

This spiders the given site without downloading the HTML content. The resulting directory structure is then parsed to output a list of the URLs to url-list.txt. Note that this can take a long time to run and you may want to throttle the spidering so as to play nicely.
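
For throttling, wget's --wait and --random-wait options space out the requests (the value is illustrative; slot this into the first step of the pipeline above):

URL=www.example.com && wget -rq --spider --force-html --wait=1 --random-wait "http://$URL"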

wget -r -l1 --no-parent -nH -nd -P/tmp -A".gif,.jpg" http://example.com/images
2009-01-27 17:31:22
User: root
Functions: wget
33

This recursively downloads all GIF and JPEG images from the given website to your /tmp directory. The -nH and -nd switches stop wget recreating the host name and directory structure locally.

watch -n 30 uptime
2009-01-27 14:49:21
User: root
Functions: watch
3

This runs the uptime command every 30 seconds to avoid an SSH connection dropping due to inactivity. Granted there are better ways of solving this problem but this is sometimes the right tool for the job.
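
One of those better ways is to have ssh itself send keep-alives, either on the command line or via ServerAliveInterval in ~/.ssh/config:

ssh -o ServerAliveInterval=60 user@host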

svn add --force *
2009-01-27 10:53:27
User: root
1

The --force option bypasses the warning if files are already in SVN.

ssh user@host "ps aux | grep httpd | wc -l"
2009-01-27 00:46:17
User: root
Functions: ssh
2

This counts the number of httpd processes running, though note that the grep can match its own process and inflate the count.
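
pgrep sidesteps that by matching process names only (a sketch, assuming the daemon's process name is httpd):

ssh user@host "pgrep -c httpd"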

(cd /tmp && ls)
find /path/to/dir -type f -exec grep \-H "search term" {} \;
2009-01-26 16:32:14
User: root
Functions: find grep
-1

Simple use of find and grep to recursively search a directory for files that contain a certain term.
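
With GNU grep the same search needs no find at all, since -r recurses through the directory itself:

grep -rH "search term" /path/to/dir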

zip -r myfile.zip * -x \*.svn\*
rsync -av -e ssh user@host:/path/to/file.txt .
2009-01-26 13:39:24
User: root
Functions: rsync
1

You will be prompted for a password unless you have public-key authentication set up.

^foo^bar
2009-01-26 13:25:37
User: root
446

Really useful for when you have a typo in a previous command. Also, the replacement defaults to an empty string, so if you accidentally run:

echo "no typozs"

you can correct it with

^z
cp file.txt{,.bak}
2009-01-26 12:11:29
User: root
Functions: cp
34

Uses brace expansion to create a backup called file.txt.bak
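
The same expansion works in reverse to restore the backup (this expands to `cp file.txt.bak file.txt`):

cp file.txt{.bak,}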

grep -o "\(new \(\w\+\)\|\w\+::\)" file.php | sed 's/new \|:://' | sort | uniq -c | sort
2009-01-26 12:08:47
User: root
Functions: grep sed sort uniq
-2

This grabs all lines that make an instantiation or static call, then filters out the cruft and displays a summary of each class used and its frequency.

sed '1000000!d;q' < massive-log-file.log
2009-01-26 11:50:00
User: root
Functions: sed
18

Sed stops parsing at the match and so is much more efficient than piping head into tail or similar. Grab a line range using

sed '999995,1000005!d' < my_massive_file
find /path/to/dir -type f -print0 | xargs -0 rm
2009-01-26 11:30:47
User: root
Functions: find xargs
12

Using xargs is better than:

find /path/to/dir -type f -exec rm -f {} \;

as the -exec switch spawns a separate rm process for each file. xargs splits the streamed file names into manageable batches so far fewer processes are required.
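
Modern find can batch arguments itself with the + terminator, which appends as many file names as fit on one command line, much like xargs:

find /path/to/dir -type f -exec rm -f {} +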

find . -name "*.php" -exec grep \-H "new filter_" {} \;
2009-01-26 10:43:09
User: root
Functions: find grep
0

This greps all PHP files for a given classname and displays both the file and the usage.

sudo !!
2009-01-26 10:26:48
User: root
986

Useful when you forget to use sudo for a command. "!!" grabs the last-run command.

alias cr='find . 2>/dev/null -regex '\''.*\.\(c\|cpp\|pc\|h\|hpp\|cc\)$'\'' | xargs grep --color=always -ni -C2'
2009-01-26 08:54:25
User: chrisdrew
Functions: alias grep xargs
0

Creates a command alias ('cr' in the above example) that recursively searches the contents of files matching a set of file extensions (C & C++ source code in the above example) within the current directory. The search is configured to colour matches, ignore case, show line numbers and give two lines of context either side of each match. Put it in the shell initialisation file of your choice. Trivially easy to use, e.g.:

cr sha1_init
du | sort -gr > file_sizes
2009-01-26 01:12:54
User: chrisdrew
Functions: du sort
6

Recursively scans the current directory and outputs a sorted list of each directory's disk usage to a text file.
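
With GNU coreutils, du's -h flag pairs with sort's human-numeric -h key for readable sizes (a variation on the above, not a drop-in replacement):

du -h | sort -hr > file_sizes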