
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are separate Twitter accounts carrying only commands that reach a minimum of 3 or 10 votes, so only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).
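
For example, the main feed (the same one used by the wget/awk one-liner further down this page) can be checked straight from the command line, assuming curl is installed:

curl -s http://www.commandlinefu.com/commands/browse/rss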


News

May 19, 2015 - A Look At The New Commandlinefu
I've put together a short writeup on what kind of newness you can expect from the next iteration of clfu. Check it out here.
March 2, 2015 - New Management
I'm Jon, I'll be maintaining and improving clfu. Thanks to David for building such a great resource!

Psst. Open beta.

Wow, didn't really expect you to read this far down. The latest iteration of the site is in open beta. It's a gentle open beta; not in prime time just yet. It's being hosted over at UpGuard (link) and you are more than welcome to give it a shot. A couple of things:

  » The open beta is running a copy of the database that will not carry over to the final version. Don't post anything you don't mind losing.
  » If you wish to use your user account, you will probably need to reset your password.
Your feedback is appreciated via the form on the beta page. Thanks! -Jon & CLFU Team

Terminal - All commands - 12,411 results
cat /etc/issue
iptables -A INPUT -s 222.35.138.25/32 -j DROP
2009-02-02 12:42:04
User: root
Functions: iptables
Votes: 10

This appends (-A) a new rule to the INPUT chain, which specifies to drop all packets from a source (-s) IP address.
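
To undo it later, the same rule can be deleted by swapping -A for -D (a sketch using the same example address):

iptables -D INPUT -s 222.35.138.25/32 -j DROP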

wget -O - http://www.commandlinefu.com/commands/browse/rss 2>/dev/null | awk '/\s*<title/ {z=match($0, /CDATA\[([^\]]*)\]/, b);print b[1]} /\s*<description/ {c=match($0, /code>(.*)<\/code>/, d);print d[1]"\n"} '
curl --basic --user "user:pass" --data-ascii "status=tweeting%20from%20the%20linux%20command%20line" http://twitter.com/statuses/update.json
2009-01-30 18:08:35
User: g__j
Votes: 2

Great for posting tweets from cron jobs and batch scripts.
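
A minimal crontab sketch (hypothetical schedule and placeholder credentials) that posts a status every morning at 9:

0 9 * * * curl --basic --user "user:pass" --data-ascii "status=good%20morning" http://twitter.com/statuses/update.json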

echo -e "[mysql]\npager=less -niSFX" >> ~/.my.cnf
2009-01-29 11:45:52
User: boombastic
Functions: echo
Votes: 5

Changes the standard mysql client output pager to 'less'.

In other words, it makes the query results of the mysql command-line client much easier to read.
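
The same pager can also be set for a single session from the mysql prompt, without touching ~/.my.cnf:

mysql> pager less -niSFX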

tar czv file1 file2 folder1 | ssh user@host tar zxv -C /destination
2009-01-29 10:38:26
User: xsawyerx
Functions: ssh tar
Votes: 13

It compresses the files and folders to stdout, pipes that over ssh to the server's stdin, and runs tar there to extract the stream to whatever destination is given with -C. If you omit "-C /destination", it extracts to the user's home directory, much like `scp file user@host:`.

The "v" in the tar commands can be removed for no verbosity.

svn status |grep '\?' |awk '{print $2}'| xargs svn add
2009-01-29 10:33:22
User: xsawyerx
Functions: xargs
Votes: 13

Checks which files are not under version control, fetches their names, and runs them through "svn add". WARNING: doesn't work with filenames containing whitespace; see the whitespace-safe variant below.
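
A whitespace-safe sketch of the same idea, stripping the status column with sed and quoting each name:

svn status | grep '^?' | sed 's/^?[[:space:]]*//' | while IFS= read -r f; do svn add "$f"; done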

ping google.com | tee ping-output.txt
2009-01-29 10:26:59
User: root
Functions: ping tee
Votes: 2

The tee (as in "T" junction) command is very useful for redirecting output to two places.
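
To append to the file instead of overwriting it, tee takes -a:

ping google.com | tee -a ping-output.txt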

perl -pi.bk -e's/foo/bar/g' file1 file2 fileN
2009-01-29 09:51:11
User: xsawyerx
Functions: perl
Votes: 10

The addition of ".bk" to the regular "pie" idiom makes perl create a backup of every file with the extension ".bk", in case it breaks something and you want the originals back.
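
If you do need the originals back, a small loop restores them (a sketch, assuming the .bk files sit in the current directory):

for f in *.bk; do mv "$f" "${f%.bk}"; done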

perl -pi -e's/foo/bar/g' file1 file2 fileN
2009-01-29 09:47:01
User: xsawyerx
Functions: perl
Votes: 0

The "g" at the end is for global, meaning replace all occurrences and not just the first one.

reset
2009-01-28 22:22:01
User: root
Functions: reset
Votes: 298

If you bork your terminal by sending binary data to STDOUT or similar, you can get your terminal back using this command rather than killing and restarting the session. Note that you often won't be able to see the characters as you type them.
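
If reset itself is unavailable, stty sane (typed blind, followed by Enter) often recovers the terminal as well:

stty sane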

git log master | awk '/commit/ {id=$2} /\s+\w+/ {print id, $0}'
2009-01-28 13:32:08
User: root
Functions: awk
Votes: 3

Useful when quickly looking for a commit id from a branch to use with git cherry-pick.
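
Git can produce a similar id-plus-subject listing on its own, which may be simpler for this purpose:

git log --oneline master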

URL=www.example.com && wget -rq --spider --force-html "http://$URL" && find $URL -type d > url-list.txt && rm -rf $URL
2009-01-27 17:59:08
User: root
Functions: find rm wget
Votes: 1

This spiders the given site without downloading the HTML content. The resulting directory structure is then parsed to output a list of the URLs to url-list.txt. Note that this can take a long time to run and you may want to throttle the spidering so as to play nicely.
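
One way to throttle is wget's own --wait flag, which pauses between requests (a sketch; pick a delay that suits the site):

URL=www.example.com && wget -rq --spider --force-html --wait=2 "http://$URL"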

wget -r -l1 --no-parent -nH -nd -P/tmp -A".gif,.jpg" http://example.com/images
2009-01-27 17:31:22
User: root
Functions: wget
Votes: 35

This recursively downloads all .gif and .jpg images from the given site into /tmp (-P sets the output directory). The -nH and -nd switches stop wget from recreating the host and directory structure locally.

watch -n 30 uptime
2009-01-27 14:49:21
User: root
Functions: watch
Votes: 3

This runs the uptime command every 30 seconds to stop an SSH connection from being dropped due to inactivity. Granted there are better ways of solving this problem (one is shown below), but this is sometimes the right tool for the job.
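
One of those better ways is ssh's built-in keep-alive, which sends a protocol-level packet instead of running a command (user@host being a placeholder):

ssh -o ServerAliveInterval=30 user@host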

svn add --force *
2009-01-27 10:53:27
User: root
Votes: 1

The --force option bypasses the warning if files are already in SVN.

ssh user@host "ps aux | grep httpd | wc -l"
2009-01-27 00:46:17
User: root
Functions: ssh
Votes: 2

This counts the number of httpd processes running on the remote host. Note that the grep itself can show up in the ps output and inflate the count by one; writing the pattern as "grep [h]ttpd" avoids the self-match.
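
A variant that sidesteps the self-match problem entirely uses pgrep, assuming it is installed on the remote host:

ssh user@host "pgrep -c httpd"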

find /path/to/dir -type f -exec grep \-H "search term" {} \;
2009-01-26 16:32:14
User: root
Functions: find grep
Votes: -1

Simple use of find and grep to recursively search a directory for files that contain a certain term.
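
With GNU grep, the recursion can be done by grep itself, skipping find entirely:

grep -rH "search term" /path/to/dir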

zip -r myfile.zip * -x \*.svn\*

rsync -av -e ssh user@host:/path/to/file.txt .
2009-01-26 13:39:24
User: root
Functions: rsync
Votes: 1

You will be prompted for a password unless you have public-key authentication set up.
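
Setting the keys up is usually a single command, assuming an existing keypair in ~/.ssh:

ssh-copy-id user@host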

^foo^bar
2009-01-26 13:25:37
User: root
Votes: 531

Really useful for when you have a typo in a previous command. Also, arguments default to empty so if you accidentally run:

echo "no typozs"

you can correct it with

^z

cp file.txt{,.bak}
2009-01-26 12:11:29
User: root
Functions: cp
Votes: 40

Uses shell brace expansion to create a backup called file.txt.bak.
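
The same expansion works in reverse to restore the backup:

cp file.txt{.bak,}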

grep -o "\(new \(\w\+\)\|\w\+::\)" file.php | sed 's/new \|:://' | sort | uniq -c | sort
2009-01-26 12:08:47
User: root
Functions: grep sed sort uniq
Votes: -2

This grabs all lines that make an instantiation or static call, filters out the cruft, and displays a summary of each class called and its frequency. Replacing the final sort with sort -rn orders the summary by that frequency.

sed '1000000!d;q' < massive-log-file.log
2009-01-26 11:50:00
User: root
Functions: sed
Votes: 21

Sed prints the matching line and quits (q), and so is much more efficient than piping head into tail or similar. Grab a line range using

sed '999995,1000005!d' < my_massive_file

find /path/to/dir -type f -print0 | xargs -0 rm
2009-01-26 11:30:47
User: root
Functions: find xargs
Votes: 12

Using xargs is better than:

find /path/to/dir -type f -exec rm \-f {} \;

as the -exec switch spawns a separate rm process for each file. xargs batches the streamed file names into manageable argument lists, so far fewer processes are required.
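
Modern find can do the batching itself with the + terminator, or delete directly:

find /path/to/dir -type f -exec rm -f {} +
find /path/to/dir -type f -delete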