What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):

May 19, 2015 - A Look At The New Commandlinefu
I've put together a short writeup on what kind of newness you can expect from the next iteration of clfu. Check it out here.
March 2, 2015 - New Management
I'm Jon, I'll be maintaining and improving clfu. Thanks to David for building such a great resource!

Psst. Open beta.

Wow, didn't really expect you to read this far down. The latest iteration of the site is in open beta. It's a gentle open beta, not in prime time just yet. It's being hosted over at UpGuard (link) and you are more than welcome to give it a shot. A couple of things:

  • The open beta is running a copy of the database that will not carry over to the final version. Don't post anything you don't mind losing.
  • If you wish to use your user account, you will probably need to reset your password.
Your feedback is appreciated via the form on the beta page. Thanks! -Jon & CLFU Team

Terminal - Commands using wc - 156 results
find . -name \*.c | xargs wc -l | tail -1 | awk '{print $1}'
2009-09-08 08:25:45
User: karpoke
Functions: awk find tail wc xargs
Tags: awk find wc

This is really fast :)

time find . -name \*.c | xargs wc -l | tail -1 | awk '{print $1}'

real    0m0.191s
user    0m0.068s
sys     0m0.116s
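One caveat with the pipeline above: given enough files, xargs splits the argument list across several wc invocations, and "tail -1" then only sees the last batch's total. A sketch of a variant that is robust to this (assuming GNU find/xargs for the NUL separators):

```shell
# Concatenate all .c files first, then count once; immune to xargs batching.
# -print0 / -0 (GNU find/xargs) keep filenames with spaces or newlines intact.
find . -name '*.c' -print0 | xargs -0 cat | wc -l
```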

find . -type f -name '*.c' -exec wc -l {} \; | awk '{sum+=$1} END {print sum}'
2009-09-04 15:51:30
User: arcege
Functions: awk find wc
Tags: awk find wc

Have wc work on each file, then add up the total with awk; this gives a 43% speed increase on RHEL over using "-exec cat|wc -l" and a 67% increase on my Ubuntu laptop (this is with 10MB of data in 767 files).
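A related sketch, for a further speedup: find's "+" terminator batches many files into a single wc invocation (one fork per batch rather than one per file), and awk must then skip the per-batch "total" lines so nothing is double-counted:

```shell
# "+" passes files to wc in large batches; wc prints a "total" line per batch,
# which awk excludes from the sum to avoid double-counting.
find . -type f -name '*.c' -exec wc -l {} + | awk '$2 != "total" {sum+=$1} END {print sum}'
```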

find / -type f -exec wc -c {} \; | sort -nr | head -100
wget -q -O- PAGE_URL | grep -o 'WORD_OR_STRING' | wc -w
file -i * | grep 'text/plain' | wc -l
2009-08-16 21:22:46
User: voyeg3r
Functions: file grep wc

Counts the plain-text files in the current directory; "file -i" reports both ASCII and UTF-8 as "text/plain", so files without extensions are caught as well.

echo $((`eix --only-names -I | wc -l` * 100 / `eix --only-names | wc -l`))%
find . -maxdepth 1 -type f | wc -l
2009-07-31 14:53:29
User: guckes
Functions: find wc
Tags: wc

A simple "ls" lists files *and* directories. So we need to "find" the files (type 'f') only.

As "find" is recursive by default we must restrict it to the current directory by adding a maximum depth of "1".

If you should be using the "zsh" then you can use the dot (.) as a globbing qualifier to denote plain files:

zsh> ls *(.) | wc -l

For more info, see the zsh manual on expansion and substitution: "man zshexpn".
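One thing worth knowing about any "... | wc -l" count of filenames: it miscounts names that contain embedded newlines. A sketch of a newline-proof variant, assuming GNU find (for -printf):

```shell
# Print one "x" per plain file (no separator at all), then count bytes:
# the byte count equals the file count no matter what the names contain.
find . -maxdepth 1 -type f -printf x | wc -c
```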

pacman -Q|wc -l
find . -type f -name "*.c" -exec cat {} \; | wc -l
2009-07-30 10:06:51
User: foremire
Functions: cat find wc

Use find to collect all .c files under the target directory, cat them into one stream, then pipe to wc to count the lines.

head -$(($RANDOM % $(wc -l < file.txt) +1 )) file.txt | tail -1
lsof -p <process_id> | wc -l
grep -or string path/ | wc -l
find . -name "*.sql" -print0 | wc -l --files0-from=-
2009-06-22 17:45:03
User: vincentp
Functions: find wc
Tags: find wc count line

This command gives you the number of lines of every file in the folder and its subfolders matching the search options specified in the find command. It also gives the total amount of lines of these files.

The combination of print0 and files0-from options makes the whole command simple and efficient.
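Building on that, a sketch (GNU coreutils assumed) that sorts the per-file counts; sed drops the grand-total line that wc appends when more than one file is counted:

```shell
# Per-file line counts, smallest first; the final "total" line is removed
# before sorting (wc prints it last when counting several files).
find . -name '*.sql' -print0 | wc -l --files0-from=- | sed '$d' | sort -n
```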

sort -n <( for i in $(find . -maxdepth 1 -mindepth 1 -type d); do echo $(find $i | wc -l) ": $i"; done;)
count() { find $@ -type f -exec cat {} + | wc -l; }
Q="reddit|digg"; F=*.log; awk -F\" '{print $4}' $F | egrep $Q | wc -l
2009-05-05 21:51:16
User: jaymzcd
Functions: awk egrep wc

I use this (well, I normally drop the F=*.log bit and put the filename straight into the awk command) to count how many times I get referred from another site. I know it's rough; it's just to give me an idea of where any posts I make are ending up. The reason I do the Q="query" bit is that I often want to check another domain quickly, and it's fast to press CTRL+A to jump to the start and then CTRL+F to move forward the 3 steps to change the grep query. I find this easier than moving backwards: if you group a lot of domains with the pipe, your command line can get quite messy, so it's normally easier to keep the query at the front, where you only have to edit it and hit enter.

For people new to the shell, it does the following. The Q= and F= bits just create names we can refer to. awk -F\" '{print $4}' $F reads the file specified by $F and splits each line on double quotes, printing the fourth column for egrep to work on; in a combined-format log, the fourth column is the referrer domain. egrep then matches our query against this list from awk. Finally, wc -l gives us the total number of lines (i.e. matches).
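To make the splitting concrete, here is a hypothetical combined-format log line (the address, URL, and fields are made up for illustration):

```shell
# With -F\" the double quote is the field separator, so $4 is the quoted
# referrer field of a combined-format access log line.
echo '1.2.3.4 - - [05/May/2009:21:51:16] "GET / HTTP/1.1" 200 512 "http://reddit.com/r/foo" "Mozilla/5.0"' |
  awk -F\" '{print $4}'
# -> http://reddit.com/r/foo
```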

ps -L -p <pid> | wc -l
find ./ -not -type d | xargs wc -l | cut -c 1-8 | awk '{total += $1} END {print total}'
wc -l $(find . -name '*.php')
date -d "@$(find dir -type f -printf '%C@\n' | sort -n | sed -n "$(($(find dir -type f | wc -l)/2))p")" +%F
2009-03-24 18:48:49
User: allengarvin
Functions: date dir find wc

I needed to get a feel for how "old" different websites were, based on their directories.

find . -name "*.EXT" | xargs grep -n "TODO" | wc -l
cmp -l file1.bin file2.bin | wc -l
# wc -l /var/log/security/writable.today
2009-03-19 12:25:52
User: mpb
Functions: wc

Mandriva Linux includes a security tool called "msec" (configurable via "draksec").

One of the many things it regularly checks for is world-writable files.

If any are found, it writes the list to /var/log/security/writable.today.

"wc -l" simply counts the number of lines in the file.

This number should be low.

Browse through /var/log/security/writable.today and consider whether any of those files *need* to be world-writable (and if not, fix the permissions, e.g. "chmod o-w $file").

A large number of world-writable files may indicate that umask is not correctly set in /etc/profile (or ${HOME}/.bash_profile), but could also indicate poor security configuration or even malicious activity.
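For systems without msec, a rough portable approximation (an assumption, not what msec itself runs) that counts world-writable regular files:

```shell
# -perm -0002 matches files whose "other" write bit is set;
# -xdev stops find from crossing into other filesystems (e.g. /proc).
find / -xdev -type f -perm -0002 -print 2>/dev/null | wc -l
```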

for file in `find . -type f`; do cat $file; done | wc -l
find . -type f | wc -l