What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.

Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…).


May 19, 2015 - A Look At The New Commandlinefu
I've put together a short writeup on what kind of newness you can expect from the next iteration of clfu. Check it out here.
March 2, 2015 - New Management
I'm Jon, I'll be maintaining and improving clfu. Thanks to David for building such a great resource!


Recursive Line Count

Terminal - Recursive Line Count
find ./ -not -type d | xargs wc -l | cut -c 1-8 | awk '{total += $1} END {print total}'
2009-04-01 15:14:15
User: benschw
Functions: awk cut find wc xargs


There is 1 alternative - vote for the best!

Terminal - Alternatives
find * -type f -not -name ".*" | xargs wc -l
2010-05-21 21:03:31
User: Leechael
Functions: find wc xargs

We use `-not -name ".*"` to omit hidden files (which are unnecessary here). To show only the total line count, do this:

find * -type f -not -name ".*" | xargs wc -l | tail -1
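A quick sanity check of the `tail -1` trick above, using a hypothetical scratch directory (filenames invented for illustration): when `wc -l` is handed several files, its last output line is the grand "total", which is exactly what `tail -1` isolates.

```shell
# Sketch in a throwaway directory: wc -l given several files ends its
# output with a "total" line, and tail -1 picks out exactly that line.
dir=$(mktemp -d)
printf 'a\n'    > "$dir/x"   # 1 line
printf 'b\nc\n' > "$dir/y"   # 2 lines
find "$dir" -type f | xargs wc -l | tail -1   # last line reads "3 total"
rm -r "$dir"
```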

Know a better way?

If you can do better, submit your command here.

What others think

Good command, although some parts are unnecessary. :-)

Usage of 'cut' is superfluous; the $1 in the awk command takes the first (whitespace-delimited) field on each line whether there is one field per line or many.

Usage of xargs is good for batch invocations, but you can have find do that for you as well (see the find manpage). So the command could be:

find ./ -not -type d -exec wc -l {} + | awk '{t+=$1} END {print t}'
Comment by bwoodacre 373 weeks and 3 days ago
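The point about cut can be sketched with invented wc-style input (the filenames below are illustrative, not from the thread): awk splits on whitespace by itself, so the character columns extracted by `cut -c 1-8` are never needed.

```shell
# awk's $1 is the first whitespace-delimited field, so wc-style output
# sums correctly without any cut preprocessing.
printf '  120 a.txt\n   35 b.txt\n' | awk '{total += $1} END {print total}'
# prints 155
```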

Suggestion of improvement:

find * -type f -exec cat {} \;|wc -l
Comment by SlimG 373 weeks and 3 days ago

I really like sloccount for this...

Comment by xSmurf 373 weeks and 1 day ago

Actually these print the wrong answer: when `wc` is handed multiple files it prints a grand total at the end, so the result comes out doubled. The problem occurs in both of the commands below:

find ./ -not -type d | xargs wc -l | cut -c 1-8 | awk '{total += $1} END {print total}'

find ./ -not -type d -exec wc -l {} + | awk '{t+=$1} END {print t}'

The following command gets it right:

find * -type f -exec cat {} \;|wc -l

But we have a faster one:

find * -type f -not -name ".*" | wc -l

We use `-not -name ".*"` to omit hidden files (which are unnecessary here). To show only the total, do this:

find * -type f -not -name ".*" | wc -l | tail -1
Comment by Leechael 314 weeks and 1 day ago
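The doubling described above is easy to reproduce in a hypothetical scratch directory (filenames invented for illustration): `wc`'s own "total" line gets summed by awk along with the per-file counts.

```shell
# With 3 real lines across two files, wc -l emits counts of 2 and 1 plus
# a "3 total" line; awk sums all of column 1 and reports 6 instead of 3.
dir=$(mktemp -d)
printf 'a\nb\n' > "$dir/one"   # 2 lines
printf 'c\n'    > "$dir/two"   # 1 line
find "$dir" -type f | xargs wc -l | awk '{t += $1} END {print t}'   # prints 6
rm -r "$dir"
```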

Sorry, I submitted the wrong commands; the right one is:

find * -type f -not -name ".*" | xargs wc -l

This one shows the total only:

find * -type f -not -name ".*" | xargs wc -l | tail -1

I made sure they are correct this time. XDD

Comment by Leechael 314 weeks and 1 day ago

Your point of view

You must be signed in to comment.