What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.

Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are separate accounts for commands that receive at least 3 or 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).

May 19, 2015 - A Look At The New Commandlinefu
I've put together a short writeup on what kind of newness you can expect from the next iteration of clfu. Check it out here.
March 2, 2015 - New Management
I'm Jon, I'll be maintaining and improving clfu. Thanks to David for building such a great resource!

find the biggest files recursively, no matter how many

Terminal - find the biggest files recursively, no matter how many
find . -type f -printf '%20s %p\n' | sort -n | cut -b22- | tr '\n' '\000' | xargs -0 ls -laSr
2009-08-13 13:13:33
User: fsilveira
Functions: cut find ls sort tr xargs

This command finds the biggest files recursively under a given directory, no matter how many there are. If you try the usual approaches ("find -type f -exec ls -laSr {} +" or "find -type f -print0 | xargs -0 ls -laSr"), the sorting breaks as soon as the file list exceeds the kernel's argument-size limit: ls then runs several times, and each invocation sorts only its own batch of arguments.

This command instead sorts the whole list with sort(1) rather than relying on ls to sort its arguments, so the final list is ordered correctly.
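As a sanity check, the size-prefix trick can be sketched on a throwaway directory (GNU find/coreutils assumed; the final "ls -laSr" stage is dropped here so the sorted path list itself is visible):

```shell
# Scratch demo of the size-prefix trick (GNU find/coreutils assumed).
# '%20s' pads the size into a fixed 20-char numeric key for sort, and
# 'cut -b22-' strips the key plus its trailing space, leaving only the
# paths in ascending size order.
tmp=$(mktemp -d)
printf 'x'        > "$tmp/small"   # 1 byte
printf '%100s' x  > "$tmp/medium"  # 100 bytes
printf '%1000s' x > "$tmp/big"     # 1000 bytes

sorted=$(find "$tmp" -type f -printf '%20s %p\n' | sort -n | cut -b22-)
echo "$sorted"

rm -rf "$tmp"
```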


There are 6 alternatives - vote for the best!

Terminal - Alternatives
find . -type f|perl -lne '@x=sort {$b->[0]<=>$a->[0]}[(stat($_))[7],$_],@x;splice(@x,11);print "@{$x[0]}";END{for(@x){print "@$_"}}'
2012-01-08 14:43:43
User: bazzargh
Functions: find perl
Tags: sort perl find

A different approach to the problem - maintain a small sorted list, print the largest as we go, then the top 10 at the end. I often find that the find and sort take a long time, and the large file might appear near the start of the find. By printing as we go, I get better feedback. The sort used in this will be much slower on perls older than 5.8.
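The print-as-you-go idea can be imitated without perl; a hypothetical awk rendering (simplified to track only the single running maximum, and assuming paths without spaces) might look like:

```shell
# Streaming sketch of the print-as-you-go idea: report each new largest
# file the moment find emits it, instead of waiting for a full sort.
# Assumes GNU find and space-free paths ($2 would truncate otherwise).
tmp=$(mktemp -d)
printf '%10s' x   > "$tmp/a"
printf '%1000s' x > "$tmp/b"
printf '%5s' x    > "$tmp/c"

progress=$(find "$tmp" -type f -printf '%s %p\n' |
    awk '$1 > max { max = $1; print $2 }')
echo "$progress"

rm -rf "$tmp"
```

Whatever order find visits the files in, the last path printed is always the global maximum.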

Know a better way?

If you can do better, submit your command here.

What others think

The sort manpage is a little cryptic, but you can sort on fields other than the beginning of the line (similar to cut):

find . -type f -ls | sort -n --key=7

Pipe that to "cut -b68-" to get only the filenames.

Comment by bwoodacre 414 weeks and 3 days ago
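The --key behaviour bwoodacre describes is easy to see on a made-up two-field input (GNU sort assumed):

```shell
# 'sort -n --key=N' compares lines numerically starting at the Nth
# whitespace-separated field. Tiny stand-alone illustration, keyed on
# field 2 instead of the start of the line:
out=$(printf 'b 20\na 3\nc 100\n' | sort -n --key=2)
echo "$out"
```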

I thought about "find -ls" before, but it's unreliable because the file name isn't always at the same byte position: it depends on the length of the owner/group/size/time/date strings, and the time/date width changes with some locales.

Comment by fsilveira 414 weeks and 3 days ago

Agreed! Another shoddy way is to pipe into "awk '{print $11}'" or some such to get the filenames, but they still have to be quoted or null-separated. However, I think you can get rid of cut and tr if you modify the -printf format string:

find . -type f -printf '%s\t"%p"\n' | sort -n | cut -f2 | xargs ls -laS

the tab is for cut and the quotes for xargs. Another option: on a single fs, use inode numbers to avoid messy filenames.

Comment by bwoodacre 414 weeks and 3 days ago
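The tab/quote variant from this comment can be checked against a name containing a space (GNU find assumed; note that plain xargs does honour double quotes in its default input mode):

```shell
# The tab keeps cut's field split unambiguous, and the added quotes let
# plain xargs treat "a b.txt" as a single argument (GNU find assumed).
tmp=$(mktemp -d)
printf 'hello' > "$tmp/a b.txt"

listed=$(find "$tmp" -type f -printf '%s\t"%p"\n' | sort -n | cut -f2 | xargs ls -laS)
echo "$listed"

rm -rf "$tmp"
```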

The 'cut' couldn't be avoided in my command, and the 'tr' is there to handle filenames with spaces correctly, although filenames containing '\n' (which are *much* rarer) will still be handled wrong.

Comment by fsilveira 414 weeks and 2 days ago

I see. Here's maybe an even shorter version - have you seen the -print0 option to find?

find -type f -print0 | xargs -0 ls -la | sort -nr --key=5

This handles ANY type of filename, even those with newlines and spaces.

Comment by bwoodacre 413 weeks and 5 days ago
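The -print0 claim can be probed with deliberately awkward names - one with a space, one with an embedded newline. (A sketch; note that the later line-based sort of ls output can still misplace a newline-containing name, even though the names reach xargs intact.)

```shell
# Null delimiters carry spaces and even embedded newlines through to
# xargs unmangled; count how many arguments actually reach the command.
tmp=$(mktemp -d)
touch "$tmp/a b"
touch "$tmp/c
d"

nargs=$(find "$tmp" -type f -print0 | xargs -0 sh -c 'echo "$#"' sh)
echo "$nargs"

rm -rf "$tmp"
```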
