What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are separate Twitter accounts for commands that receive at least 3 or 10 votes, so only the best commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions, …).

May 19, 2015 - A Look At The New Commandlinefu
I've put together a short writeup on what kind of newness you can expect from the next iteration of clfu. Check it out here.
March 2, 2015 - New Management
I'm Jon, I'll be maintaining and improving clfu. Thanks to David for building such a great resource!

Psst. Open beta.

Wow, didn't really expect you to read this far down. The latest iteration of the site is in open beta. It's a gentle open beta, not in prime time just yet. It's being hosted over at UpGuard (link) and you are more than welcome to give it a shot. A couple of things:

  • The open beta is running a copy of the database that will not carry over to the final version, so don't post anything you aren't prepared to lose.
  • If you wish to use your existing user account, you will probably need to reset your password.
Your feedback is appreciated via the form on the beta page. Thanks! -Jon & CLFU Team

Optimal way of deleting huge numbers of files

find /path/to/dir -type f -print0 | xargs -0 rm
2009-01-26 11:30:47
User: root
Functions: find xargs

Using xargs is better than:

find /path/to/dir -type f -exec rm -f {} \;

as the -exec form spawns a separate rm process for every file it removes. xargs splits the streamed file list into manageable batches, so far fewer processes are required.
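As a rough sketch of the batching difference (the scratch path and file count are made up for illustration; run one variant at a time on a fresh set of files):

mkdir -p /tmp/clfu-demo && touch /tmp/clfu-demo/file{1..10000}

# Variant 1: -exec ... \; forks one rm per file, i.e. roughly 10000 rm processes
find /tmp/clfu-demo -type f -exec rm -f {} \;

# Variant 2: xargs packs as many paths as fit on one command line, so only a handful of rm processes run
find /tmp/clfu-demo -type f -print0 | xargs -0 rm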


There are 2 alternatives - vote for the best!

Terminal - Alternatives
rsync -a --delete empty-dir/ target-dir/
2016-06-07 16:56:55
User: malathion
Functions: rsync
Tags: delete rsync

This command works by rsyncing the target directory (containing the files you want to delete) with an empty directory. The '--delete' switch instructs rsync to remove files that are not present in the source directory. Since there are no files there, all the files will be deleted.

I'm not clear on why it's faster than 'find -delete', but it is.

Benchmarks here: https://web.archive.org/web/20130929001850/http://linuxnote.net/jianingy/en/linux/a-fast-way-to-remove-huge-number-of-files.html
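For completeness, a minimal end-to-end sketch of the rsync trick (directory names are placeholders):

mkdir empty-dir                            # a deliberately empty source directory
rsync -a --delete empty-dir/ target-dir/   # removes everything inside target-dir/; the directory itself remains
rmdir empty-dir                            # clean up the helper directory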

find /path/to/dir -type f -delete
2009-12-09 01:30:52
User: SlimG
Functions: find

Using -delete is faster than:

find /path/to/dir -type f -print0 | xargs -0 rm
find /path/to/dir -type f -exec rm {} +
find /path/to/dir -type f -exec rm -f {} \;
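If you want to check the "faster than" claim on your own data, a simple hedged approach is to time each variant against identical scratch copies of the tree (recreate the copy between runs; the path is a placeholder):

time find /path/to/dir -type f -delete
time find /path/to/dir -type f -print0 | xargs -0 rm
time find /path/to/dir -type f -exec rm {} +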


What others think

cd /path/to/dir ; ls | xargs rm

Comment by speirs 398 weeks and 4 days ago

I think you can get find to delete for you with the -delete option, which I imagine might be fastest:

find /path/to/dir -type f -delete

or with the -exec xargs-like behavior:

find /path/to/dir -type f -exec rm {} +

without launching the extra xargs process.

Comment by bwoodacre 390 weeks and 2 days ago

@speirs: Never parse ls: http://mywiki.wooledge.org/ParsingLs

@bwoodacre: +1 for the "+"

Comment by dennisw 382 weeks and 4 days ago
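To make the ParsingLs point concrete, a tiny hypothetical example of why piping ls into rm breaks on awkward file names, and why the NUL-separated find form does not:

touch 'a file with spaces'
ls | xargs rm                                      # xargs splits the name into four words and rm fails
find . -maxdepth 1 -type f -print0 | xargs -0 rm   # NUL-delimited names survive spaces and newlines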

Benchmarked this using the drupal-6.14 files extracted from http://ftp.drupal.org/files/projects/drupal-6.14.tar.gz

time find drupal-6.14 -type f -delete
real    0m0.024s
user    0m0.004s
sys     0m0.020s

time find drupal-6.14 -type f -print0 | xargs -0 rm
real    0m0.030s
user    0m0.008s
sys     0m0.020s

time find drupal-6.14 -type f -exec rm {} +
real    0m0.032s
user    0m0.004s
sys     0m0.028s

time find drupal-6.14 -type f -exec rm -f {} \;
real    0m1.700s
user    0m0.592s
sys     0m0.628s

Comment by SlimG 358 weeks and 2 days ago

how about this?

time rm -r drupal-6.14
rm -r drupal-6.14  0.00s user 0.04s system 76% cpu 0.049 total

time find drupal-6.14 -type f -delete
find drupal-6.14 -type f -delete  0.00s user 0.04s system 108% cpu 0.041 total

Comment by bwoodacre 358 weeks and 1 day ago

rm -r will also delete subdirectories, whereas find -type f leaves the directory tree intact.

Just curious...why have people wanted to delete just the files and not the directories (and other stuff)?

Also, what does the plus do?

Comment by unixmonkey7434 357 weeks and 6 days ago

I have had this fail for me once on a machine with low RAM: find couldn't expand the path and I got the "too many arguments" error.

ls | grep string | xargs rm

worked in that situation.

Most other times find works fine :)

Comment by alf 338 weeks and 1 day ago
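For what it's worth, the "too many arguments" error usually comes from the shell expanding a glob into one huge argument list rather than from find itself; a minimal sketch of the failure mode and the batched workaround (path is a placeholder):

rm /path/to/dir/*                                  # glob expansion can exceed the kernel's argument-size limit
find /path/to/dir -type f -print0 | xargs -0 rm    # xargs invokes rm in batches, so the limit is never hit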
