Listing the contents of a directory with a lot of entries

perl -le 'opendir DIR, "." or die; print while defined($_ = readdir DIR); closedir DIR'
Ever wanted to list a directory's contents with 'ls' or 'find' and had to wait minutes until something was printed? Perl to the rescue. The one-liner above (redirected to a file) took less than five seconds to run in a directory with more than 2 million files. One can adapt it to, e.g., delete files that match a certain pattern.
Sample Output
(over 2'000'000 lines omitted)
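As a sketch of the adaptation mentioned above: the same readdir loop can delete matching entries one at a time, so memory stays flat even with millions of files. The `.tmp` suffix here is only an illustrative pattern, not something from the original command.

```shell
# Delete entries matching a pattern, one at a time, without building a list.
# The ".tmp" pattern is illustrative -- substitute your own regex.
perl -e 'opendir my $dh, "." or die "$!"; while (defined(my $e = readdir $dh)) { unlink $e if $e =~ /\.tmp\z/ } closedir $dh'
```

The `defined` guard matters: a plain `while (my $e = readdir $dh)` would stop early on a file literally named `0`.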

By: bierik
2011-04-04 06:21:39

What Others Think

bashrc · 504 weeks and 2 days ago
What options to use with find? I tried it in the same directory and it took almost two minutes. The perl solution took less than five seconds.
bierik · 504 weeks and 2 days ago
Are you voting this command down because you personally don't like it? Or is it because it does not work? Because it worked fine for me, and it took seconds to complete.
SuperFly · 504 weeks and 2 days ago
The problem has to do with buffering, I think. Perl handles this differently than `ls`, probably because Perl adds its own layer of buffering. If that's indeed the case, `find` may not solve this issue either. Anyone who actually knows what they're talking about care to weigh in?
kaedenn · 504 weeks and 1 day ago
Looking at the straces of perl and find, I see that find calls newfstatat for every single file it finds, blowing up the strace output file to about 200MB. Perl doesn't call newfstatat at all and its strace file is less than one MB. Why is this system call made? Can one suppress it?
bierik · 504 weeks and 1 day ago
