Efficiently count files in a directory (no recursion)

perl -e 'if(opendir D,"."){@a=readdir D;print $#a-1,"\n"}'
time perl -e 'if(opendir D,"."){@a=readdir D;print $#a - 1,"\n"}'
205413

real    0m0.497s
user    0m0.220s
sys     0m0.268s

time { ls | wc -l; }
205413

real    0m3.776s
user    0m3.340s
sys     0m0.424s

** EDIT: it turns out this perl one-liner is mostly showing off. This is slightly faster:

find . -maxdepth 1 | wc -l

sh-3.2$ time { find . -maxdepth 1 | wc -l; }
205414

real    0m0.456s
user    0m0.116s
sys     0m0.328s

** EDIT: and now a slightly faster perl version:

perl -e 'if(opendir D,"."){++$c foreach readdir D}print $c-1,"\n"'

sh-3.2$ time perl -e 'if(opendir D,"."){++$c foreach readdir D}print $c-1,"\n"'
205414

real    0m0.415s
user    0m0.176s
sys     0m0.232s
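Worth noting: the two later versions print 205414 rather than 205413, since readdir returns . and .. and find . -maxdepth 1 includes . itself. A variant that skips the dot entries explicitly instead of adjusting the count afterwards might look like this (just a sketch, not timed against the versions above):

# count entries in the current directory, skipping . and .. explicitly
perl -e 'opendir my $d, "." or die $!; print scalar(grep { $_ ne "." && $_ ne ".." } readdir $d), "\n"'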

2009-07-23 20:14:33

What Others Think

What if you have to recurse? I don't know how to extend the perl to recurse, but it would be interesting to compare it to find . -type f | wc -l. On the speed difference, I'm guessing perl has an advantage since it doesn't really have to handle the filename strings the way ls must. Somehow ls | wc seems to use twice as much system-call time as the perl, but it also spends more time in userspace, presumably shuttling filenames around. Any ideas?
bwoodacre · 482 weeks and 4 days ago
And if it is the filename overhead, I wonder how much faster 'ls -i | wc -l' is.
bwoodacre · 482 weeks and 4 days ago
In my home directory, I got the following counts:

$ find . -type f | wc -l
15812
$ perl -e 'if(opendir D,"."){@a=readdir D;print $#a-1,"\n"}'
109
$ ls | wc -l
14
kzh · 482 weeks and 4 days ago
@kzh you forgot to time the results, which is what is being measured! Also, your home dir is a bad place to test: there are many hidden files/directories (which ls ignores but perl counts) as well as many levels of subdirectories (which find descends into but perl and ls do not). That is why your file counts vary.
bwoodacre · 482 weeks and 3 days ago
~$ time perl -e 'if(opendir D,"."){@a=readdir D;print $#a - 1,"\n"}'
280

real    0m0.117s
user    0m0.010s
sys     0m0.000s

~$ time { ls | wc -l; }
108

real    0m0.007s
user    0m0.010s
sys     0m0.000s

Looks like your perl line is ~15 times slower than ls on my machine.
tedkozma · 482 weeks and 3 days ago
@bwoodacre you could do...

perl -e 'use File::Find;$c=0;find sub {++$c;},".";print $c,"\n"'

...but it is not faster than find . | wc -l, which made me realize that

find . -maxdepth 1 | wc -l

...is faster than my original one-liner :} I'm going to make an edit with this comment. (A spelled-out version of the File::Find approach is sketched just below.)
recursiverse · 482 weeks and 3 days ago
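For reference, a spelled-out sketch of the File::Find approach from the previous comment; this version counts only regular files, whereas the one-liner counts every entry it visits, directories included:

#!/usr/bin/perl
# Recursive count using the core File::Find module.
# The -f test restricts the count to regular files; drop it to count
# every entry, as the one-liner above does.
use strict;
use warnings;
use File::Find;

my $count = 0;
find(sub { $count++ if -f }, '.');
print "$count\n";

Run from the directory of interest, this should give numbers roughly comparable to find . -type f | wc -l (symlink handling differs slightly).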
@tedkozma the perl one-liner should be faster on larger directories; my test directory had ~200k files.
recursiverse · 482 weeks and 3 days ago
A slight modification brings perl slightly ahead again (though not by much, and only for very large directories)...

perl -e 'if(opendir D,"."){++$c foreach readdir D}print $c,"\n"'
recursiverse · 482 weeks and 3 days ago
ls -U1 | wc -l is much faster because it doesn't have to read the whole directory listing into memory and sort it.
hfs · 433 weeks and 4 days ago
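To compare the readdir variants from this thread head to head without eyeballing time output, the core Benchmark module can be used. A rough sketch, assuming it is run inside the directory being counted (the numbers above came from the shell's time builtin, so they won't line up exactly):

#!/usr/bin/perl
# Rough comparison of the two readdir-based counts discussed above,
# using the core Benchmark module.
use strict;
use warnings;
use Benchmark qw(timethese);

timethese(100, {
    readdir_array => sub {
        opendir my $dh, '.' or die "opendir: $!";
        my @a = readdir $dh;
        closedir $dh;
        my $count = @a - 2;    # drop . and ..
    },
    readdir_foreach => sub {
        opendir my $dh, '.' or die "opendir: $!";
        my $count = 0;
        $count++ foreach readdir $dh;
        closedir $dh;
        $count -= 2;           # drop . and ..
    },
});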
