Count all files in a directory

ls | wc -l
ls prints one file name per line when its output goes to a pipe rather than a terminal (so the -1 flag is not actually needed), and wc -l counts the lines received from the previous command.
Sample Output
345
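
Note that the count includes directories, and a file name containing a newline is counted more than once. A more robust sketch, assuming GNU find (which provides -printf), prints one character per directory entry and counts characters instead of lines:

    find . -mindepth 1 -maxdepth 1 -printf '.' | wc -c

Add -type f before -printf to count only regular files.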

Score: -3
By: Sebasg · 2013-01-22 03:35:35
Tags: ls, wc

These Might Interest You

  • You can simply run "largest" to list the 10 largest files/directories in ./, or pass two parameters: the first is the directory, the second is the number of entries to display. It's best to put this in your .bashrc or .bash_profile.


    largest() { dir=${1:-"./"}; count=${2:-"10"}; echo "Getting top $count largest files in $dir"; du -sx "$dir/"* | sort -nk 1 | tail -n $count | cut -f2 | xargs -I file du -shx file; }
    jhyland87 · 2013-01-21 09:45:21
  • Just want to post a Perl alternative; it does not count hidden files (dotfiles). The prefix ~~ forces scalar context, so map returns the number of matching entries rather than a list of their sizes.


    perl -le 'print ~~ map {-s} <*>'
    MarxBro · 2012-02-21 21:09:48
  • This example sums the sizes of all PDF files in the /tmp directory and its subdirectories (in bytes). Replace "/tmp" with a directory of your choice and "\*pdf" (or even "-iname \*pdf") with your own pattern to match a specific type of file. You can also change the du parameter to count kilobytes or megabytes, but because du rounds each value, the sum will not be exact (especially when counting megabytes over many small files). In some cases you could probably use something like: du -cb `find /tmp -type f -iname \*pdf`|tail -n 1 But be aware that this second command CANNOT handle files with spaces in their names, and it will mislead you if some files matching the pattern are ones you don't have rights to read. The first one-liner is resistant to such problems (it will not count the sizes of files you cannot read, but it will give you the correct sum of the rest); a null-safe variant is sketched just after this list.


    SUM=0; for FILESIZE in `find /tmp -type f -iname \*pdf -exec du -b {} \; 2>/dev/null | cut -f1` ; do (( SUM += $FILESIZE )) ; done ; echo "sum=$SUM"
    alcik · 2009-03-05 17:16:52
  • I created this command to give me a quick overview of how many files of each type a directory, and all its subdirectories, contain. It works from the file extension rather than file(1)'s magic output, because that turned out to be more accurate and less confusing. Files without an extension (e.g. README) are generally not important for me to count, but you're free to customize this to fit your needs.


    printf "\n%25s%10sTOTAL\n" 'FILE TYPE' ' '; for ext in $(find . -iname \*.* | egrep -o '\.[^[:space:].]+$' | egrep -v '\.svn*' | sort -f | uniq -i); do count=$(find . -iname \*$ext | wc -l); printf "%25s%10s%d\n" $ext ' ' $count; done
    rkulla · 2010-04-16 21:12:11
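
For the PDF-size example above, a null-safe sketch (assuming GNU find and GNU du, which provide -print0 and --files0-from) passes file names with spaces intact:

    find /tmp -type f -iname '*.pdf' -print0 | du -cb --files0-from=- | tail -n 1

du reads the NUL-separated names from standard input, so no word splitting takes place and the grand total on the last line is the sum in bytes.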

What Others Think

-1 is not necessary since the output of ls is not going to a terminal.
cfajohnson · 282 weeks ago
Wow, so ls behaves differently when its output goes to a terminal than when it goes to a pipe! I didn't know that could happen. That could have been a big source of hard-to-find bugs for me!
Sebasg · 282 weeks ago
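
A quick way to see the difference yourself (a small sketch; the exact column layout depends on your terminal width):

    ls          # to a terminal: names laid out in columns
    ls | cat    # through a pipe: one name per line, exactly what wc -l receives

Both GNU and BSD ls switch to one-name-per-line output when stdout is not a terminal, which is why the -1 flag is redundant in the command above.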
