Commands using du (213)

  • Our monitoring showed that the "/" filesystem was >90% full, and this command quickly told me which subdirectories were the biggest. The system has many NFS mounts, hence the -x to stay on the local filesystem. A depth-limited variant is sketched after this entry.


    2
    du -x / | sort -rn | less
    harpo · 2012-06-26 15:29:26 2
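
    A hedged variant of the same idea, limiting the output to two directory levels and the 25 largest entries so there is less to page through (GNU du and coreutils assumed):

    du -x --max-depth=2 / | sort -rn | head -25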

  • 0
    du -hs * | sort -h
    Woems · 2012-06-15 14:13:06 0

  • 0
    du -cah /path/to/folder/ | grep total
    ekinertac · 2012-05-29 23:30:03 0
  • Search for files and list the 20 largest. find . -type f gives us a list of files, recursively, starting from here (.). -print0 | xargs -0 du -h separates the file names with NUL characters so we're not confused by spaces, then xargs runs du to find their sizes (in human-readable form -- 64M, not 64123456). | sort -hr uses sort to arrange the list in size order; sort -h knows that 1M is bigger than 9K. | head -20 finally selects only the top twenty from the list. An alternative without xargs is sketched after this entry.


    9
    find . -type f -print0 | xargs -0 du -h | sort -hr | head -20
    flatcap · 2012-03-30 10:21:12 3
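
    A hedged alternative, assuming GNU findutils: -exec ... + batches file names much like xargs does, so the -print0/xargs -0 pair can be dropped while still handling spaces safely:

    find . -type f -exec du -h {} + | sort -hr | head -20
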
  • from my bashrc ;)


    1
    find . -mount -type f -printf "%k %p\n" | sort -rg | cut -d \ -f 2- | xargs -I {} du -sh {} | less
    bashrc · 2012-03-30 07:37:52 0

  • 0
    du -s $(ls -l | grep '^d' | awk '{print $9}') | sort -nr
    j3ffyang · 2012-03-15 09:04:13 0
  • This one-line Perl script displays directory sizes from smallest to largest for every directory on the server, converting du's kilobyte counts into human-readable units plus a bar of asterisks indicating relative size. A simpler numfmt-based sketch follows this entry.


    1
    du -k | sort -n | perl -ne 'if ( /^(\d+)\s+(.*$)/){$l=log($1+.1);$m=int($l/log(1024)); printf ("%6.1f\t%s\t%25s %s\n",($1/(2**(10*$m))),(("K","M","G","T","P")[$m]),"*"x (1.5*$l),$2);}' | more
    Q_Element · 2012-02-07 15:49:19 0
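
    A simpler sketch of the size-to-human-readable conversion, assuming GNU coreutils numfmt is available (it skips the asterisk bar):

    du -k | sort -n | numfmt --to=iec --from-unit=1024 --field=1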

  • 1
    du --max-depth=1 | sort -nr | awk ' BEGIN { split("KB,MB,GB,TB", Units, ","); } { u = 1; while ($1 >= 1024) { $1 = $1 / 1024; u += 1 } $1 = sprintf("%.1f %s", $1, Units[u]); print $0; } '
    threv · 2011-12-08 17:43:09 1

  • 0
    du -ms * | sort -nr
    ronocdh · 2011-12-06 22:33:52 0
  • All folders, human-readable, no per-subfolder breakdown, with a grand total. Even shorter.


    -1
    du -sch *
    anarcat · 2011-12-06 18:38:20 0
  • I'm using gawk; your mileage may vary with other awk varieties. You might want to change the / after du to, say, /home/ or /var, otherwise this command can take quite some time to complete. Sorry it's so obfuscated -- I had to turn a script into a one-liner under 255 characters for commandlinefu. Note: the bar length is relative, so the entry with the highest share of the total disk "anchors" the rest of the graph. EDIT: the math was slightly wrong, fixed it. Also made it compliant with older versions of df. A stripped-down sketch of the relative-bar idea follows this entry.


    13
    t=$(df|awk 'NR!=1{sum+=$2}END{print sum}');sudo du / --max-depth=1|sed '$d'|sort -rn -k1 | awk -v t=$t 'OFMT="%d" {M=64; for (a=0;a<$1;a++){if (a>c){c=a}}br=a/c;b=M*br;for(x=0;x<b;x++){printf "\033[1;31m" "|" "\033[0m"}print " "$2" "(a/t*100)"% total"}'
    kevinquinnyo · 2011-12-01 01:21:11 9
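
    A minimal sketch of the same relative-bar idea, without the colours or the df grand total; the trailing total line is dropped with sed and bar length is scaled against the largest remaining entry (GNU du and awk assumed):

    sudo du / --max-depth=1 2>/dev/null | sed '$d' | sort -rn | awk 'NR==1{max=$1} {n=int(60*$1/max); bar=""; for(i=0;i<n;i++) bar=bar "|"; print bar " " $2}'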
  • This command simply outputs the 10 entries in the current directory that take up the most space on your disk, in human-readable form.


    4
    du -sh * | sort -rh | head
    sirex · 2011-11-16 06:01:02 0
  • In this case I'm just grabbing the next level of subdirectories (and same-level regular files) with the --max-depth=1 flag; leaving out that flag just gives you finer resolution. Note that you have to use the -h switch with both 'du' and 'sort'.


    13
    du -h --max-depth=1 |sort -rh
    jambino · 2011-11-15 20:30:00 6
  • as per eightmillion's comment. Simply economical :)


    1
    du -h | sort -hr
    mooselimb · 2011-11-06 23:15:36 0
  • 6 characters counting whitespace!


    0
    du -sh *
    JeremyinNC · 2011-11-02 16:24:05 0
  • Shows the size of the directory the command is run in, in human-readable units (K, M or G). There is no need to type the path; it's the current working directory. An equivalent without the backticks is shown after this entry.


    0
    du -sh `pwd`
    djkee · 2011-10-30 08:47:23 0
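
    Since `pwd` simply expands to the current working directory, the command substitution is not strictly needed; an equivalent sketch:

    du -sh .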

  • 17
    du -h /path | sort -h
    moogmusic · 2011-09-02 13:26:23 5
  • Use this to identify whether directories mostly contain large or small files: for each directory it prints the total size divided by the number of entries, i.e. roughly the average size per file in 1K blocks. A plain-shell version without GNU parallel is sketched after this entry.


    2
    parallel echo -n {}"\ "\;echo '$(du -s {} | awk "{print \$1}") / $(find {} | wc -l)' \| bc -l ::: *
    unixmonkey8046 · 2011-07-28 12:21:34 1
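
    A hedged sketch of the same per-directory average (total 1K blocks divided by entry count) as a plain shell loop over the directories in the current path, for systems without GNU parallel:

    for d in */; do printf '%s %s\n' "$d" "$(echo "$(du -s "$d" | awk '{print $1}') / $(find "$d" | wc -l)" | bc -l)"; done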

  • 0
    du -h / | grep -w "[0-9]*G"
    pashutinsky · 2011-07-23 19:02:11 0
  • Specify the size in bytes using the 'c' suffix of the -size test; the + sign reads as "bigger than". Then execute du on the matches, sort in reverse numeric order and show the first 10 entries. With GNU find the size can also be given directly in megabytes, as sketched after this entry.


    2
    find /myfs -size +209715200c -exec du -m {} \; |sort -nr |head -10
    arlequin · 2011-07-07 21:12:46 0
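
    A hedged equivalent assuming GNU findutils, where -size accepts an M suffix (200M equals the 209715200 bytes above) and -exec ... + batches the du calls:

    find /myfs -size +200M -exec du -m {} + | sort -nr | head -10
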
  • This command lists all the directories in SEARCHPATH by size, displaying their sizes in a human-readable format.


    -2
    SEARCHPATH=/var/; find $SEARCHPATH -type d -print0 | xargs -0 du -s 2> /dev/null | sort -nr | sed 's|^.*'$SEARCHPATH'|'$SEARCHPATH'|' | xargs du -sh 2> /dev/null
    moogmusic · 2011-07-06 08:21:58 0
  • Even simpler! With du, the -s and -c flags summarize and print a grand total of all files, recursively. The -b flag prints the sizes in bytes; you can use the -h flag instead for human-readable output.


    2
    du -scb
    bbbco · 2011-06-27 14:20:11 1

  • 0
    du -kh --max-depth=1 | sort -n |head
    cyberion11 · 2011-04-26 13:32:21 0
  • If you only use -m or -k, you have to remember whether the figures are in megabytes or kilobytes. Using -B prints the unit next to each size, so the result is faster to read. You can try -B K as well, as shown after this entry.


    1
    du --max-depth=1 -B M |sort -rn
    unixmonkey20397 · 2011-04-12 15:01:12 0
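
    The -B K variant mentioned above, printing sizes in kilobytes with the unit attached:

    du --max-depth=1 -B K | sort -rn
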
  • Credit goes to brun65i, but he posted it as a comment instead of as an alternative. I hadn't noticed the -h option on sort before, and this seems like the cleanest alternative. Thanks Brun65i!


    0
    du -h --max-depth=1 | sort -hr
    splante · 2011-04-07 18:01:18 0