Commands using du (244)

  • Sort disk usage from directories in the current directory


    2
    du --max-depth=1 -h . | sort -h
    x3mboy · 2022-08-23 14:58:57 480
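
    The same result can be had with the short -d flag; a hedged equivalent, assuming GNU du 8.6+ or BSD du (sort -h itself needs GNU coreutils 7.5+):

        # short-flag equivalent of --max-depth=1; sort -h orders the human-readable sizes
        du -d 1 -h . | sort -h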

  • 1
    du -hc *
    eluis · 2009-02-05 17:04:21 18

  • 1
    du --max-depth=1 -m
    bseaver · 2009-02-16 15:48:12 260

  • 1
    sudo du -sh $(ls -d */) 2> /dev/null
    Code_Bleu · 2009-08-07 19:00:09 4
  • A bit smaller and faster, and it should handle files with special characters in the name (a shorter variant is sketched below).


    1
    find . -maxdepth 1 ! -name '.' -execdir du -0 -s {} + | sort -znr | gawk 'BEGIN{ORS=RS="\0";} {sub($1 "\t", ""); print $0;}' | xargs -0 du -hs
    ashawley · 2009-09-11 16:07:39 7
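
    If filenames with embedded newlines are not a concern, a much shorter hedged sketch gives a similar sorted, human-readable listing; it assumes GNU sort's -h option is available:

        # hand every top-level entry (including dotfiles) to a single du call,
        # then sort the human-readable sizes; spaces and most special characters are fine,
        # but embedded newlines would still confuse the line-oriented sort
        find . -maxdepth 1 ! -name '.' -exec du -sh {} + | sort -h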

  • 1
    watch -n 60 du /var/log/messages
    rbossy · 2009-10-27 14:53:41 3

  • 1
    du -ms * 2>/dev/null |sort -nr|head
    yooreck · 2009-11-23 16:06:40 3
  • A simple find -> xargs sort of thing that I get a lot of use out of. It helps find huge files and shows how to use xargs to deal with them. Tested on OS X Snow Leopard (10.6). Enjoy. A null-safe variant is sketched below.


    1
    find . -type f -size +1100000k |xargs -I% du -sh %
    4fthawaiian · 2010-01-31 22:04:07 8
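
    A hedged null-safe variant of the same idea, assuming GNU find and xargs, avoids trouble with spaces or quotes in filenames:

        # -print0 / -0 pass NUL-delimited names, so xargs never word-splits them
        find . -type f -size +1100000k -print0 | xargs -0 du -sh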
  • Tar a directory and compress it while showing progress and limiting disk I/O. Pipe Viewer (pv) displays the progress of the task and can also rate-limit disk reads, which is especially useful on busy servers. An extraction example follows below.


    1
    tar pcf - home | pv -s $(du -sb home | awk '{print $1}') --rate-limit 500k | gzip > /mnt/c/home.tar.gz
    Sail · 2010-04-02 15:29:03 6
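
    For the reverse direction, a hedged sketch (reusing the same example paths) restores the archive while pv reports progress:

        # pv knows the archive size, so it can show a percentage and ETA while tar extracts
        pv /mnt/c/home.tar.gz | tar xzpf -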

  • 1
    du -sm $dirname
    unixmonkey10174 · 2010-06-04 10:00:16 8
  • This command searches all subfolders of the current directory and lists the names of folders that contain less than 2 MB of data. I use it to clean up my mp3 archive; to delete the folders it finds, pipe the output to a text file and run: while read -r line; do rm -Rv "$line"; done < textfile (the full workflow is sketched below).


    1
    find . -type d -exec du -sk '{}' \; | awk '{ if ($1 <2000) print $0 }' | sed 's/^[0-9]*.//'
    mtron · 2010-06-16 09:37:56 3
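
    A hedged sketch of the clean-up workflow described above; the file name small_dirs.txt is just an example:

        # 1) collect directories under 2 MB into a file for review
        find . -type d -exec du -sk '{}' \; | awk '$1 < 2000' | sed 's/^[0-9]*.//' > small_dirs.txt
        # 2) inspect small_dirs.txt by hand, then delete what it lists
        while read -r line; do rm -Rv "$line"; done < small_dirs.txt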
  • Very handy if you are copying or downloading a file and want to know how big it is at any given moment (a watch-based alternative is sketched below).


    1
    while true; do du -s <file_or_directory>; sleep <time_interval>; done
    potatoface · 2010-08-24 19:55:13 3
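
    A hedged alternative is to let watch do the looping (procps watch assumed), with -sh for a human-readable total:

        # refresh the size summary every 10 seconds
        watch -n 10 du -sh <file_or_directory>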
  • Display the size (human-readable) of all the directories in your home directory (~).


    1
    du -sh ~/*
    unixmonkey13748 · 2010-11-05 10:20:16 6
  • Often you need to find the files that are taking up the most disk space in order to free up space asap. This can be run on the entire filesystem as root, or on a home directory, to find the largest files.


    1
    find / -type f 2>/dev/null | xargs du 2>/dev/null | sort -n | tail -n 10 | cut -f 2 | xargs -n 1 du -h
    mxc · 2010-11-09 13:45:11 6
  • Files larger than 500M, sorted by size.


    1
    find . -type f -size +500M -exec du {} \; | sort -n
    PhillipNordwall · 2010-11-09 18:15:44 3

  • 1
    find / -type f -size +100M -exec du {} \; | sort -n | tail -10 | cut -f 2
    PhillipNordwall · 2010-11-09 18:34:49 3
  • This combines the two commands above into one. Note that you can leave off the last two stages and simply run "find /home/ -type f -exec du {} \; 2>/dev/null | sort -n | tail -n 10"; the last two stages only convert the output into a human-readable format (a numfmt variant is sketched below).


    1
    find /home/ -type f -exec du {} \; 2>/dev/null | sort -n | tail -n 10 | xargs -n 1 du -h 2>/dev/null
    mxc · 2010-11-10 07:24:17 3
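
    A hedged variant lets numfmt do the human-readable conversion instead of re-running du on each file; it assumes GNU coreutils 8.21 or later and du's default 1 KiB units:

        # --from-unit=1024 treats the first field as KiB before converting it to an IEC suffix
        find /home/ -type f -exec du {} \; 2>/dev/null | sort -n | tail -n 10 | numfmt --field=1 --from-unit=1024 --to=iec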
  • Option breakdown: -a also lists files, not just directories; -k displays sizes in kilobytes; sort -n sorts numerically, biggest last; tail -10 shows only the biggest 10. The original note claims -m keeps du from crossing mount points, but in GNU du that is what -x does (-m selects megabyte units); since you usually run this to find what to delete within one partition, a variant using -x is sketched below.


    1
    du . -mak|sort -n|tail -10
    georgesdev · 2010-12-03 19:28:55 2
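
    A hedged variant that really does stay on one filesystem, using GNU du's -x flag:

        # -a lists files too, -k forces kilobytes, -x keeps du from crossing mount points
        du -akx . | sort -n | tail -10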
  • Show the top 10 files by size.


    1
    find -type f | xargs -I{} du -sk "{}" | sort -rn | head
    glaudiston · 2011-01-04 11:10:02 3
  • If you only use -m or -k, you have to remember whether the figures are in megabytes or kilobytes. Using -B prints the unit alongside each size, which makes the result quicker to read. You can try -B K as well.


    1
    du --max-depth=1 -B M |sort -rn
    unixmonkey20397 · 2011-04-12 15:01:12 7
  • As per eightmillion's comment. Simply economical :)


    1
    du -h | sort -hr
    mooselimb · 2011-11-06 23:15:36 3

  • 1
    du --max-depth=1 | sort -nr | awk ' BEGIN { split("KB,MB,GB,TB", Units, ","); } { u = 1; while ($1 >= 1024) { $1 = $1 / 1024; u += 1 } $1 = sprintf("%.1f %s", $1, Units[u]); print $0; } '
    threv · 2011-12-08 17:43:09 4
  • This one-line Perl script displays file sizes, from smallest to largest, for all directories on a server.


    1
    du -k | sort -n | perl -ne 'if ( /^(\d+)\s+(.*$)/){$l=log($1+.1);$m=int($l/log(1024)); printf ("%6.1f\t%s\t%25s %s\n",($1/(2**(10*$m))),(("K","M","G","T","P")[$m]),"*"x (1.5*$l),$2);}' | more
    Q_Element · 2012-02-07 15:49:19 10
  • From my bashrc ;) (a function wrapper you can drop into ~/.bashrc is sketched below).


    1
    find . -mount -type f -printf "%k %p\n" | sort -rg | cut -d \ -f 2- | xargs -I {} du -sh {} | less
    bashrc · 2012-03-30 07:37:52 3
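
    A hedged sketch of how this might sit in ~/.bashrc as a function; the name "ducks" is made up here:

        # largest files on the current filesystem, human-readable, paged
        ducks() {
            find . -mount -type f -printf "%k %p\n" | sort -rg | cut -d ' ' -f 2- \
                | xargs -I {} du -sh {} | less
        }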
  • This command gives a human-readable result without messing up the sorting.


    1
    for i in G M K; do du -hx /var/ | grep [0-9]$i | sort -nr -k 1; done | less
    jlaunay · 2012-06-26 22:57:17 6