
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).



News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try and put an end to the spamming, new commands now require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.


Terminal - Commands using du - 190 results
while (( 1==1 )); do du -c . >> output.log; sleep 2; done & tail -f output.log
2010-07-12 17:23:45
User: aceiro
Functions: du sleep tail
-5

Logs the space used under the current directory (in this case a postgres directory) every two seconds; the loop is backgrounded with & so that tail -f can follow the log as it grows.

du -shc .[^.]* * | grep '[MG]'
2010-07-06 10:13:18
User: rubo77
Functions: du grep
-2

Shows only the folders that are MB or GB in total size.

find . -type d -exec du -sk '{}' \; | awk '($1 < 2048) {print $2}'
2010-06-16 11:53:14
User: putnamhill
Functions: awk du find
4

Just shortened the awk a bit and removed sed. Edit: I'm assuming there are no spaces in the path. To support whitespace in the pathname, try:

awk '($1 < 2048) {sub(/^[0-9]+[ \t]+/,""); print $0}'
find . -type d -exec du -sk '{}' \; | awk '{ if ($1 <2000) print $0 }' | sed 's/^[0-9]*.//'
2010-06-16 09:37:56
User: mtron
Functions: awk du find sed
2

This command searches all subfolders of the current directory and lists the names of the folders which contain less than 2 MB of data. I use it to clean up my mp3 archive. To delete the found folders, pipe the output to a text file and run:

while read -r line; do rm -Rv "$line"; done < textfile
du -hs /path/to/target
du -sm $dirname
du -h <Directory>
2010-06-04 03:14:37
User: Vasudev
Functions: du
-6

Prints the size of the directory in human-readable format (KB, MB or GB). If you want to see the size of each file and subdirectory inside it, add the -a option; if you want a grand total as well, add the -c option :)
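
For instance (a sketch; <Directory> is a placeholder for your own path):

du -ah <Directory>

du -ach <Directory>

The first lists every file and subdirectory with its size; the second appends a grand total line.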

watch 'find -maxdepth 1 -mindepth 1 -type d |xargs du -csh'
2010-05-19 13:13:57
User: shadycraig
Functions: du watch xargs
0

This command shows the size of directories below here, refreshing every 2s.

It will also pick up directories created after the command was started (that's what the find bit does).

du -s * | sort -nr | head
du -hs */
2010-04-11 22:48:31
User: manurevah
Functions: du
8

Why make it complicated? :]

--------------------

I just noticed someone else has posted this on this site before me (sorry I am now a duplicate :/)

http://www.commandlinefu.com/commands/view/4313

du -sh `ls -p | grep /`
du -sh * | grep -v '\.\/\.'
du --max-depth=1 | grep -v '\.\/\.'
du -cks * | sort -rn | while read size fname; do for unit in k M G T P E Z Y; do if [ $size -lt 1024 ]; then echo -e "${size}${unit}\t${fname}"; break; fi; size=$((size/1024)); done; done
find . -name 'pattern' | xargs du -hc
tar pcf - home | pv -s $(du -sb home | awk '{print $1}') --rate-limit 500k | gzip > /mnt/c/home.tar.gz
2010-04-02 15:29:03
User: Sail
Functions: awk du gzip tar
1

Tars and compresses a directory while showing progress and limiting disk I/O. Pipe Viewer (pv) displays the progress of the task and can also rate-limit the disk I/O, which is especially useful on busy servers.

du -kd | egrep -v "/.*/" | sort -n
2010-03-30 15:40:35
User: rmbjr60
Functions: du egrep sort
-1

Thanks for the submit! My alternative produces summaries only for directories. The original post additionally lists all files in the current directory; sometimes the files just clutter up the output. Once the big directory is located, *then* worry about which file(s) are consuming so much space.

du -hs *|grep M|sort -n
2010-03-25 19:20:24
User: tuxlifan
Functions: du grep sort
3

This is easy to type if you are looking for a few (hundred) "missing" megabytes (and don't mind the occasional K slipping in)...

A variation without false positives and also finding gigabytes (but - depending on your keyboard setup - more painful to type):

du -hs *|grep -P '^(\d|,)+(M|G)'|sort -n

(NOTE: you might want to replace the ',' according to your locale!)

Don't forget that you can modify the globbing as needed! (e.g. '.[^\.]* *' to include hidden files and directories (w/ bash))
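
Putting the two together, a sketch that also covers hidden entries (again, adjust the ',' for your locale):

du -hs .[^.]* * | grep -P '^(\d|,)+(M|G)' | sort -n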

At its core this is similar to:

http://www.commandlinefu.com/commands/view/706/show-sorted-list-of-files-with-sizes-more-than-1mb-in-the-current-dir

( du -xSk || du -kod ) | sort -nr | head
2010-03-16 04:05:14
Functions: du sort
4

No need to type out the full OR clause if you know which OS you're on, but this version is easy to cut-and-paste or alias to get the top ten directories by singleton.

To avoid the error output from du -xSk you could always redirect it with 2>/dev/null, but you might miss relevant STDERR.
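
For example, one way to apply that suggestion (silencing only the first du, so the fallback still reports its own errors):

( du -xSk 2>/dev/null || du -kod ) | sort -nr | head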

du -sh some/directory
2010-02-21 02:08:28
User: Jacolyte
Functions: du
-5

Displays only the subtotal size of a directory (the -s option), in human-readable format.

du -x --max-depth=1 | sort -n | awk '{ print $2 }' | xargs du -hx --max-depth=0
2010-02-18 19:46:47
User: d34dh0r53
Functions: awk du sort xargs
4

Provides numerically sorted human readable du output. I so wish there was just a du flag for this.
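
As an aside (not part of the original post): GNU coreutils 7.5 and later ship sort -h, which compares human-readable sizes directly, so a sketch like the following gets a similar result in one pass:

du -hx --max-depth=1 | sort -h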

du -sk ./* | sort -nr
2010-02-04 04:08:05
User: op4
Functions: du sort
4

Full command below; the text box would not let me submit it in full.

du -sk ./* | sort -nr | awk 'BEGIN{ pref[1]="K"; pref[2]="M"; pref[3]="G";} { total = total + $1; x = $1; y = 1; while( x > 1024 ) { x = (x + 1023)/1024; y++; } printf("%g%s\t%s\n",int(x*10)/10,pref[y],$2); } END { y = 1; while( total > 1024 ) { total = (total + 1023)/1024; y++; } printf("Total: %g%s\n",int(total*10)/10,pref[y]); }'

find . -type f -size +1100000k | xargs -I% du -sh %
2010-01-31 22:04:07
User: 4fthawaiian
Functions: du find xargs
1

A simple find -> xargs sort of thing that I get a lot of use out of. It helps find huge files and gives an example of how to use xargs to deal with them. Tested on OS X Snow Leopard (10.6). Enjoy.

du -s * | sort -nr | head | cut -f2 | parallel -k du -sh
2010-01-28 12:59:14
Functions: cut du head sort
Tags: du xargs parallel
-2

If a directory name contains a space, xargs will do the wrong thing. Parallel (https://savannah.nongnu.org/projects/parallel/) deals better with that.
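
For comparison, a rough xargs-based workaround (a sketch that assumes GNU xargs and no newlines in directory names):

du -s * | sort -nr | head | cut -f2 | tr '\n' '\0' | xargs -0 -n1 du -sh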

dir='path to file'; tar cpf - "$dir" | pv -s $(du -sb "$dir" | awk '{print $1}') | tar xpf - -C /other/path
2010-01-19 19:05:45
User: starchox
Functions: awk dir du tar
Tags: copy tar cp
-2

This may seem like a long command, but it is great for making sure all file permissions are kept intact. It streams the files in a sub-shell and then untars them in the target directory. Please note that the -z option should not be used for local copies: no performance increase will be visible, and the processing overhead (CPU) will actually slow down the copy.

You may also keep it simple with the following, though you don't get the progress info:

cp -rpf /some/directory /other/path