commandlinefu.com is the place to record those command-line gems that you return to again and again.
Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.
You can sign-in using OpenID credentials, or register a traditional username and password.
First-time OpenID users will be automatically assigned a username which can be changed after signing in.
Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes respectively - that way only the great commands get tweeted.
» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10
Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):
Prints the size of a directory in human-readable format (KB, MB, or GB). If you want to see the size of each file and directory inside, use the -a option as shown in the second output, and if you want a grand total then add the -c option :)
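For reference, a minimal sketch of the three variants (the path is just an example):
du -h /var/log        # per-directory sizes, human-readable
du -ah /var/log       # -a: include every file as well
du -ach /var/log      # -c: append a grand total at the end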
This command shows the size of directories below here, refreshing every 2s.
It will also track directories created after running the command (that's what the find bit does).
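A hedged sketch of the idea, assuming watch(1) with a 2-second interval (the exact original command isn't shown here):
watch -n2 'find . -maxdepth 1 -type d -print0 | xargs -0 du -sh'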
Why make it complicated? :]
--------------------
I just noticed someone else posted this on this site before me (sorry, I am now a duplicate :/)
Warning: if the pattern doesn't match anything, it shows the total size of the dot directory instead.
Tars a directory and compresses it while showing progress and enforcing disk I/O limits. Pipe Viewer (pv) can be used to view the progress of the task; it can also throttle the disk I/O, which is especially useful on running servers.
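A minimal sketch of the technique, assuming gzip and a 2 MB/s cap (the path and the limit are assumptions; pv's -L flag does the rate limiting):
tar -cf - /some/directory | pv -L 2m | gzip > directory.tgz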
Thanks for the submission! My alternative produces summaries only for directories. The original post additionally lists all files in the current directory, and sometimes the files just clutter up the output. Once the big directory is located, *then* worry about which file(s) are consuming so much space.
This is easy to type if you are looking for a few (hundred) "missing" megabytes (and don't mind the occasional K slipping in)...
A variation without false positives and also finding gigabytes (but - depending on your keyboard setup - more painful to type):
du -hs * | grep -P '^(\d|,)+(M|G)' | sort -n
(NOTE: you might want to replace the ',' according to your locale!)
Don't forget that you can modify the globbing as needed! (e.g. '.[^\.]* *' to include hidden files and directories with bash)
At its core, this is similar to:
No need to type out the full OR clause if you know which OS you're on, but this is easy to cut-and-paste or alias to get the top ten directories by their own (per-directory) size.
To avoid the error output from du -xSk you could always append 2>/dev/null, but then you might miss relevant stderr.
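Put together, a hedged reconstruction of the pattern under discussion (the starting path is an assumption):
du -xSk / 2>/dev/null | sort -rn | head -10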
Displays only the subtotal size of a directory with the -s option, in human-readable format.
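In other words (the path is illustrative):
du -sh /path/to/directory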
Provides numerically sorted human readable du output. I so wish there was just a du flag for this.
Full command below - the site would not let me put the full command in the text box:
du -sk ./* | sort -nr | awk 'BEGIN{ pref[1]="K"; pref[2]="M"; pref[3]="G";} { total = total + $1; x = $1; y = 1; while( x > 1024 ) { x = (x + 1023)/1024; y++; } printf("%g%s\t%s\n",int(x*10)/10,pref[y],$2); } END { y = 1; while( total > 1024 ) { total = (total + 1023)/1024; y++; } printf("Total: %g%s\n",int(total*10)/10,pref[y]); }'
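For what it's worth, GNU coreutils 7.5 and later can get close to this natively with sort's human-numeric flag (though without the grand total the awk version prints):
du -sh ./* | sort -rh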
A simple find -> xargs sort of thing that I get a lot of use out of. It helps find huge files and gives an example of how to use xargs to deal with them. Tested on OS X Snow Leopard (10.6). Enjoy.
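A hedged sketch of the kind of pipeline meant here (the original command isn't shown, and the ~100 MB threshold is an assumption):
find . -type f -size +100000k | xargs du -h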
If a directory name contains a space, xargs will do the wrong thing. GNU Parallel (https://savannah.nongnu.org/projects/parallel/) deals better with that.
This may seem like a long command, but it is great for making sure all file permissions are kept intact. What it does is stream the files from a sub-shell and then untar them in the target directory. Please note that the -z option should not be used for local copies: there will be no visible performance increase, and the compression overhead (CPU) will actually slow the copy down.
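A hedged sketch of the tar-pipe pattern described (the paths are assumptions; -p preserves permissions, and -z is deliberately omitted for the local case):
(cd /source/directory && tar cf - .) | (cd /target/directory && tar xpf -)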
You can also keep it simple with plain cp, though you don't get the progress info:
cp -rpf /some/directory /other/path
You set the file/dirname in the transfer variable and the destination path at the end point. This command uses Pipe Viewer to show progress, compresses the output on the fly, and takes care to change the ssh cipher. It supports dirnames with spaces.
Merged ideas and comments from http://www.commandlinefu.com/commands/view/4379/copy-working-directory-and-compress-it-on-the-fly-while-showing-progress and http://www.commandlinefu.com/commands/view/3177/move-a-lot-of-files-over-ssh
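A hedged reconstruction combining those pieces (the variable name, remote host, and cipher choice are assumptions; quoting "$FILE" is what keeps names with spaces working):
FILE="some directory"
tar -cf - "$FILE" | pv -s "$(du -sb "$FILE" | awk '{print $1}')" | gzip | ssh -c aes128-ctr user@remotehost 'cd /destination && tar -xzf -'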
The original didn't use -print0, which fails on weird file names, e.g. ones with spaces.
The original parsed the output of 'ls -l' which is always a bad idea.
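A minimal sketch of the safer pattern (NUL-delimited names, no parsing of ls):
find . -type f -print0 | xargs -0 du -k | sort -rn | head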
What happens here is we tell tar to create ("-c") an archive of all files in the current dir "." (recursively) and output the data to stdout ("-f -"). Next we specify the size ("-s") to pv of all files in the current dir: du -sb . | awk '{print $1}' returns the number of bytes in the current dir, and it gets fed as the "-s" parameter to pv. Next we gzip the whole content and output the result to the out.tgz file. This way pv knows how much data is still left to be processed and can show us, for example, that it will take another 4 mins 49 secs to finish.
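Reconstructed from that description, the command is presumably:
tar -cf - . | pv -s "$(du -sb . | awk '{print $1}')" | gzip > out.tgz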
Credit: Peteris Krumins http://www.catonmat.net/blog/unix-utilities-pipe-viewer/