commandlinefu.com is the place to record those command-line gems that you return to again and again.
Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.
You can sign in using OpenID credentials, or register a traditional username and password.
First-time OpenID users will be automatically assigned a username which can be changed after signing in.
Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.
Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions, …).
Wow, didn't really expect you to read this far down. The latest iteration of the site is in open beta. It's a gentle open beta, not in prime time just yet. It's being hosted over at UpGuard and you are more than welcome to give it a shot. A couple of things:
Thanks for the submission! My alternative produces summaries only for directories; the original post additionally lists all files in the current directory. Sometimes the files just clutter up the output. Once the big directory is located, *then* worry about which file(s) are consuming so much space.
sorts the files by integer megabytes, which should be enough to (interactively) find the space wasters. Now you can type
dush
for the above output, or
dush -n 3
for only the 3 biggest files, and so on. It's always a good idea to have this line in your .profile or .bashrc.
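The alias itself isn't reproduced above. A minimal sketch of what it could look like, assuming du -sm for the integer-megabyte summaries and tail so that extra arguments such as -n 3 get passed straight through:

alias dush='du -sm * | sort -n | tail'

With that definition, dush -n 3 expands to du -sm * | sort -n | tail -n 3, which is why only the biggest entries are shown.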
This is easy to type if you are looking for a few (hundred) "missing" megabytes (and don't mind the occasional K slipping in)...
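The command under discussion isn't shown here; presumably it is something along the lines of:

du -hs * | grep M | sort -n

The lone M matches megabyte totals, but also any entry whose name happens to contain an M, which is how the occasional K-sized line slips in.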
A variation without false positives and also finding gigabytes (but - depending on your keyboard setup - more painful to type):
du -hs *|grep -P '^(\d|,)+(M|G)'|sort -n
(NOTE: you might want to replace the ',' according to your locale!)
Don't forget that you can modify the globbing as needed, e.g. '.[^\.]* *' to include hidden files and directories (with bash).
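Putting the two together, the hidden-files variant would read:

du -hs .[^\.]* * | grep -P '^(\d|,)+(M|G)' | sort -n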
At its core, similar to:
Just how much space are those zillions of database logs taking up? How much will you gain at a compression rate of, say, 80%? This little line gives you a good start for your calculations.
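A sketch of the kind of calculation meant, using a hypothetical /var/log/db/ path and du -sm for megabyte figures (80% compression leaves roughly 20% of the size):

du -sm /var/log/db/*.log | awk '{total+=$1} END {printf "%d MB now, ~%d MB after compression\n", total, total*0.2}'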
This allows the output to be sorted from largest to smallest in human-readable format.
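The entry being commented on isn't reproduced; with GNU coreutils (sort -h appeared in 7.5) it would typically be:

du -sh * | sort -rh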
OSX's BSD version of the du command uses the -d argument instead of --max-depth.
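For illustration, the equivalent invocations (/some/dir is a placeholder):

du -h --max-depth=1 /some/dir    # GNU du (Linux)
du -h -d 1 /some/dir             # BSD du (OS X)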
ncdu is a text-mode ncurses-based disk usage analyzer, useful for when you want to see where all your space is going. For a single flat directory it isn't more elaborate than a du|sort or some such thing, but it analyzes all directories below the one you specify, so space consumed by files inside subdirectories is taken into account. This way you get the full picture. Features: file deletion, showing file size or size on disk, and refreshing as contents change. Homepage: http://dev.yorhel.nl/ncdu
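A typical invocation just points it at the directory you want to analyze; the -x flag keeps ncdu from crossing filesystem boundaries, which is handy when scanning /:

ncdu /var
ncdu -x /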
Shred can be used to wipe a given partition or a complete disk. This should ensure that no data is left on the disk.
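A sketch of such an invocation; /dev/sdX is a placeholder, so triple-check the device name before running anything like this:

shred -v -n 1 -z /dev/sdX    # one overwrite pass, then a final pass of zeros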
Very useful when you need disk space. It calculates the disk usage of all files and directories (descending into them) located in the current directory, including hidden ones. Then sort puts them in order.
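The command itself isn't shown; a sketch matching that description, with .[!.]* pulling in the hidden entries and du -sk giving kilobyte summaries:

du -sk .[!.]* * | sort -n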