Use awk to sum and print the space used by a group of files. It works well as long as the space used is not bigger than 79094548.80... I found that upper limit while trying to work out the total amount of recoverable space from a set of directories:

user@servername:/home/user/scripts>for dirName in aleph_bin aleph_sh aleph_work dailycheck INTERFAZ ; do echo "${dirName} = $(cat /tmp/purge_ocfs_dir.*.log | awk '{sum+=$5} END {printf "%4.2f", sum}') "; done
aleph_bin = 79094548.80
aleph_sh = 79094548.80
aleph_work = 79094548.80
dailycheck = 79094548.80
INTERFAZ = 79094548.80

In the worst-case scenario, the total might be almost 137G:

user@servername:/home/user/scripts>df -h /ocfs/*
Filesystem                   Size  Used Avail Use% Mounted on
//argalephfsprod/aleph_bin$  137G   38G   99G  28% /ocfs/aleph_bin
//argalephfsprod/aleph_sh$   137G   38G   99G  28% /ocfs/aleph_sh
//argalephfsprod/aleph_work$ 280G  135G  146G  49% /ocfs/aleph_work
//argalephfsprod/dailycheck$ 137G   38G   99G  28% /ocfs/dailycheck
//argalephfsprod/INTERFAZ/   137G   38G   99G  28% /ocfs/INTERFAZ

Any suggestion on how to get the correct amount of space when the total is over 80 Mbytes?
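One observation: awk sums in double-precision floating point, so it has no ceiling anywhere near 79094548.80. The identical per-directory totals more likely come from the glob `/tmp/purge_ocfs_dir.*.log`, which feeds the same set of log files to awk on every loop iteration instead of the logs for `${dirName}`. A minimal sketch of a per-directory sum, using self-created sample logs and a hypothetical one-log-per-directory naming scheme (`purge_ocfs_dir.<dirName>.log`), not the asker's actual log layout:

```shell
#!/bin/sh
# Sketch under assumptions: each directory has its own purge log whose
# 5th field is a byte count (hypothetical naming, for illustration only).
tmpdir=$(mktemp -d)

# Sample logs: note the first total is far above 79094548.80, which awk
# handles fine because it accumulates in a double.
printf 'x x x x 100000000000\nx x x x 23\n' > "$tmpdir/purge_ocfs_dir.aleph_bin.log"
printf 'x x x x 5\n' > "$tmpdir/purge_ocfs_dir.aleph_sh.log"

for dirName in aleph_bin aleph_sh ; do
    # Sum column 5 only from this directory's log, not from all logs.
    total=$(awk '{sum+=$5} END {printf "%.2f", sum}' \
        "$tmpdir"/purge_ocfs_dir."${dirName}".log)
    echo "${dirName} = ${total}"
done

rm -rf "$tmpdir"
```

Running it prints `aleph_bin = 100000000023.00` and `aleph_sh = 5.00`, i.e. different totals per directory and a sum well past the supposed limit. (Exact integer sums only start losing precision beyond 2^53, around 9 petabytes if the fields are bytes.)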