Search for files and list the 20 largest.
find . -type f
gives us a list of files, recursively, starting from here (.)
-print0 | xargs -0 du -h
separates the names of the files with NULL characters, so we're not confused by spaces
then xargs runs the du command to find their sizes (in human-readable form -- 64M, not 64123456)
| sort -hr
uses sort to arrange the list in size order; sort -h knows that 1M is bigger than 9K
| head -20
finally, selects only the top twenty from the list (the complete pipeline is shown below)
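Putting the pieces together, the whole thing reads:
find . -type f -print0 | xargs -0 du -h | sort -hr | head -20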
This will quickly display files last changed in a directory, with the newest on top.
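The command itself isn't shown in this excerpt; a minimal way to get that effect is to sort a long listing by modification time and keep only the first few lines:
ls -lt | head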
Original command:
cat "log" | grep "text to grep" | awk '{print $1}' | sort -n | uniq -c | sort -rn | head -n 100
This wastes a cat and a grep, especially when awk is already being used.
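For example, awk can open the file and do the filtering itself, so both the cat and the grep can be dropped while the rest of the pipeline stays unchanged (a sketch, reusing the same placeholder file name and pattern):
awk '/text to grep/ {print $1}' "log" | sort -n | uniq -c | sort -rn | head -n 100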
Determines your most popular commands from your shell history, expressed as a percentage of total use.
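One way to compute this (a sketch, assuming bash, whose history output puts the command name in the second field):
history | awk '{count[$2]++; total++} END {for (cmd in count) printf "%6.2f%%  %s\n", count[cmd]*100/total, cmd}' | sort -rn | head -10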
Reads n lines from stdin and puts the contents in a variable. Yes, I know the read command and its options, but I find this logical even for one line.
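A minimal sketch of one way to do this (assuming n holds the number of lines to read; note that command substitution strips trailing newlines):
var=$(head -n "$n")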
Find the 5 largest files
This command is more robust because it handles spaces, newlines and control characters in filenames. It uses printf, not ls, to determine file size.
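The command itself isn't included in this excerpt; a sketch along the lines described, assuming GNU find's -printf, would be:
find . -type f -printf '%s %p\n' | sort -rn | head -5
Here %s is the size in bytes and %p the path, so no ls output is parsed; file names containing embedded newlines would still split across lines, though.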