Commands tagged "file size" (8)


  • 9
    find . -type f -print0 | xargs -0 du -h | sort -hr | head -10
    netaxiz · 2012-06-30 10:03:31 1

  • 2
    find . -type f -print0 | xargs -0 du -h | sort -hr | head
    mesuutt · 2012-06-29 12:43:06 3
  • You can simply run "largest" to list the top 10 files/directories in ./, or pass two parameters: the first is the directory, the second the number of entries to display. Best put this in your .bashrc or .bash_profile.


    1
    largest() { dir=${1:-"./"}; count=${2:-"10"}; echo "Getting top $count largest files in $dir"; du -sx "$dir/"* | sort -nk 1 | tail -n $count | cut -f2 | xargs -I file du -shx file; }
    jhyland87 · 2013-01-21 09:45:21 0
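
    A self-contained demo of the function above, run against a throwaway directory created with mktemp so it is safe to try anywhere (the demo filenames are my own, not from the entry):

    ```shell
    # The largest() helper from the entry above, demonstrated on a scratch dir.
    largest() { dir=${1:-"./"}; count=${2:-"10"}; echo "Getting top $count largest files in $dir"; du -sx "$dir/"* | sort -nk 1 | tail -n $count | cut -f2 | xargs -I file du -shx file; }

    demo=$(mktemp -d)                                         # throwaway directory
    dd if=/dev/zero of="$demo/big" bs=1k count=64 2>/dev/null
    dd if=/dev/zero of="$demo/small" bs=1k count=4 2>/dev/null

    largest "$demo" 2   # lists both files, largest last
    rm -rf "$demo"
    ```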

  • 1
    find . -type f -size +100M
    chikondi · 2013-02-07 11:58:10 0
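
    The -size test alone only prints paths; pairing it with an action shows the sizes too. A sketch (the du -h action is my addition, not part of the entry):

    ```shell
    # -size +100M matches files strictly larger than 100 MiB; -exec du -h
    # prints each match with a human-readable size.
    find . -type f -size +100M -exec du -h {} + 2>/dev/null
    ```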
  • Here's a way to wait for a file (a download, a logfile, etc.) to stop changing, then do something. As written it just returns to the prompt, but you could append "; echo DONE" or whatever at the end. It compares the full output of "ls -l --full-time" every 10 seconds and keeps looping as long as that output has changed since the last interval. If the file is being appended to, the size changes; if it is being modified without growing, the timestamp from the "--full-time" option changes. Plain "ls -l" isn't sufficient, since by default it shows minutes but not seconds. Waiting for a file to stop changing is not a very elegant or reliable way to tell that a process is finished - if you know the process ID there are much better ways. This method will also give a false positive if writes to the target file are delayed longer than the sleep interval for any reason (network timeouts, etc.). But sometimes the process writing the file doesn't exit and just moves on to something else, so this approach can be useful if you understand its limitations.


    1
    while [ "$(ls -l --full-time TargetFile)" != "$a" ] ; do a=$(ls -l --full-time TargetFile); sleep 10; done
    dmmst19 · 2015-05-09 03:19:49 1
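
    The one-liner can be packaged as a reusable function; the name wait_until_stable and the interval parameter below are my own additions, a sketch of the same idea (--full-time is GNU ls, as in the original):

    ```shell
    # Same polling idea as the entry above, wrapped in a function.
    # Loops while the "ls -l --full-time" snapshot keeps changing; returns
    # once two consecutive snapshots match (or the file disappears).
    wait_until_stable() {
        target=$1
        interval=${2:-10}
        prev=""
        while cur=$(ls -l --full-time "$target" 2>/dev/null) && [ "$cur" != "$prev" ]; do
            prev=$cur
            sleep "$interval"
        done
    }

    # Example: wait_until_stable download.iso 10 && echo DONE
    ```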
  • This command does a basic find by size, with a clearer printout than the default. Adjust the ./ to alter the path, and "-size +100000k" to set the size to search for. Note that the awk field holding the filename depends on your ls date format: with GNU ls's usual three-field dates the name is $9, not $8.


    0
    find ./ -type f -size +100000k -exec ls -lh {} \; 2>/dev/null| awk '{ print $8 " : " $5}'
    Goez · 2012-01-21 04:19:35 0
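
    Since the awk field number parsed out of ls -lh can silently point at the wrong column, GNU find can emit the path and size directly with -printf, avoiding the parsing entirely. A sketch of the same report (my variant, not from the entry):

    ```shell
    # Same report without parsing ls: %p is the path, %s the size in bytes
    # (GNU find required for -printf).
    find ./ -type f -size +100000k -printf '%p : %s bytes\n' 2>/dev/null
    ```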
  • This requires a version of GNU find that supports the -exec {} + action, but it seems more straightforward than the versions already posted.


    0
    find . -type f -exec ls -shS {} + | head -10
    erichamion · 2012-07-28 17:21:46 0
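
    One caveat (my observation, not from the entry): with enough files, find batches the arguments and runs ls more than once, so each batch is sorted separately. Piping through an explicit sort restores a global ordering, a sketch:

    ```shell
    # ls -s prints "size name" per file; sorting the combined output keeps the
    # ordering correct even when find invokes ls in several batches.
    find . -type f -exec ls -sh {} + | sort -hrk1 | head -10
    ```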
  • "-exec" ftw.


    0
    find . -type f -exec du -sh {} + | sort -hr | head
    mrfixit42 · 2012-08-03 04:24:36 0
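
    A closely related variant (my addition, not from the thread) summarizes sizes per directory instead of per file; --max-depth and sort -h both assume GNU coreutils:

    ```shell
    # Per-directory totals one level deep, largest first.
    du -h --max-depth=1 . 2>/dev/null | sort -hr | head
    ```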

What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.
