Commands tagged file size (8)


  • 9 votes
    find . -type f -print0 | xargs -0 du -h | sort -hr | head -10
    netaxiz · 2012-06-30 10:03:31

  • 2 votes
    find . -type f -print0 | xargs -0 du -h | sort -hr | head
    mesuutt · 2012-06-29 12:43:06

  • Here's a way to wait for a file (a download, a logfile, etc.) to stop changing, then do something. As written it just returns to the prompt, but you could append "; echo DONE" or whatever action you need at the end. It simply compares the full output of "ls" every 10 seconds and keeps looping as long as that output has changed since the last interval. If the file is being appended to, the size changes; if it's being modified without growing, the timestamp shown by the "--full-time" option changes. Plain "ls -l" isn't sufficient, since by default it only shows minutes, not seconds. Waiting for a file to stop changing is not a very elegant or reliable way to tell that some process has finished - if you know the process ID there are much better ways (see the sketch after this entry). This method will also give a false positive if changes to the target file are delayed longer than the sleep interval for any reason (network timeouts, etc.). But sometimes the process writing the file doesn't exit and instead continues on to something else, so this approach can be useful if you understand its limitations.

    2 votes
    while [ "$(ls -l --full-time TargetFile)" != "$a" ] ; do a=$(ls -l --full-time TargetFile); sleep 10; done
    dmmst19 · 2015-05-09 03:19:49
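
    A minimal sketch of the two follow-ups mentioned above: the same loop with a completion action appended, and one way to wait on a known process ID instead. TargetFile, the echoed message, and the placeholder PID 12345 are illustrative assumptions, not part of the original command.

    # Poll until TargetFile stops changing, then announce completion.
    while [ "$(ls -l --full-time TargetFile)" != "$a" ] ; do a=$(ls -l --full-time TargetFile); sleep 10; done; echo DONE

    # If you know the PID of the writing process, waiting on it directly is more
    # reliable (GNU coreutils tail; 12345 is a placeholder PID).
    tail --pid=12345 -f /dev/null && echo DONE
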
  • You can simply run "largest" to list the top 10 files/directories under ./, or you can pass two parameters: the first is the directory, the second is the number of entries to display (see the usage sketch after this entry). You're best off putting this in your .bashrc or .bash_profile file.

    1 vote
    largest() { dir=${1:-"./"}; count=${2:-"10"}; echo "Getting top $count largest files in $dir"; du -sx "$dir/"* | sort -nk 1 | tail -n $count | cut -f2 | xargs -I file du -shx file; }
    jhyland87 · 2013-01-21 09:45:21
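
    A brief usage sketch of the function above; the directory /var/log and the count 5 are arbitrary example arguments.

    # Top 10 entries under the current directory (defaults).
    largest
    # Top 5 entries under /var/log.
    largest /var/log 5
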

  • 1 vote
    find . -type f -size +100M
    chikondi · 2013-02-07 11:58:10

  • This command does a basic find by size, with a clearer printout than the default. Adjusting the ./ alters the search path, and adjusting "-size +100000k" sets the size threshold (for a variant that avoids parsing ls, see the sketch after this entry).

    0 votes
    find ./ -type f -size +100000k -exec ls -lh {} \; 2>/dev/null| awk '{ print $8 " : " $5}'
    Goez · 2012-01-21 04:19:35
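
    The awk field numbers above depend on the exact "ls -l" column layout, which varies between systems and locales. Assuming GNU find, a sketch that prints a similar "name : size" line without parsing ls at all; %k is the file's disk usage in 1 KB blocks, so it is close to, but not identical to, the ls -lh size.

    # GNU find only: print "path : size in KB" directly.
    find ./ -type f -size +100000k -printf '%p : %k KB\n' 2>/dev/null
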
  • This requires a version of GNU find that supports the -exec {} + action, but it seems more straightforward than the versions already posted (a rough alternative is sketched after this entry).

    0 votes
    find . -type f -exec ls -shS {} + | head -10
    erichamion · 2012-07-28 17:21:46
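
    The same listing can also be produced with -print0 and xargs -0, as in the top-voted command above. In both forms ls only sorts within each batch of arguments it receives, so on very large trees the top-10 list is approximate. A sketch:

    # Rough alternative for finds without "-exec {} +" (ls sorts per batch).
    find . -type f -print0 | xargs -0 ls -shS | head -10
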
  • "-exec" ftw.


    0
    find . -type f -exec du -sh {} + | sort -hr | head
    mrfixit42 · 2012-08-03 04:24:36 5
