
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/



Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):



News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try and put an end to the spamming, new commands now require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.


Terminal - Commands tagged size - 56 results
stat --format "%s" <file>
du -h | sort -hr
du -sh `pwd`
2011-10-30 08:47:23
User: djkee
Functions: du
Tags: size du pwd
0

Shows the size of the directory the command is run in.

The size is human-readable (MB or GB as appropriate).

There is no need to type the path; it's the current working directory.
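A minimal sketch of the same idea: "du -sh ." reports the same total, and the backticked pwd form only changes how the path is printed in du's output column.

```shell
# Both report the same human-readable total for the current directory;
# only the displayed path differs (absolute path vs ".").
du -sh "$(pwd)"
du -sh .
```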

du -h /path | sort -h
parallel echo -n {}"\ "\;echo '$(du -s {} | awk "{print \$1}") / $(find {} | wc -l)' \| bc -l ::: *
du -h / | grep -w "[0-9]*G"
curl -s "$URL" |wc -c
2011-07-18 15:47:57
User: Mozai
Functions: wc
Tags: size curl http
2

Downloads the entire file just to count its bytes, but HTTP servers don't always provide the optional 'Content-Length:' header, and ftp/gopher/dict/etc. servers don't provide a file-size header at all, so this works everywhere.
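When the server does answer HEAD requests with a Content-Length header, the full download can be skipped. A hedged sketch (content_length is a hypothetical helper name; $URL is a placeholder):

```shell
# Sketch: ask for headers only (-I), strip CRs, and print the
# Content-Length value if the server supplied one. Only some HTTP
# servers do; fall back to the full download otherwise.
content_length() {
  curl -sI "$1" | tr -d '\r' | awk 'tolower($1) == "content-length:" {print $2}'
}

# size=$(content_length "$URL"); [ -n "$size" ] || size=$(curl -s "$URL" | wc -c)
```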

find /myfs -size +209715200c -exec du -m {} \; |sort -nr |head -10
2011-07-07 21:12:46
User: arlequin
Functions: du find head sort
2

Specify the size in bytes using the 'c' option for the -size flag. The + sign reads as "bigger than". Then execute du on the list; sort in reverse mode and show the first 10 occurrences.
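GNU find also accepts size suffixes directly (c = bytes, k = KiB, M = MiB, G = GiB), so the byte count need not be computed by hand: +200M is the same threshold as +209715200c. A small self-contained sketch of the suffix behaviour, using sparse files at a smaller scale:

```shell
# Sketch, assuming GNU find and coreutils' truncate.
d=$(mktemp -d)
truncate -s 3M "$d/big"      # sparse file, apparent size 3 MiB
truncate -s 512k "$d/small"  # apparent size 512 KiB
find "$d" -type f -size +1M  # prints only "$d/big"
rm -rf "$d"
```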

SEARCHPATH=/var/; find $SEARCHPATH -type d -print0 | xargs -0 du -s 2> /dev/null | sort -nr | sed 's|^.*'$SEARCHPATH'|'$SEARCHPATH'|' | xargs du -sh 2> /dev/null
2011-07-06 08:21:58
User: moogmusic
Functions: du find sed sort xargs
-2

This command lists all the directories in SEARCHPATH by size, displaying their size in a human readable format.

wget --spider $URL 2>&1 | awk '/Length/ {print $2}'
2011-07-03 00:14:58
User: d3Xt3r
Functions: awk wget
5

- Where $URL is the URL of the file.

- Replace $2 with $3 at the end to get a human-readable size.

Credits to svanberg @ ArchLinux forums for original idea.

Edit: Replaced command with better version by FRUiT. (removed unnecessary grep)

find -type f -exec du -sh {} + | sort -rh | head
find -type f | xargs -I{} du -s "{}" | sort -rn | head | cut -f2 | xargs -I{} du -sh "{}"
2011-01-04 11:10:56
User: glaudiston
Functions: cut du find head sort xargs
-1

Shows the top files by size, in human-readable form.

find -type f | xargs -I{} du -sk "{}" | sort -rn | head
du -sk * | sort -rn | head
2011-01-03 10:49:40
User: EBAH
Functions: du sort
3

Also:

* find . -type f -exec ls -s {} \; | sort -n -r | head -5

* find . -type f -exec ls -l {} \; | awk '{print $5 "\t" $9}' | sort -n -r | head -5

find . -type f -size +500M -exec du {} \; | sort -n
2010-11-09 18:15:44
Functions: du find sort
Tags: size find
1

Greater than 500M and sorted by size.

find / -type f -size +500M
find / -type f -size +548576 -printf "%s:%h%f\n"
du -sh ~/*
2010-11-05 10:20:16
Functions: du
1

Displays the size (human-readable) of all directories in your home path (~).

( di $TOFSCK -h ; /bin/umount $TOFSCK ; time /sbin/e2fsck -y -f -v $FSCKDEV ; /bin/mount $TOFSCK ) |& /bin/mail $MAILTO -s "$MAILSUB"
2010-10-24 00:35:23
User: px
Functions: time
1

This one-liner is for cron jobs that need to provide some basic information about a filesystem and the time it takes to complete the operation. You can swap out the di command for df or du if that's your thing. The |& redirects both stdout and stderr to the mail command.

How to configure the variables.

TOFSCK=/path/to/mount

FSCKDEV=/dev/path/device

or

FSCKDEV=`grep $TOFSCK /proc/mounts | cut -f1 -d" "`

MAILSUB="weekly file system check $TOFSCK "

find . -iname '*.jpg' -type f -print0 |perl -0 -ne '$a+=-s $_;END{print "$a\n"}'
2010-09-12 13:14:12
Functions: find perl
1

This deals nicely with filenames containing special characters and can handle more files than fit on a command line. It also avoids spawning du.
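With GNU find, the same sum can be had without perl via -printf (a sketch; -printf is GNU-specific, so this is less portable than the perl version):

```shell
# Print each matching file's size in bytes (%s) and sum with awk.
# Only sizes travel down the pipe, so odd filenames can't break it,
# and like the perl version it never spawns du.
find . -iname '*.jpg' -type f -printf '%s\n' | awk '{a += $1} END {print a + 0}'
```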

watch 'find -maxdepth 1 -mindepth 1 -type d |xargs du -csh'
2010-05-19 13:13:57
User: shadycraig
Functions: du watch xargs
0

This command shows the size of the directories below the current one, refreshing every 2s.

It will also track directories created after running the command (that's what the find bit does).

i=0; for f in $(find ./ -size -10M -exec stat -c %s {} \; ); do i=$(($i + $f)); done; echo $i
find dir -size -1024k -type f -print0 | du --files0-from - -bc
2009-12-29 01:33:55
User: bhepple
Functions: dir du find
Tags: size sum
2

The original didn't use -print0 which fails on weird file names eg with spaces.

The original parsed the output of 'ls -l' which is always a bad idea.

find dir -size -1024k -type f | xargs -d $'\n' -n1 ls -l | cut -d ' ' -f 5 | sed -e '2,$s/$/+/' -e '$ap' | dc
2009-12-28 04:23:01
User: zhangweiwu
Functions: cut dir find ls sed xargs
Tags: size sum
1

The command gives the size of all files smaller than 1024k. This information, together with disk usage, can help determine file-system parameters (e.g. block size) or the choice of storage device (e.g. SSD vs. HDD).

Note that if you use awk instead of "cut | dc", you can easily breach the maximum allowed number of records in awk.

du -aB1m|awk '$1 >= 100'