What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):

May 19, 2015 - A Look At The New Commandlinefu
I've put together a short writeup on what kind of newness you can expect from the next iteration of clfu. Check it out here.
March 2, 2015 - New Management
I'm Jon, I'll be maintaining and improving clfu. Thanks to David for building such a great resource!

Terminal - Commands using ls - 463 results
ls -lh file-name | awk '{ print $5}'
ls -lt | awk '{sum+=$5} END {print sum}'
2013-07-03 20:12:54
User: martinmorono
Functions: awk ls

Use awk to sum and print the space used by a group of files.

It works well as long as the space used is not bigger than 79094548.80...

I found that upper limit when trying to find out what was the total amount of recoverable space from a set of directories:

user@servername:/home/user/scripts>for dirName in aleph_bin aleph_sh aleph_work dailycheck INTERFAZ ; do echo "${dirName} = $(cat /tmp/purge_ocfs_dir.*.log | awk '{sum+=$5} END {printf "%4.2f", sum}') "; done

aleph_bin = 79094548.80

aleph_sh = 79094548.80

aleph_work = 79094548.80

dailycheck = 79094548.80

INTERFAZ = 79094548.80

In the worst case scenario, the total number might be almost 137G.

user@servername:/home/user/scripts>df -h /ocfs/*

Filesystem      Size  Used Avail Use% Mounted on
                137G   38G   99G  28% /ocfs/aleph_bin
                137G   38G   99G  28% /ocfs/aleph_sh
                280G  135G  146G  49% /ocfs/aleph_work
                137G   38G   99G  28% /ocfs/dailycheck
                137G   38G   99G  28% /ocfs/INTERFAZ

Any suggestions on how to get the correct amount of space when the total is over 80 Mbytes?
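
One likely culprit, sketched below: the loop above never references ${dirName} inside the command substitution, so every iteration sums the same set of logs and prints the same total. awk itself is not limited to 79094548.80, since it sums with double-precision floats:

```shell
# The loop's command substitution sums /tmp/purge_ocfs_dir.*.log on every
# pass, regardless of ${dirName} -- hence the identical totals.
# awk's double-precision sums go well past 79094548.80:
printf '%s\n' "- - - - 100000000.50" "- - - - 200000000.25" > /tmp/clfu_sizes.txt
awk '{sum+=$5} END {printf "%.2f\n", sum}' /tmp/clfu_sizes.txt
# prints 300000000.75
```

If the logs really are named per directory (an assumption), putting "${dirName}" into the glob should give distinct totals; running du -sh on each directory is a simpler cross-check.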

ls -d .*"/" *"/"
ls -R | wc -l
ls -1 | while read file; do new_file=$(echo $file | sed s/\ /_/g); mv "$file" "$new_file"; done
ls | paste --delimiters='*' - ./zzz | awk ' BEGIN{FS="*";} { system("mv " $1 " \"" $2 "\"") }'
2013-05-13 15:44:07
User: skilowatt
Functions: awk ls paste

Rename all files in the current directory using names from the text file 'zzz'.
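
A minimal sketch of how the paste step pairs each listed name with a line from 'zzz' ('-' reads stdin, '*' is just an unlikely delimiter; -d is the portable spelling of --delimiters):

```shell
# Pair stdin lines (old names) with lines from /tmp/zzz (new names):
printf 'new name 1\nnew name 2\n' > /tmp/zzz
printf 'old1\nold2\n' | paste -d'*' - /tmp/zzz
# old1*new name 1
# old2*new name 2
```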

ls *.jpg | xargs -n1 -i cp {} /external-hard-drive/directory
svn ls -R | egrep -v -e "\/$" | xargs svn blame | awk '{count[$2]++}END{for(j in count) print count[j] "\t" j}' | sort -rn
2013-05-03 01:45:12
User: kurzum
Functions: awk egrep ls sort xargs
Tags: svn count

This one has better performance, as it is a one-pass count with awk. For this script it might not matter, but for others it is a good optimization.
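
The one-pass idea in isolation, with hypothetical input standing in for the svn blame output:

```shell
# Tally field 2 in an awk array in a single pass,
# instead of piping through sort | uniq -c | sort -rn:
printf '%s\n' "1 alice" "2 bob" "3 alice" |
  awk '{count[$2]++} END {for (j in count) print count[j] "\t" j}' | sort -rn
# 2	alice
# 1	bob
```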

ls -lahS $(find / -type f -size +10000k)
svn ls -R | egrep -v -e "\/$" | tr '\n' '\0' | xargs -0 svn blame | awk '{print $2}' | sort | uniq -c | sort -nr
2013-04-10 19:37:53
User: rymo
Functions: awk egrep ls sort tr uniq xargs
Tags: svn count

Makes the command usable on OS X with filenames containing spaces. Note: it will still break if filenames contain newlines... possible, but who does that?!
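
The tr '\n' '\0' | xargs -0 trick in isolation, showing that names with spaces survive as single arguments:

```shell
# NUL-separated input: each line reaches xargs as one argument,
# so "a b" is not split into "a" and "b"
printf 'a b\nc d\n' | tr '\n' '\0' | xargs -0 -n1 echo
# a b
# c d
```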

ssh -t myserver.org 'sudo ls /etc'
2013-04-09 04:23:37
User: patko
Functions: ls ssh
Tags: ssh sudo

This command will ask for remote sudo password before executing a remote command.

ls -R | grep ":$" | sed -e 's/:$//' -e 's/[^-][^\/]*\//--/g' -e 's/^/ /' -e 's/-/|/'
ls -lad
2013-04-03 09:58:31
User: techie
Functions: ls
Tags: ls

This will show you the permissions of the directory you are currently in.

ps aux | grep [process] | awk '{print $2}' | xargs -I % ls /proc/%/fd | wc -l
while true; do ls -all myfile; sleep 1; clear; done
2013-03-26 09:13:19
User: ivodeblasi
Functions: ls

Sometimes you need to monitor a file or directory for changes in size or other attributes. This command outputs the attributes of a file (called myfile in the example) at the top of the screen, updating every second.

You can change the update interval, the command (e.g., ls -all), or the target (myfile, mydir, etc.).

ls -tl **/*(om[1,20])
2013-03-24 00:14:03
User: khayyam
Functions: ls
Tags: ls zsh

zsh globbing and glob qualifier:

'**/*' = recursive

om = order by modification time (most recently modified first)

[1,20] = the first twenty files.

The '-t' switch is provided to ls so that the files are ordered with the most recent at the top. For a more 'find' like output the following can be used.

print -rl **/*(om[1,20])
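
For shells without zsh glob qualifiers, a rough equivalent can be built with find, assuming GNU find's -printf is available (BSD find lacks it):

```shell
# Print mtime + path for every file, sort newest first,
# keep the 20 most recent, then strip the timestamp column:
find . -type f -printf '%T@ %p\n' | sort -rn | head -20 | cut -d' ' -f2-
```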

ls -Sh **/*(.Lm+100) | tail -5
2013-03-21 20:22:11
User: khayyam
Functions: ls tail
Tags: tail ls zsh

zsh: list files larger than 100 MB, sorted by size, showing the last five. '**/*' is recursive, and the glob qualifiers provide '.' = regular file, 'L' = size, followed by 'm' = megabytes, and finally '+100' = more than 100.
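
A portable sketch of the same filter without zsh glob qualifiers (note that ls only sorts within each -exec batch, which is fine for small result sets):

```shell
# Find regular files over 100 MB, list them sorted by size,
# and keep the last five lines:
find . -type f -size +100M -exec ls -Sh {} + | tail -5
```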

find -type f | xargs ls -1tr
find . -type f -exec ls -s {} \; | sort -n -r | head -5
ls -qahlSr # list all files in size order - largest last
2013-03-13 09:52:07
User: mpb
Functions: ls size

I find it useful, when cleaning up and deleting unwanted files to make more space, to list them in size order so I can delete the largest first.

Note that using "q" shows files with non-printing characters in name.

In this sample output (above), I found two copies of the same iso file both of which are immediate "delete candidates" for me.

ls -l | grep ^d
ls -qaltr # list directory in chronological order, most recent files at end of list
2013-02-25 14:14:44
User: mpb
Functions: at ls

I find it very handy to be able to quickly see the most recently modified/created files in a directory.

Note that the "q" option will reveal any files with non-printable characters in their filename.

ls -ltrhd */
ls -lT -rt | grep "^-" | awk 'BEGIN {START=2002} (START <= $9){ print $10 ;START=$9 }' | tail -1
2013-02-24 23:39:22
User: Glamdring
Functions: awk grep ls tail
Tags: ls date osx

On the Mac, 'ls' can sort by month/day/time, but seems to lack the ability to filter on the year field (#9 among the long-listing fields). The sorted list continuously increases the 'START' year as it walks through the most recently accessed files. The final month printed will be the highest month that appeared in that START year. The command does its magic on the current directory and discards all entries that are themselves directories. If you expect files dating from before 2002, change the START year accordingly.
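
If a GNU or modern BSD find is available, -newermt may be a simpler way to filter by year (a sketch, not the author's method):

```shell
# Files in the current directory modified during 2002 only:
# newer than the start of 2002 but not newer than the start of 2003
find . -maxdepth 1 -type f -newermt '2002-01-01' ! -newermt '2003-01-01'
```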