What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).

May 19, 2015 - A Look At The New Commandlinefu
I've put together a short writeup on what kind of newness you can expect from the next iteration of clfu. Check it out here.
March 2, 2015 - New Management
I'm Jon, I'll be maintaining and improving clfu. Thanks to David for building such a great resource!

Terminal - Commands using ls - 461 results
find . -type f | xargs ls -ltrhg
2010-05-28 01:23:53
User: emacs
Functions: find ls xargs

Find regular files and list them in long format, sorted by modification time (oldest first), without the owner column.

l: long listing with detailed information

t: sort by modification time

r: reverse the order, so the oldest files come first

h: show file sizes in human-readable format, such as K (kilobytes), M (megabytes), etc.

g: like -l, but omit the file owner (GNU ls; use -G instead to hide the group)
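
If the filenames might contain spaces or other unusual characters, a null-delimited variant is safer (a sketch assuming GNU find's -print0 and xargs -0):

find . -type f -print0 | xargs -0 ls -ltrhg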

ls -rl --time-style=+%s * | sed '/^$/,/^total [0-9]*$/d' | sort -nk6
find . -type f -mtime -14 -exec ls -ltd \{\} \; | less
find . -type f -exec ls -tr {} +
2010-05-27 14:52:28
Functions: find ls

List all files from the current directory and subdirectories, sorted by modification time, oldest first.
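
With GNU find, a similar oldest-first listing can be produced without handing the sorting to ls at all (a sketch assuming GNU find's -printf):

find . -type f -printf '%T@ %p\n' | sort -n | cut -d' ' -f2-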

ls -lt | more
2010-05-27 12:44:39
User: eludom
Functions: ls

Simple but handy: list files in the current directory in mtime order. Useful if you've been working on something and then take a day or two off.

rm-but() { ls -Q | grep -v "$1" | xargs rm -r ; }
2010-05-13 09:28:56
User: sata
Functions: grep ls rm xargs
rm-but() { ls -Q | grep -v "$1" | xargs rm -r ; }

Add this to your .bashrc file.

Then, whenever you need to remove all files/directories but one from the present working directory, run:

rm-but <important-file-or-directory>


1. This doesn't affect hidden files.

2. The argument is treated as a string: all files/directories that have this string in their name are left untouched.
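
If you prefer not to parse ls output (which breaks on filenames containing spaces), a rough equivalent using find might look like this; it keeps the same behaviour of skipping hidden files and treating the argument as a substring:

rm-but() { find . -maxdepth 1 ! -name '.*' ! -name "*$1*" -exec rm -r {} +; }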

find / -type f -size +512000 | xargs ls -lh | awk '{ print $5 " " $6$7 ": " $9 }'
2010-05-12 17:21:12
User: johnss
Functions: awk find ls xargs

This is an updated version that someone provided me via another "find" command, to find files over a certain size. Keep in mind you may have to mess around with the print values, depending on your system, to get the output you want. This was tested on FC and CentOS-based servers. (Thanks to berta for the update.)
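
With GNU find the report can also be generated directly, avoiding the xargs/ls/awk column juggling entirely (a sketch using -printf; the size is printed in bytes):

find / -type f -size +512000 -printf '%s %TY-%Tm-%Td: %p\n'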

s3cmd ls s3://bucket.example.com | s3cmd del `awk '{print $4}'`
ls | grep '\.txt$' | while read file; do cat "$file" >> ./output.txt; done
goyoutube() { d=/path/to/videos; p=$d/playlist; m=$d/*.mp4; f=$d/*.flv; if [ "$1" == 'rand' ]; then ls -1 $m $f | shuf > $p; else ls -1t $m $f > $p; fi; mplayer -geometry 500x400 -playlist $p; }
2010-04-11 18:53:49
User: meathive
Functions: ls

Plays your newly downloaded videos, newest first. To shuffle the playlist instead, run:

goyoutube rand

This command assumes you've already downloaded some YouTube .mp4 or .flv video files via other means. Requires 'shuf', or your own stdin shuffler.

ls -lS
open-command $(ls -rt *.type | tail -n 1)
2010-04-04 20:43:38
User: RBerenguel
Functions: ls tail

Change open-command and type to suit your needs. One example would be to open the last .jpg file with Eye Of Gnome:

eog $(ls -rt *.jpg | tail -n 1)

function wherepath () { for DIR in `echo $PATH | tr ":" "\n" | awk '!x[$0]++ {print $0}'`; do ls ${DIR}/$1 2>/dev/null; done }
2010-04-02 20:32:36
User: mscar
Functions: awk ls tr
Tags: find locate PATH

The wherepath function will search all the directories in your PATH and print a unique list of locations in the order they are first found in the PATH. (PATH often has redundant entries.) It will automatically use your 'ls' alias if you have one or you can hardcode your favorite 'ls' options in the function to get a long listing or color output for example.


'whereis' only searches certain fixed locations.

'which -a' searches all the directories in your path but prints duplicates.

'locate' is great but isn't installed everywhere (and it's often too verbose).
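
Example usage (a hypothetical invocation; the output depends entirely on your own PATH):

wherepath python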

ls | egrep -v "[REGULAR EXPRESSION]" | xargs rm -v
2010-04-01 02:40:40
User: Saxphile
Functions: egrep ls rm xargs
Tags: files rm

This is a slight variation of an existing submission, but uses regular expression to look for files instead. This makes it vastly more versatile, and one can easily verify the files to be kept by running ls | egrep "[REGULAR EXPRESSION]"

find . -type f -iname '*.msh' -exec ls -lG {} \; | awk '{total = total + $4}END{print "scale=2;" total "/2^20"}' | bc
ls | while read filename; do tar -czvf "$filename".tar.gz "$filename"; rm "$filename"; done
2010-03-29 08:10:38
User: Thingymebob
Functions: ls read rm tar

Compresses each file individually, creating $filename.tar.gz, and removes the uncompressed version. Useful if you have lots of files and don't want one huge archive containing them all. You could replace ls with ls *.pdf to perform the action only on PDFs, for example.
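
If the filenames may contain spaces, a plain glob loop avoids piping ls into read, and only deletes the original when tar succeeds (a sketch of the same idea):

for f in *; do tar -czvf "$f.tar.gz" "$f" && rm "$f"; done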

ls -d $(echo ${PATH//:/ }) > /dev/null
ls -l | grep ^-
ls -l | awk '{if (NR % 5 == 0) print "-- COMMIT --"; print}'
ls -l | sed "$(while (( ++i < 5 )); do echo "N;"; done) a -- COMMIT --"
2010-03-17 20:12:05
User: glaudiston
Functions: ls sed

Especially useful for SQL scripts with INSERT/UPDATE statements, to add a COMMIT command after every n statements executed.
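
Applied to an actual script rather than ls output (inserts.sql is a hypothetical filename), the same idea with awk appends a COMMIT after every 5 statements:

awk '{print} NR % 5 == 0 {print "COMMIT;"}' inserts.sql > inserts_committed.sql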

( last ; ls -t /var/log/wtmp-2* | while read line ; do ( rm /tmp/junk-wtmp ; zcat $line 2>/dev/null || bzcat $line ) > /tmp/junk-wtmp ; last -f /tmp/junk-wtmp ; done ) | less
2010-03-16 04:17:16
Functions: last ls read rm zcat

When your wtmp files are being logrotated, here's an easy way to unpack them all on the fly and see more than a week into the past. The rm is a primitive way to prevent a symlink prediction attack.
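
A sketch of the same loop using mktemp instead of a fixed /tmp name, which sidesteps the symlink problem rather than working around it:

( last ; ls -t /var/log/wtmp-2* | while read f ; do t=$(mktemp) ; ( zcat "$f" 2>/dev/null || bzcat "$f" ) > "$t" ; last -f "$t" ; rm -f "$t" ; done ) | less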

ls -R .
ls -d */* | sed -e 's/^/\"/g' -e 's/$/\"/g' | xargs mv -t $(pwd)
2010-03-01 23:43:26
User: leovailati
Functions: ls mv sed xargs

You WILL have problems if the files have the same name.

Use cases: consolidate music library and unify photos (especially if your camera separates images by dates).

After running the command and verifying that there were no name issues, you can use

ls -d */ | sed -e 's/^/\"/g' -e 's/$/\"/g' | xargs rm -r

to remove now empty subdirectories.
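
A hedged alternative uses find with GNU mv's -n (no-clobber) flag: it copes with filenames containing spaces and refuses to overwrite on name collisions instead of silently losing files (note it moves regular files only):

find . -mindepth 2 -type f -exec mv -n -t . {} +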

ls -1 /lib/modules
2010-03-01 06:30:12
Functions: ls

No need for rpm, no need for piping to another command. Not much real fu either, but it avoids unnecessary complexity and distro-specific commands.

lsli() { ls -l --color "$@" | awk '{ for(i=9;i<NF;i++){ printf("%s ",$i) } printf("%s\n",$NF) }'; }
2010-02-23 15:05:28
User: quigybo
Functions: awk ls

Displays just the file names from the output of ls -l, without the rest of the crud. Pretty simple but useful.