What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are separate Twitter accounts for commands that get a minimum of 3 or 10 votes, respectively, so that only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).

May 19, 2015 - A Look At The New Commandlinefu
I've put together a short writeup on what kind of newness you can expect from the next iteration of clfu. Check it out here.
March 2, 2015 - New Management
I'm Jon, I'll be maintaining and improving clfu. Thanks to David for building such a great resource!

Psst. Open beta.

Wow, didn't really expect you to read this far down. The latest iteration of the site is in open beta. It's a gentle open beta, not in prime time just yet. It's being hosted over at UpGuard and you are more than welcome to give it a shot. A couple of things:

  • The open beta is running a copy of the database that will not carry over to the final version. Don't post anything you don't mind losing.
  • If you wish to use your user account, you will probably need to reset your password.
Your feedback is appreciated via the form on the beta page. Thanks! -Jon & CLFU Team

Terminal - Commands using ls - 467 results
ls ${my_dir:=/home}
2011-11-30 15:06:51
Functions: ls

Uses the variable's value (the variable $my_dir, in this case), and assigns it the default value if it is unset or empty.
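A minimal sketch of the := expansion behaviour (the variable name is just the one used above):

unset my_dir
ls ${my_dir:=/home}    # my_dir is unset, so /home is assigned and listed
echo "$my_dir"         # prints /home; the assignment persists in the shell

my_dir=/tmp
ls ${my_dir:=/home}    # my_dir already has a value, so /tmp is listed instead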

figlet -f $(ls /usr/share/figlet/fonts/*.flf | shuf -n1) namakukingkong | cowsay -n -f $(ls /usr/share/cows/ | shuf -n1)
2011-11-25 13:54:06
Functions: ls
Tags: Linux

You need to have figlet (for the fonts) and cowsay installed, then add the line to your .bashrc file. You'll see it every time you start a new session.
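A setup sketch, assuming Debian/Ubuntu package names (the font and cow paths in the command above are the submitter's and may differ on your distribution):

sudo apt-get install figlet cowsay   # install the two tools
# then paste the one-liner above at the end of ~/.bashrc so every new interactive shell greets you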

function right { bc <<< "obase=8;ibase=2;$1"; }; touch foo; chmod $(right 111111011) foo; ls -l foo
2011-11-16 22:43:31
User: nerd
Functions: bc chmod ls touch

I simply find binary notation more straightforward to use than octal in this case.

Obviously it is overkill if you just 600 or 700 all of your files...
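For reference, a few conversions the helper produces (each 3-bit group maps to one octal digit):

right 110000000    # 600 -> rw-------
right 111000000    # 700 -> rwx------
right 111101101    # 755 -> rwxr-xr-x
chmod $(right 110000000) foo    # same as chmod 600 foo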

ls -l `whereis gcc`
2011-11-15 19:45:08
User: knathan54
Functions: ls
Tags: which ls zsh

whereis (1) - locate the binary, source, and manual page files for a command

Not actually better, just expanded a bit. The "whereis" command has the following output:

whereis gcc

gcc: /usr/bin/gcc /usr/lib/gcc /usr/bin/X11/gcc /usr/share/man/man1/gcc.1.gz

hence the 'ls' error on the first line, which could be eliminated with a little extra work.

ls -l =gcc
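One way to do that "little extra work" and drop the leading "gcc:" field before it reaches ls (a sketch; -d keeps directories from being expanded):

ls -ld $(whereis gcc | cut -d' ' -f2-)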
for f in $(ls -A ./dir); do echo -n $f && diff original.txt ./dir/$f | wc -l ; done | perl -ne 'my $h={}; while (<>) { chomp; if (/^(\S+?)\s*(\d+?)$/){$h->{$1}=$2;} }; for my $k (sort { $h->{$a} <=> $h->{$b} } keys %$h ){ print "$k\t$h->{$k}\n"}'
ls -l `which gcc`
ls -1 $PATH*/* | xargs file | awk -F":" '!($2~/PDF document/){print $1}' |xargs rm -rf
ls -lFart |tail -n1
2011-10-17 19:49:14
User: jambino
Functions: ls tail
Tags: tail pipe ls

List all files in a directory in reverse order by modified timestamp. When piped through tail the user will see the most recent file name.

ls -ltp | sed '1 d' | head -n1
2011-10-17 16:21:15
Functions: head ls sed

wrap it in a function if you like...

lastfile () { ls -ltp | sed '1 d' | head -n1; }
find / -perm +6000 -type f -exec ls -ld {} \;
ls -Fart
2011-09-19 13:07:47
User: jambino
Functions: ls

It's both silly and infinitely useful. Especially useful in logfile directories where you want to know which file is being updated while troubleshooting.

ls -saltS [dirname]
2011-09-18 22:03:11
User: ztank1013
Functions: ls

It lists files and folders under dirname, prefixing each line with the file's allocated size in blocks (-s). It also sorts the output by file size (-S), from biggest to smallest. The -t option in that precise position actually has no effect... (challenge: can you tell me why?) but of course it gives the ls command some salty taste! :)

ls -l
ls -i1 filename
ls -trF | grep -v \/ | tail -n 1
2011-09-14 20:05:37
User: mrpollo
Functions: grep ls tail
Tags: find stat mtime

Sort by time and reverse to get ascending order, append a type marker to each name (-F), filter out directories, and select only the last (most recent) entry.

myreadlink() { [ ! -h "$1" ] && echo "$1" || (local link="$(expr "$(command ls -ld -- "$1")" : '.*-> \(.*\)$')"; cd $(dirname $1); myreadlink "$link"); }
2011-09-13 11:02:27
User: keymon
Functions: cd command dirname echo ls

This is an equivalent of the GNU 'readlink' tool, but it follows the whole chain of links, even across different directories.

An interesting alternative is this one, which returns the path of the destination file:

myreadlink() { [ ! -h "$1" ] && echo "$1" || (local link="$(expr "$(command ls -ld -- "$1")" : '.*-> \(.*\)$')"; cd $(dirname $1); myreadlink "$link" | sed "s|^\([^/].*\)\$|$(dirname $1)/\1|"); }
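A hypothetical usage sketch (the paths and the symlink chain are made up for illustration):

# suppose /usr/local/bin/mytool -> /opt/tool/current -> /opt/tool/v2/mytool
myreadlink /usr/local/bin/mytool    # follows the chain and reports the final, non-symlink target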
ls -l /etc/**/*killall
2011-08-30 05:57:49
User: xeor
Functions: ls

This command will give you the same list of files as "find /etc/ -name '*killall' | xargs ls -l".

For a simpler format, just do 'ls /etc/**/file'.

It uses shell globbing, so it will also work with other commands, like "cp /etc/**/sshd sshd_backup".
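Note that recursive ** globbing is not enabled everywhere: zsh has it out of the box, while bash 4 or later needs it switched on first. A sketch for bash:

shopt -s globstar          # enable ** recursive matching in bash 4+
ls -l /etc/**/*killall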

lsr() { find "${@:-.}" -print0 |sort -z |xargs -0 ls $LS_OPTIONS -dla; }
2011-08-15 03:10:58
User: h3xx
Functions: find ls sort xargs

Tells you everything you could ever want to know about all files and subdirectories. Great for package creators. Totally secure too.

On my Slackware box, this gets set upon login:

LS_OPTIONS='-F -b -T 0 --color=auto'


alias ls='/bin/ls $LS_OPTIONS'

which works great.
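A usage sketch (the paths are only examples); because find's output is sorted, two runs over identical trees produce identical listings, which is handy for comparing packages:

lsr /etc/ssh           # recursive, sorted, one 'ls -dla' line per file
lsr ~/project /tmp     # multiple starting points also work, thanks to "${@:-.}"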

ls -1d */
2011-08-10 05:40:15
User: weldabar
Functions: ls

Omit the 1 (one) if you don't need one entry per line.

cd $(ls -ltr|grep ^d|head -1|sed 's:.*\ ::g'|tail -1)
2011-08-10 03:39:35
Functions: cd grep head ls sed tail

Replace head -1 with head -N, where N is the n-th directory you want to go to.

Replace head with tail to go to the last dir you listed.

You can also change the parameters of ls.
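
For example, following the substitutions described above:

cd $(ls -ltr|grep ^d|head -3|sed 's:.*\ ::g'|tail -1)    # 3rd-oldest directory (tail -1 keeps only the 3rd line)
cd $(ls -ltr|grep ^d|tail -1|sed 's:.*\ ::g')            # most recently modified directory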

ls -l | grep ^d | sed 's:.*\ ::g'
ls -1d */
ls -l | grep ^d | sed 's:.*\ ::g'
2011-08-06 23:52:46
User: LinuxMan
Functions: grep ls sed
Tags: bash sed ls grep

Normally, if you just want to see directories you'd use brianmuckian's command 'ls -d */', but I ran into problems trying to use that command in my script because there are often multiple directories per line. If you need to script something with directories and want to guarantee that there is only one entry per line, this is the fastest way I know.
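
A sketch of consuming that one-per-line output in a script (names containing spaces will still trip up the sed, just as in the original):

ls -l | grep ^d | sed 's:.*\ ::g' | while read -r d; do
    echo "processing $d"    # replace with whatever per-directory work you need
done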

mplayer $(ls -l /proc/$(pgrep -f flash)/fd/* |grep Flash | cut -d" " -f8)