What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.

Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are separate Twitter accounts for commands that get at least 3 and at least 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).

May 19, 2015 - A Look At The New Commandlinefu
I've put together a short writeup on what kind of newness you can expect from the next iteration of clfu. Check it out here.
March 2, 2015 - New Management
I'm Jon, I'll be maintaining and improving clfu. Thanks to David for building such a great resource!

Terminal - Commands using ls - 461 results
ls .[!.]*
2009-09-29 13:50:13
User: danam
Functions: ls

Although rm is protected against it, there are many commands that would wreak havoc on being given the obvious ".*" to address dot-files, because that pattern also matches the directories "." and "..". This sweet little expression excludes those two entries.
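
A quick way to see the difference, assuming a directory that contains a couple of dot-files:

echo .*        # expands to "." and ".." as well as the real dot-files
echo .[!.]*    # skips "." and ".." (and, as a side effect, any name starting with two dots)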

enscript jrandom.txt -o - | ps2pdf - ~/tmp/jrandom.pdf (from file) or: ls | enscript -o - | ps2pdf - ~/tmp/ls.pdf (from stdout)
tar -cvf /dev/null . | while read i; do ls -l $i; done
2009-09-16 16:59:15
User: lbonanomi
Functions: ls read tar

I find the output of ls -lR to be unsatisfying (why is the path data up there?) and find's syntax to be awkward. Running 'du -a' means you will likely have to trim off the file-size data before feeding the filenames to the next step in the pipe.
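
For comparison, a rough sketch of the du-based route this avoids; the cut field assumes du's usual size-TAB-path output, so treat it as an illustration rather than a drop-in replacement:

du -a | cut -f2- | while read i; do ls -l "$i"; done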

ls -lt|grep ^-|awk 'NR>5 { print $8 }'|xargs -r rm
ls -t | tail +6 | xargs rm
ls -t | awk 'NR>5 {system("rm \"" $0 "\"")}'
2009-09-16 04:58:08
User: haivu
Functions: awk ls
Tags: awk ls

I have a directory containing log files. This command deletes all but the 5 latest logs. Here is how it works:

* The ls -t command lists all files, with the most recently modified ones at the top

* The awk expression means: for every line after the fifth, run rm on that file name (a sketch below makes the count configurable)
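
A minimal sketch generalizing the same idea so the number of logs to keep is not hard-coded; KEEP is a name introduced here purely for illustration:

KEEP=5
ls -t | awk -v keep="$KEEP" 'NR>keep {system("rm \"" $0 "\"")}'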

cat /var/lib/dpkg/info/*.list > /tmp/listin ; ls /proc/*/exe |xargs -l readlink | grep -xvFf /tmp/listin; rm /tmp/listin
2009-09-09 18:09:14
User: kamathln
Functions: cat grep ls readlink rm xargs
Tags: Debian find dpkg

This helped me find a botnet that had made its way into my system. Of course, this is not a foolproof or guaranteed way to find all of them, or even most of them, but it helped me find this one.
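
Roughly what the pipeline does, written out step by step; a commented sketch of the same one-liner, not a new method:

# 1. Collect every file path claimed by the installed dpkg packages.
cat /var/lib/dpkg/info/*.list > /tmp/listin
# 2. Resolve the binary behind each running process and keep only those
#    no package claims (-x whole-line match, -v invert, -F fixed strings,
#    -f read the patterns from a file).
ls /proc/*/exe | xargs -l readlink | grep -xvFf /tmp/listin
# 3. Remove the temporary list.
rm /tmp/listin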

ls -ldct /lost+found |awk '{print $6, $7}'
ls -lct /etc/ | tail -1 | awk '{print $6, $7, $8}'
2009-09-04 16:52:50
User: peshay
Functions: awk ls tail

Also shows the time if the installation happened in the current year, or the year if it happened earlier, and it works even when /etc is a link (Mac OS).

ls -lct /etc | tail -1 | awk '{print $6, $7}'
2009-09-03 10:26:37
User: MrMerry
Functions: awk ls tail

Shows the time and date when you installed your OS.

ls -shF --color
2009-09-03 05:45:33
User: Viperlin
Functions: ls

Use the manpages; they give you "ultimate commands".

"ls -SshF --color" list by filesize (biggest at the top)

"ls -SshFr --color" list by filesize in reverse order (biggest at the bottom)

ls *.c | while read F; do gcc -Wall -o `echo $F | cut -d . -f 1 - ` $F; done
2009-08-28 13:01:56
User: pichinep
Functions: cut gcc ls read

Compiles the *.c files in the current directory with "gcc -Wall", using the file name without its extension as the output file.
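
A variant sketch of the same loop that avoids parsing ls and copes with spaces in file names by using the shell's own suffix stripping instead of cut; an alternative, not the author's command:

for f in *.c; do gcc -Wall -o "${f%.c}" "$f"; done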

ls | while read f; do mv "$f" "${f// /_}";done
find ./ -size +10M -type f -print0 | xargs -0 ls -Ssh1 --color
locate -e somefile | xargs ls -l
2009-08-23 13:16:59
User: nadavkav
Functions: locate ls xargs

Uses the locate command to find files on the system, verifies that they still exist (-e), then displays each one in full detail.
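
If your locate supports null-separated output (the -0 option in mlocate), a space-safe sketch of the same idea:

locate -0 -e somefile | xargs -0 ls -l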

sudo du -ks $(ls -d */) | sort -nr | cut -f2 | xargs -d '\n' du -sh 2> /dev/null
2009-08-17 22:21:09
User: Code_Bleu
Functions: cut du ls sort sudo xargs
Tags: disk usage

This sorts the directory sizes from largest to smallest and prints them in human-readable format.
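
On systems where sort understands human-readable sizes (GNU sort's -h flag), a shorter route to a similar listing; offered as an alternative sketch, not the author's command:

sudo du -sh */ 2> /dev/null | sort -hr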

find . -type f -printf '%20s %p\n' | sort -n | cut -b22- | tr '\n' '\000' | xargs -0 ls -laSr
2009-08-13 13:13:33
User: fsilveira
Functions: cut find ls sort tr xargs
Tags: sort find ls

This command will find the biggest files recursively under a certain directory, no matter how many there are. If you try the usual commands ("find -type f -exec ls -laSr {} +" or "find -type f -print0 | xargs -0 ls -laSr") the sorting won't be correct because of the command-line argument limit: xargs (and "-exec ... +") may split the file list across several ls invocations, each of which sorts only its own batch.

This command doesn't rely on passing all the files as arguments to ls for the sorting, so it displays the sorted list correctly.
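
To see the limit that forces xargs (and find's "-exec ... +") to split a huge file list into several ls runs:

getconf ARG_MAX    # maximum bytes of arguments plus environment for a new process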

ls -laR > /path/to/filelist
2009-08-12 17:53:40
User: shaiss
Functions: ls

Ever need to output the contents of an entire directory and its subdirectories to a file? This is a simple one-liner, but it does the trick every time. Omit -la and use only -R for just the names.

ls -1 *.jpg | while read fn; do export pa=`exiv2 "$fn" | grep timestamp | awk '{ print $4 " " $5 ".jpg"}' | tr ":" "-"`; mv "$fn" "$pa"; done
2009-08-10 00:52:22
User: axanc
Functions: awk export grep ls mv read tr

Renames all the .jpg files to their Exif timestamps, keeping the ".jpg" extension.
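
exiv2 itself has a 'rename' action with a -r format string that can do a similar rename in one step; the exact format below is from memory, so treat it as an assumption and check the exiv2 man page first:

exiv2 -r '%Y-%m-%d %H-%M-%S' rename *.jpg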

sudo du -sh $(ls -d */) 2> /dev/null
ls foo*.jpg | awk '{print("mv "$1" "$1)}' | sed 's/foo/bar/2' | /bin/sh
ls -pt1 | sed '/.*\//d' | sed 1d | xargs rm
2009-07-29 13:59:58
User: patko
Functions: ls sed xargs

Deletes everything in the current directory except the most recently modified file: ls -p marks directories with a trailing slash, the first sed drops those, and the second sed drops the newest entry from the list so that only the older files reach rm. Useful for deleting old, unused log files.

ls -t1 | head -n1 | xargs tail -f
svn ls -R | egrep -v -e "\/$" | xargs svn blame | awk '{print $2}' | sort | uniq -c | sort -r
2009-07-29 02:10:45
User: askedrelic
Functions: awk egrep ls sort uniq xargs
Tags: svn count

I'm currently working on a group project and am annoyed at the lack of output from my teammates. Wanting hard metrics of how awesome I am and how awesome they aren't, I wrote this command up.

It will print a full repository listing of all files, remove the directories which confuse blame, run svn blame on each individual file, and tally the resulting line counts. It seems quite slow, depending on your repository location, because blame must hit the server for each individual file. You can remove the -R on the first part to print out the tallies for just the current directory.

man -P cat ls > man_ls.txt
2009-07-27 13:09:24
User: alvinx
Functions: cat ls man

Outputs a manpage as plain text, using cat as the pager: man -P cat commandname

And redirect its stdout into a file: man -P cat commandname > textfile.txt

Example: man -P cat ls > man_ls.txt