What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try and put an end to the spamming, new commands now require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.




Terminal - Commands using ls - 452 results
ls -w 1 > list.m3u
2010-08-27 07:03:17
User: Tungmar
Functions: ls

A short variant, if you have only one directory with only audio files in it.

find . -type f -mtime -1 \! -type d -exec ls -l {} \;
for crt in $(locate -r '.+\.crt' | grep -v "/usr/share/ca-certificates/"); do ls -la $crt; done
2010-08-23 12:22:48
User: udog
Functions: grep locate ls
Tags: openssl locate

Finds all cert files on a server and lists them, showing which ones are symbolic links and which are real files.

This is useful when a certificate expires and you need to know which files to replace with the new cert.
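The symlink check can also be made explicit rather than read off the ls -la output. A minimal sketch, assuming bash; the helper name classify_cert is an illustrative assumption, not part of the original command:

```shell
# Sketch: report whether each given cert file is a symlink or a regular file.
# classify_cert is an illustrative helper name.
classify_cert() {
  for crt in "$@"; do
    if [ -L "$crt" ]; then
      # -L is true for symbolic links; readlink shows the target.
      echo "symlink: $crt -> $(readlink "$crt")"
    else
      echo "regular: $crt"
    fi
  done
}

# Combine with the locate pipeline from the entry above, e.g.:
# classify_cert $(locate -r '.+\.crt' | grep -v "/usr/share/ca-certificates/")
```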

man $(ls -1 /usr/share/man/man?/ | shuf -n1 | cut -d. -f1)
2010-08-20 23:36:10
User: dooblem
Functions: cut ls man
Tags: man sort shuf

Another one.

Maybe not the quickest because of the sort command, but it will also look in other man sections.

updated with goodevilgenius 'shuf' idea

man $(ls /bin | shuf | head -1)
2010-08-20 23:12:51
Functions: head ls man
Tags: man

I'm not sure why you would want to do this, but this seems a lot simpler (easier to understand) than the version someone submitted using awk.

man $(ls /bin | sed -n $((RANDOM % $(ls /bin | wc -l) + 1))p)
2010-08-20 17:15:33
User: putnamhill
Functions: ls man sed wc
Tags: man sed ls wc random

Great idea camocrazed. Another twist would be to display a different man page based on the day of the year. The following will continuously cycle through all man pages:

man $(ls /bin | sed -n $(($(date +%j) % $(ls /bin | wc -l)))p)
dir="/bin"; man $(ls $dir |sed -n "$(echo $(( $RANDOM % $(ls $dir |wc -l | awk "{ print $1; }" ) + 1 )) )p")
2010-08-20 16:31:50
User: camocrazed
Functions: dir ls man sed
Tags: man sed awk echo wc

Broaden your knowledge of the utilities available to you in no particular order whatsoever! Then use that knowledge to create more nifty one-liners that you can post here. =p

Takes a random number modulo the number of files in $dir, prints the filename corresponding to that number, and passes it as an argument to man.
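The selection step can be sketched in isolation; a minimal bash version (the /bin directory and the intermediate variable names are illustrative assumptions):

```shell
# Sketch of the selection step: pick line N from a listing, where N is
# bash's $RANDOM modulo the number of entries, plus 1 (sed line
# addresses are 1-based).
dir="/bin"                       # illustrative directory
count=$(ls "$dir" | wc -l)       # number of candidate entries
n=$(( RANDOM % count + 1 ))      # a value in 1..count
pick=$(ls "$dir" | sed -n "${n}p")
echo "$pick"                     # the randomly chosen filename
```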

ls --quoting-style={escape,shell,c}
ls | sed 's/.*/"&"/'
2010-08-17 15:38:51
User: putnamhill
Functions: ls sed
Tags: sed ls

Looks like you're stuck with sed if your ls doesn't have a -Q option.

ls -Q
ls | sed 's,\(.*\),"\1",'
2010-08-17 14:27:27
User: randy909
Functions: ls sed

I had a file named " " (one space) and needed a way to see what the real filename was so I could remove it. sed to the rescue.

ls | while read -r FILE; do mv -v "$FILE" `echo "prependtext$FILE" `; done
2010-08-14 14:19:18
User: IgnitionWeb
Functions: ls mv read
Tags: echo mv prepen

Prepends all directory items with "prependtext"
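The backticked echo in the entry above adds nothing; plain quoting does the same job. A minimal sketch, assuming bash; prepend_all is an illustrative helper name, and iterating a glob avoids parsing ls output:

```shell
# Sketch: prefix every entry in the given directory with "prependtext".
# prepend_all is an illustrative helper name; quoting "$f" keeps names
# containing spaces intact.
prepend_all() {
  cd "$1" || return
  for f in *; do
    mv -v "$f" "prependtext$f"
  done
}
```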

ls | while read -r FILE; do mv -v "$FILE" `echo $FILE | tr -d ' '`; done
2010-08-14 14:10:48
User: IgnitionWeb
Functions: ls mv read tr
Tags: space echo while tr

All files in the directory get moved; each new name is the original name without spaces (using the translate command, tr).
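The same rename can be done without the tr pipe using bash parameter expansion. A minimal sketch, assuming bash; strip_spaces is an illustrative helper name:

```shell
# Sketch: rename entries in a directory, deleting all spaces from each name.
# ${f// /} is bash parameter expansion: replace every space with nothing.
strip_spaces() {
  cd "$1" || return
  for f in *\ *; do
    [ -e "$f" ] || continue     # no matches: the literal pattern remains
    mv -v "$f" "${f// /}"
  done
}
```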

ls | perl -lne '++$x{lc $1} if /[.](.+)$/ }{ print for keys %x'
2010-08-13 20:05:15
User: recursiverse
Functions: ls perl

All with only one pipe. Should be much faster as well (sort is slow). Use find instead of ls for recursion or reliability.

Edit: case insensitive

ls -Xp /path/to/dir | grep -Eo "\.[^/]+$" | uniq
2010-08-12 16:32:54
User: karpoke
Functions: grep ls
Tags: uniq ls grep

If, for files with more than one extension, like .tar.gz, we want only the last one (.gz) to appear:

ls -Xp /path/to/dir | grep -Eo "\.[^./]+$" | uniq
rsync -av --link-dest=$(ls -1d /backup/*/ | tail -1) /data/ /backup/$(date +%Y%m%d%H%M)/
2010-08-05 19:36:24
User: dooblem
Functions: date ls rsync tail
Tags: backup rsync

'data' is the directory to backup, 'backup' is directory to store snapshots.

Back up files on a regular basis using hard links. Very efficient and quick, and the backup data is directly available.

Same as explained here:


in one line.

If you use du to check the size of your backups, the first backup accounts for all the space, while later backups count only the files that have changed.

ls -d $PWD/*
ls | sed s#^#$(pwd)/#
2010-08-04 20:47:44
User: randy909
Functions: ls sed

This version is a bit more portable although it isn't extended as easily with '-type f' etc. On AIX the find command doesn't have -maxdepth or equivalent.

gominify() { if [ $# -ne 2 ]; then echo 'gominify < src > < dst >'; return; fi; s="$1"; d="$2"; java -jar yui.jar $s >$d; if [ $? == 0 ]; then a=$( ls -sh $s | awk '{print $1}' ); b=$( ls -sh $d | awk '{print $1}' ); echo "Saved $s ($a) to $d ($b)"; fi;}
2010-08-03 10:19:24
User: meathive
Functions: awk echo ls

This command, or a derivative like it, is a must-have if you're a server administrator interested in website optimization: https://kinqpinz.info/?%C2%B6=287a7ba6

Command requires Yahoo's YUI, find it here: http://developer.yahoo.com/yui/

cat $(ls -c | grep ogg | tac ) > directory/test.ogg
sudo ls -l $(eval echo "/proc/{$(echo $(pgrep java)|sed 's/ /,/')}/fd/")|grep log|sed 's/[^/]* //g'|xargs -r tail -f
2010-07-30 18:20:00
User: vutcovici
Functions: echo eval grep ls sed sudo tail xargs

Tail all logs opened by all java processes. This is helpful when you are in a new environment and do not know where the logs are located. Instead of java you can put any process name. This command works only on Linux.

The list of all log files opened by java process:

sudo ls -l $(eval echo "/proc/{$(echo $(pgrep java)|sed 's/ /,/')}/fd/")|grep log|sed 's/[^/]* //g'
ls !(*.gz)
2010-07-29 23:47:26
User: c0t0d0
Functions: ls
Tags: ls glob

Negated shell globs come built into bash. Make sure to turn on extended pattern matching with 'shopt -s extglob'.

ls -I "*.gz"
2010-07-29 22:40:19
User: CodSpirit
Functions: ls
Tags: ls glob

Hides some entries from listing.

ls *[^.gz]
2010-07-29 20:25:48
User: elofland
Functions: ls
Tags: ls glob

I've been looking for a way to do this for a while: a 'not' pattern for shell globs. This works for my case (grabbing logs from a remote server via scp), though note that [^.gz] matches a single character, so this excludes any name whose last character is '.', 'g', or 'z', not only *.gz files.

ls -la | grep $(date +%Y-%m-%d) | egrep -v -e '\.{1,2}' | sed "s/.*\:[0-9]\{2\} \(.\+\)$/\\1/g"