
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).


News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try and put an end to the spamming, new commands now require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.

Terminal - Commands using ls - 448 results
man $(ls /bin | sed -n $((RANDOM % $(ls /bin | wc -l) + 1))p)
2010-08-20 17:15:33
User: putnamhill
Functions: ls man sed wc
Tags: man sed ls wc random
-2

Great idea camocrazed. Another twist would be to display a different man page based on the day of the year. The following will continuously cycle through all man pages:

man $(ls /bin | sed -n $(($(date +%j) % $(ls /bin | wc -l)))p)
dir="/bin"; man $(ls $dir |sed -n "$(echo $(( $RANDOM % $(ls $dir |wc -l | awk "{ print $1; }" ) + 1 )) )p")
2010-08-20 16:31:50
User: camocrazed
Functions: dir ls man sed
Tags: man sed awk echo wc
-2

Broaden your knowledge of the utilities available to you in no particular order whatsoever! Then use that knowledge to create more nifty one-liners that you can post here. =p

Takes a random number modulo the number of files in $dir, prints the filename corresponding to that number, and passes it as an argument to man.

ls --quoting-style={escape,shell,c}
ls | sed 's/.*/"&"/'
2010-08-17 15:38:51
User: putnamhill
Functions: ls sed
Tags: sed ls
-5

Looks like you're stuck with sed if your ls doesn't have a -Q option.

ls -Q
ls | sed 's,\(.*\),"\1",'
2010-08-17 14:27:27
User: randy909
Functions: ls sed
-2

I had a file named " " (one space) and needed a way to see what the real filename was so I could remove it. sed to the rescue.
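
A related approach, in case the name is completely unprintable: look up the file's inode with ls -i, then delete by inode number with find (the 123456 below is just a placeholder; -inum and -delete are supported by both GNU and BSD find):

ls -i
find . -maxdepth 1 -inum 123456 -delete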

ls | while read -r FILE; do mv -v "$FILE" `echo "prependtext$FILE" `; done
2010-08-14 14:19:18
User: IgnitionWeb
Functions: ls mv read
Tags: echo mv prepend
-4

Prepends all directory items with "prependtext"
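
For what it's worth, a plain for loop over the glob avoids parsing ls output and copes better with unusual filenames; a minimal sketch with the same "prependtext" prefix:

for f in *; do mv -v -- "$f" "prependtext$f"; done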

ls | while read -r FILE; do mv -v "$FILE" `echo $FILE | tr -d ' '`; done
2010-08-14 14:10:48
User: IgnitionWeb
Functions: ls mv read tr
Tags: space echo while tr
-3

All files in the directory get moved; in the process, each new name is the original name without spaces (using the translate command, tr).
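
A sketch of the same rename using bash parameter expansion instead of tr (assumes bash; the *\ * glob only matches names that actually contain a space, so other files are left alone):

for f in *\ *; do mv -v -- "$f" "${f// /}"; done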

ls | perl -lne '++$x{lc $1} if /[.](.+)$/ }{ print for keys %x'
2010-08-13 20:05:15
User: recursiverse
Functions: ls perl
-3

All with only one pipe. Should be much faster as well (sort is slow). Use find instead of ls for recursion or reliability.

Edit: case insensitive
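
A sketch of that find suggestion, assuming GNU find for -printf '%f' (which prints just the basename):

find . -type f -printf '%f\n' | perl -lne '++$x{lc $1} if /[.]([^.]+)$/ }{ print for keys %x'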

ls -Xp /path/to/dir | grep -Eo "\.[^/]+$" | uniq
2010-08-12 16:32:54
User: karpoke
Functions: grep ls
Tags: uniq ls grep
0

If we want files with more than one extension, like .tar.gz, to show only the last one, .gz:

ls -Xp /path/to/dir | grep -Eo "\.[^./]+$" | uniq
rsync -av --link-dest=$(ls -1d /backup/*/ | tail -1) /data/ /backup/$(date +%Y%m%d%H%M)/
2010-08-05 19:36:24
User: dooblem
Functions: date ls rsync tail
Tags: backup rsync
1

'data' is the directory to back up, 'backup' is the directory to store snapshots in.

Back up files on a regular basis using hard links. Very efficient and quick, and the backup data is directly accessible.

Same as explained here:

http://blog.interlinked.org/tutorials/rsync_time_machine.html

in one line.

Using du to check the size of your backups, the first backup accounts for all the space, and later backups count only the files that have changed.
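
For reference, a minimal sketch of the same idea as a standalone script (paths are examples; the first run simply has no earlier snapshot to link against):

#!/bin/sh
# keep hard-linked snapshots of /data under /backup
src=/data/
dest=/backup
new="$dest/$(date +%Y%m%d%H%M)/"
last=$(ls -1d "$dest"/*/ 2>/dev/null | tail -1)
if [ -n "$last" ]; then rsync -av --link-dest="$last" "$src" "$new"; else rsync -av "$src" "$new"; fi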

ls -d $PWD/*
ls | sed s#^#$(pwd)/#
2010-08-04 20:47:44
User: randy909
Functions: ls sed
2

This version is a bit more portable although it isn't extended as easily with '-type f' etc. On AIX the find command doesn't have -maxdepth or equivalent.
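
Another way to sidestep parsing ls altogether is to let the shell glob do the work; a small sketch:

printf '%s\n' "$PWD"/*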

gominify() { if [ $# -ne 2 ]; then echo 'gominify < src > < dst >'; return; fi; s="$1"; d="$2"; java -jar yui.jar $s >$d; if [ $? == 0 ]; then a=$( ls -sh $s | awk '{print $1}' ); b=$( ls -sh $d | awk '{print $1}' ); echo "Saved $s ($a) to $d ($b)"; fi;}
2010-08-03 10:19:24
User: meathive
Functions: awk echo ls
-2

This command, or a derivative like it, is a must-have if you're a server administrator interested in website optimization: https://kinqpinz.info/?%C2%B6=287a7ba6

Command requires Yahoo's YUI, find it here: http://developer.yahoo.com/yui/
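
Hypothetical usage, assuming yui.jar sits where the function expects it (filenames are just examples):

gominify site.js site.min.js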

cat $(ls -c | grep ogg | tac ) > directory/test.ogg
sudo ls -l $(eval echo "/proc/{$(echo $(pgrep java)|sed 's/ /,/')}/fd/")|grep log|sed 's/[^/]* //g'|xargs -r tail -f
2010-07-30 18:20:00
User: vutcovici
Functions: echo eval grep ls sed sudo tail xargs
-1

Tail all logs that are opened by all java processes. This is helpful when you are in a new environment and do not know where the logs are located. Instead of java you can put any process name. This command only works on Linux.

The list of all log files opened by the java processes:

sudo ls -l $(eval echo "/proc/{$(echo $(pgrep java)|sed 's/ /,/')}/fd/")|grep log|sed 's/[^/]* //g'
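
If lsof is installed, a similar listing is possible without poking /proc directly; a sketch (still Linux-centric, relying on pgrep -d for a comma-separated PID list):

sudo lsof -p "$(pgrep -d, java)" | grep log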
ls !(*.gz)
2010-07-29 23:47:26
User: c0t0d0
Functions: ls
Tags: ls glob
28

Negative shell globs already come with bash. Make sure to turn on extended pattern matching with 'shopt -s extglob'.
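
For example, to turn it on and exclude more than one pattern at once:

shopt -s extglob
ls !(*.gz|*.bz2)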

ls -I "*.gz"
2010-07-29 22:40:19
User: CodSpirit
Functions: ls
Tags: ls glob
7

Hides some entries from listing.

ls *[^.gz]
2010-07-29 20:25:48
User: elofland
Functions: ls
Tags: ls glob
1

I've been looking for a way to do this for a while: a 'not' pattern for shell globs. This works; I'm using it to grab logs from a remote server via scp.

ls -la | grep $(date +%Y-%m-%d) | egrep -v -e '\.{1,2}' | sed "s/.*\:[0-9]\{2\} \(.\+\)$/\\1/g"
today=`date +%d`; ls -ltr | rm -f `nawk -v _today=$today '{ if($5 != 0 && $7 < _today) { print $9 } }'`
2010-07-29 13:47:19
User: alex__
Functions: ls rm
0

Delete all files whose size is not 0 and which are older than the current day.
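
A find-based sketch of the same cleanup that avoids parsing ls (assumes GNU find; -daystart makes -mtime count from midnight, so -mtime +0 means modified before today):

find . -maxdepth 1 -daystart -type f -size +0c -mtime +0 -delete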

ls -l --time-style=+%Y-%m-%d | awk "/$(date +'%Y-%m-%d')/ {print \$7}"
2010-07-29 05:30:29
Functions: awk ls
1

This version eliminates the grep before the awk, which is always good. It works for GNU core utils and ensures that the date output of ls matches the format in the pattern match, regardless of locale, etc.

On BSD-based systems, you can easily eliminate both the grep and the awk:

find . -maxdepth 1 -Btime -$(date +%kh%lm) -type f

for file in $(ls /usr/bin ) ; do man -w $file 2>> nomanlist.txt >/dev/null ; done
2010-07-26 19:39:53
User: camocrazed
Functions: file ls man
Tags: man
-2

This takes quite a while on my system. You may want to test it out with /bin first, or background it and keep working.

If you want to get rid of the "No manual entry for [whatever]" and just have the [whatever], use the following sed command after this one finishes.

sed -n 's/^No manual entry for \(.*\)/\1/p' nomanlist.txt
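
A variant of the same idea that skips the intermediate file and prints only the names (a sketch that globs /usr/bin/* rather than parsing ls):

for f in /usr/bin/*; do man -w "${f##*/}" >/dev/null 2>&1 || echo "${f##*/}"; done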
rm $( ls | egrep -v 'abc|\s' )
2010-07-18 10:59:15
User: dbbolton
Functions: egrep ls rm
Tags: grep rm
-1

Really, you deserve whatever happens if you have a whitespace character in a file name, but this has a small safety net. The truly paranoid will use '-i'.

url=http://www.youtube.com/watch?v=V5bYDhZBFLA; youtube-dl -b $url; mplayer $(ls ${url##*=}*| tail -n1) -ss 00:57 -endpos 10 -vo gif89a:fps=5:output=output.gif -vf scale=400:300 -nosound
2010-07-18 02:11:39
User: zed
Functions: ls tail
12

requires "youtube-dl" -- sure you can do this with wget and some more obscurity but why waste your time when this great tool is available?

the guts consist of mplayer converting a video to a gif -- study this command and read the man page for more information

mplayer video.flv -ss 00:23 -endpos 6 -vo gif89a:fps=5:output=output.gif -vf scale=400:300 -nosound

generates a 6 second gif starting at 23 seconds of play time at 5 fps and a scale of 400x300

start time (-ss)/end time (-endpos) formats: 00:00:00.000

end time should be relative to the start time, not absolute; i.e. -endpos 5 means 5 seconds after the start, so with -ss 0:42 the clip ends at 0:47

play with fps and scale for lower gif sizes

the subshell $(ls ${url##*=}*| tail -n1) works around youtube-dl's -b flag, which downloads the best-quality video available and can produce any of several video formats
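
As an illustration of the fps/scale trade-off mentioned above, the same clip at a lower frame rate and smaller size (filenames are examples):

mplayer video.flv -ss 00:23 -endpos 6 -vo gif89a:fps=3:output=small.gif -vf scale=200:150 -nosound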