What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).

May 19, 2015 - A Look At The New Commandlinefu
I've put together a short writeup on what kind of newness you can expect from the next iteration of clfu. Check it out here.
March 2, 2015 - New Management
I'm Jon, I'll be maintaining and improving clfu. Thanks to David for building such a great resource!


Psst. Open beta.

Wow, didn't really expect you to read this far down. The latest iteration of the site is in open beta. It's a gentle open beta, not in prime time just yet. It's being hosted over at UpGuard (link) and you are more than welcome to give it a shot. A couple of things:

  • » The open beta is running a copy of the database that will not carry over to the final version. Don't post anything you don't mind losing.
  • » If you wish to use your user account, you will probably need to reset your password.
Your feedback is appreciated via the form on the beta page. Thanks! -Jon & CLFU Team

Terminal - Commands using file - 147 results
for i in $(file * | grep broken | cut -d : -f 1); do rm $i; done
for file in *.png; do mogrify -trim "$file"; done
for file in `ls *.png`; do convert -trim $file $file; done
for file in `cat urls.txt`; do echo -n "$file " >> log.txt; curl --head $file >> log.txt ; done
2010-10-19 02:54:13
User: Glutnix
Functions: echo file

urls.txt should have a fully qualified URL on each line

prefix with

rm log.txt;

to clear the log

change curl command to

curl --head $file | head -1 >> log.txt

to log just the HTTP status

file /bin/* | msort -j -l -n-1 -n2 2> /dev/null
2010-10-05 00:37:33
User: b_t
Functions: file
Tags: sort msort


1) -n-1 means the sort key is the last field

2) -l is important if each separate record is on its own line (usually the case for text files)

3) -j tells msort not to create a log file (msort.log) in the working directory

4) You may need to install the msort package.

5) msort can do a lot more; see man msort
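If msort isn't installed, a rough approximation of sorting by the last field can be built from awk, sort and cut. This is only a sketch, not a replacement for msort's numeric multi-key options:

```shell
# Sketch: sort lines by their last whitespace-separated field by
# prefixing each line with that field, sorting, then stripping the
# prefix (assumes the input contains no tab characters).
sort_by_last_field() {
  awk '{print $NF "\t" $0}' | sort | cut -f2-
}

# e.g.: file /bin/* | sort_by_last_field
```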

rd(){ while read a ;do printf "$a\n";sleep ${1-1};done ;} # usage: rd < file ; or ... | rd
2010-10-03 04:16:03
User: argv
Functions: file printf read sleep

usage examples

ls largedir |rd

lynx -dump largewebsite.com |rd

rd < largelogfile

file=orig.ps; for i in $(seq `grep "Pages:" $file | sed 's/%%Pages: //g'`); do psselect $i $file $i\_$file; done
2010-09-24 19:44:32
User: damncool
Functions: file sed seq

Splits a PostScript file into multiple PostScript files, generating one output file per page of the input. The output files are numbered, for example 1_orig.ps, 2_orig.ps, ...

The psselect command is part of the psutils package

for file in *.jpg; do convert "$file" -resize 800000@ -quality 80 "small.$file"; done
2010-09-13 19:06:14
User: grinob
Functions: file
Tags: xargs convert

Converts all JPEGs in the current directory to roughly 1024x768 pixels (-resize 800000@ targets a total area of ~800,000 pixels) at quality 80, giving files of roughly 150 KB each

declare -i i=0 ; for file in * ; do i=$[$i+1] ; mv "$file" $i; done
declare -i i; i=0; for file in *; do i=$(expr $i + 1); mv "$file" $i; done;
2010-08-26 12:24:38
User: themiurgo
Functions: file mv

Renames files in a directory to incremental numbers, following alphabetical order. The command does not preserve extensions.
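If you do want to keep extensions, a variant along these lines works. This is a sketch; it assumes every file name actually contains a dot (a file without one would keep its whole name as the "extension"):

```shell
# Sketch: rename all regular files in the current directory to
# incremental numbers while preserving each file's extension.
renumber() {
  local i=0 file
  for file in *; do
    [ -f "$file" ] || continue
    i=$((i + 1))
    mv -- "$file" "$i.${file##*.}"
  done
}
```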

file /music/dir/* | grep -v 44.1 | sed 's/:.*//g' | grep .mp3 | { while IFS= read; do filebak="\"$REPLY.original\""; file="\"$REPLY\""; mv $file $filebak; sox -t mp3 $filebak $file rate 44k; done; };
2010-08-12 21:53:28
User: IgnitionWeb
Functions: file grep mv sed
Tags: mp3 sox resample

This heavy one-liner gets all the files in the "/music/dir/" directory and filters for mp3 files that are not already at 44.1 kHz. It then passes the names to sox in order to resample those files. The original files are kept, just in case.

file=ftp://ftp.gimp.org/pub/gimp/v2.6/gimp-2.6.10.tar.bz2; ssh server "wget $file -O -" > $PWD/${file##*/}
2010-08-02 15:59:45
User: michaelmior
Functions: file ssh
Tags: ssh bash download

This command will download $file via server. I've used this when FTP was broken at the office and I needed to download some software packages.

for file in $(ls /usr/bin ) ; do man -w $file 2>> nomanlist.txt >/dev/null ; done
2010-07-26 19:39:53
User: camocrazed
Functions: file ls man
Tags: man

This takes quite a while on my system. You may want to test it out with /bin first, or background it and keep working.

If you want to get rid of the "No manual entry for [whatever]" and just have the [whatever], use the following sed command after this one finishes.

sed -n 's/^No manual entry for \(.*\)/\1/p' nomanlist.txt
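On canned input (the sample lines below are made up), the sed filter behaves like this:

```shell
# Sketch: strip the "No manual entry for " prefix; -n plus the /p flag
# means only lines where the substitution matched are printed.
printf 'No manual entry for foo\nsome other line\nNo manual entry for bar\n' \
  | sed -n 's/^No manual entry for \(.*\)/\1/p'
# prints:
# foo
# bar
```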
gophpdoc() { if [ $# -lt 2 ]; then echo $0 '< file > < title > [ pdf ]'; return; fi; if [ "$3" == 'pdf' ]; then ot=PDF:default:default; else ot=HTML:frames:earthli; fi; phpdoc -o $ot -f "$1" -t docs -ti "$2"; }
2010-06-09 01:15:04
User: meathive
Functions: echo file

A shortcut to generate documentation with phpdoc. Defaults to HTML; optionally outputs PDF if a third argument is given. Stores documentation in the cwd under ./docs/. I forget the syntax of the output (-o) option, so this is easier.

cut -d'/' -f3 file | sort | uniq -c
2010-05-23 16:02:51
User: rubenmoran
Functions: cut file sort uniq

Counts how many times each domain appears in a file whose lines are URLs of the form http://domain/resource.
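For example, on a made-up urls file:

```shell
# Sketch with canned input: splitting on '/' makes the domain field 3
# of http://domain/resource; sort | uniq -c then counts duplicates.
printf 'http://a.com/x\nhttp://b.com/y\nhttp://a.com/z\n' \
  | cut -d'/' -f3 | sort | uniq -c
# counts: 2 a.com, 1 b.com (uniq -c prefixes each line with its count)
```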

grep -Eo \([0-9]\{1,3\}[\.]\)\{3\}[0-9] file | sort | uniq
cat file | sed -n -r '/^100$|^[0-9]{1,2}$/p'
2010-05-15 19:15:56
User: voyeg3r
Functions: cat file sed

-r enables extended regex

^ matches the beginning of the line

| is alternation

The pattern matches 100, or any one- or two-digit number (0-99)
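The same filter written with grep -E, shown on canned input (the sample numbers are made up):

```shell
# Sketch: keep only lines that consist entirely of 100 or a one- or
# two-digit number; anything else (101, foo) is dropped.
printf '5\n42\n100\n101\nfoo\n' | grep -E '^(100|[0-9]{1,2})$'
# prints:
# 5
# 42
# 100
```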

for dir in $(find -type d ! -name CVS); do for file in $(find $dir -maxdepth 1 -type f); do rm $file; cvs delete $file; done; done
2010-04-27 16:03:33
User: ubersoldat
Functions: cvs dir file find rm
Tags: bash cvs delete rm

This will search all directories and ignore the CVS ones. Then it will search all files in the resulting directories and act on them.

find ~/.mozilla/firefox/*/Cache -exec file {} \; | awk -F ': ' 'tolower($2)~/mpeg/{print $1}'
2010-04-19 06:59:55
User: sata
Functions: awk file find

Grab a list of MP3s (with full path) out of Firefox's cache

Ever gone to a site that has an MP3 embedded into a pesky flash player, but no download link? Well, this one-liner will yank the *full path* of those tunes straight out of FF's cache in a clean list.

A shorter, more intuitive version of the command submitted by TuxOtaku

for i in `ls ~/.mozilla/firefox/*/Cache`; do file $i | grep -i mpeg | awk '{print $1}' | sed s/.$//; done
2010-04-11 23:14:18
User: TuxOtaku
Functions: awk file grep sed

Ever gone to a site that has an MP3 embedded into a pesky flash player, but no download link? Well, this one-liner will yank the names of those tunes straight out of FF's cache in a nice, easy to read list. What you do with them after that is *ahem* no concern of mine. ;)

for file in *.flac; do flac -cd "$file" | lame -q 0 --vbr-new -V 0 - "${file%.flac}.mp3"; done
LC_ALL=C sort file | uniq -c | sort -n -k1 -r
for file in *.flac; do $(flac -cd "$file" | lame -h - "${file%.flac}.mp3"); done
2010-03-08 13:37:25
User: schmiddim
Functions: file

make sure that flac and lame are installed

sudo apt-get install lame flac

file -L <library> | grep -q '64-bit' && echo 'library is 64 bit' || echo 'library is 32 bit'
2010-03-07 06:31:35
User: infinull
Functions: echo file grep
Tags: bash

file displays a file's type

the -L flag means follow symlinks (as libraries are often symlinked to another file, this behavior is likely preferred)

more complex behavior (*two* grep commands!) could be used to determine whether or not the file is a shared library.
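A sketch of what that two-grep version might look like. The function name and the matched strings are assumptions; file's exact wording varies between versions and platforms:

```shell
# Sketch: classify a file with two greps on file -L output.
# "shared object" and "executable" are assumed substrings of file(1)'s
# output for ELF libraries and executables respectively.
libtype() {
  local out
  out=$(file -L "$1")
  if printf '%s\n' "$out" | grep -q 'shared object'; then
    echo 'shared library'
  elif printf '%s\n' "$out" | grep -q 'executable'; then
    echo 'executable'
  else
    echo 'other'
  fi
}
```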

findopen() { local PS3="select file: "; select file in $(find "$1" -iname "$2"); do ${3:-xdg-open} $file; break; done }
2010-02-28 02:28:59
User: quigybo
Functions: file find

Lists the files found by find, waits for user input, then uses xdg-open to open the selected file with the appropriate program.

usage: findopen path expression [command]

With the third optional input you can specify a command to use other than xdg-open, for example you could echo the filename to stdout then pipe it to another command.

To get it to work for files with spaces it gets a bit messier...

findopen() { files=( $(find "$1" -iname "$2" | tr ' ' '@') ); select file in "${files[@]//@/ }"; do ${3:-xdg-open} "$file"; break; done }

You can replace the @ with any character that probably won't appear in a file name.
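Another way to handle spaces, without a placeholder character at all, is to read NUL-delimited names into an array. This is a sketch; it assumes bash and a find that supports -print0:

```shell
# Sketch: build the file list NUL-delimited so any character in a file
# name (including spaces) survives intact, then select as before.
findopen() {
  local files=() f
  while IFS= read -r -d '' f; do files+=("$f"); done \
    < <(find "$1" -iname "$2" -print0)
  local PS3="select file: " file
  select file in "${files[@]}"; do "${3:-xdg-open}" "$file"; break; done
}
```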