What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions, …).

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try and put an end to the spamming, new commands require moderation before they will appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.




Terminal - All commands - 11,926 results
rename -fc *
find -not -empty -type f -printf "%-30s'\t\"%h/%f\"\n" | sort -rn -t$'\t' | uniq -w30 -D | cut -f 2 -d $'\t' | xargs md5sum | sort | uniq -w32 --all-repeated=separate
2014-10-19 02:00:55
User: fobos3
Functions: cut find md5sum sort uniq xargs

Finds duplicates based on MD5 sum, comparing only files with the same size. A performance improvement on:

find -not -empty -type f -printf "%s\n" | sort -rn | uniq -d | xargs -I{} -n1 find -type f -size {}c -print0 | xargs -0 md5sum | sort | uniq -w32 --all-repeated=separate

The new version takes around 3 seconds where the old version took around 17 minutes. The bottleneck in the old command was the second find, which searches for files with the specified file size; the new version keeps the file path and size together from the beginning.
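
If MD5 collisions are a concern, the same pipeline works with sha256sum; a sketch (note the uniq width grows to 64, the length of a SHA-256 hex digest):

find -not -empty -type f -printf "%-30s'\t\"%h/%f\"\n" | sort -rn -t$'\t' | uniq -w30 -D | cut -f 2 -d $'\t' | xargs sha256sum | sort | uniq -w64 --all-repeated=separate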

H="--header"; wget $H="Accept-Language: en-us,en;q=0.5" $H="Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8" $H="Connection: keep-alive" -U "Mozilla/5.0 (Windows NT 5.1; rv:10.0.2) Gecko/20100101 Firefox/10.0.2" --referer=urlhere
lftp -u user,pwd -e "set sftp:connect-program 'ssh -a -x -T -c arcfour -o Compression=no'; mirror -v -c --loop --use-pget-n=3 -P 2 /remote/dir/ /local/dir/; quit" sftp://remotehost:22
2014-10-17 00:29:34
User: colemar
Functions: lftp

Mirror a remote directory using some tricks to maximize network speed.

lftp: coolest file transfer tool ever

-u: username and password (pwd is merely a placeholder if you have ~/.ssh/id_rsa)

-e: execute internal lftp commands

set sftp:connect-program: use some specific command instead of plain ssh


-a -x -T: disable things that aren't needed here (agent forwarding, X11 forwarding, pseudo-terminal allocation)

-c arcfour: use a fast, low-overhead cipher (note that arcfour is considered weak and is disabled in recent OpenSSH releases)

-o Compression=no: disable compression to save CPU

mirror: copy remote dir subtree to local dir

-v: be verbose (cool progress bar and speed meter, one for each file in parallel)

-c: continue interrupted file transfers if possible

--loop: repeat mirror until no differences found

--use-pget-n=3: transfer each file with 3 independent parallel TCP connections

-P 2: transfer 2 files in parallel (totalling 6 TCP connections)

sftp://remotehost:22: use sftp protocol on port 22 (you can give any other port if appropriate)

You can play with values for --use-pget-n and/or -P to achieve maximum speed depending on the particular network.

If the files are compressible, removing "-o Compression=no" can be beneficial.

It's best to create an alias for the command.
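
For example, a sketch of such an alias (the name fastmirror is arbitrary; adjust the user, password, host and paths):

alias fastmirror="lftp -u user,pwd -e \"set sftp:connect-program 'ssh -a -x -T -c arcfour -o Compression=no'; mirror -v -c --loop --use-pget-n=3 -P 2 /remote/dir/ /local/dir/; quit\" sftp://remotehost:22"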

sudo hdparm -B 200 /dev/sda
ls | tr '[[:punct:][:space:]]' '\n' | grep -v "^\s*$" | sort | uniq -c | sort -bn
2014-10-14 09:52:28
User: qdrizh
Functions: grep ls sort tr uniq
Tags: sort uniq ls grep tr

I'm sure there's a more elegant sed version for the tr + grep section.
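
One possible GNU sed replacement for the tr + grep stage (a sketch, not necessarily more elegant):

ls | sed 's/[[:punct:][:space:]]\+/\n/g' | sed '/^$/d' | sort | uniq -c | sort -bn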

uname -p
youtube-dl -tci --write-info-json "https://www.youtube.com/watch?v=dQw4w9WgXcQ"
2014-10-13 21:18:34
User: wires

Downloads video files from a bunch of sites (here is the list of supported sites: https://rg3.github.io/youtube-dl/supportedsites.html).

The options say: base the filename on the title (-t), ignore errors (-i) and continue partial downloads (-c). Also, --write-info-json stores some metadata into a .json file.

Paste YouTube user and playlist URLs for extra fun.
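
For instance, the same flags work on a playlist URL (the list ID below is only a placeholder):

youtube-dl -tci --write-info-json "https://www.youtube.com/playlist?list=PLxxxxxxxx"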

Protip: git-annex loves these files

gcloud components list | grep "^| Not" | sed "s/|\(.*\)|\(.*\)|\(.*\)|/\2/" | xargs echo gcloud components update
2014-10-13 20:52:25
User: wires
Functions: echo grep sed xargs

Google Cloud SDK comes with a package manager, `gcloud components`, but it needs a bit of `sed` to work. Modify the "^| Not" bit to change the package selection. (The gcloud --format option is currently broken.)
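
For example, to target packages with a pending update instead (assuming those rows start with "| Update"; the exact label depends on your SDK version):

gcloud components list | grep "^| Update" | sed "s/|\(.*\)|\(.*\)|\(.*\)|/\2/" | xargs echo gcloud components update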

ip a s eth0 | awk -F'[/ ]+' '/inet[^6]/{print $3}'
dd if=/dev/hda | ssh root@remotehost 'dd of=/root/server.img'
2014-10-13 13:43:47
User: suyashjain
Functions: dd ssh

This command takes a full snapshot of your hard disk and creates an image that is stored directly on a remote server through ssh. Here the image of /dev/hda is saved as /root/server.img (remotehost is a placeholder for your server).
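
A variant sketch that adds a larger block size and gzip compression over the wire (again with remotehost as a placeholder; this helps on slow links at the cost of CPU):

dd if=/dev/hda bs=4M | gzip -c | ssh root@remotehost 'gunzip -c | dd of=/root/server.img'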

cat /etc/httpd/logs/access.log | awk '{ print $6}' | sed -e 's/\[//' | awk -F'/' '{print $1}' | sort | uniq -c
2014-10-13 13:39:53
User: suyashjain
Functions: awk cat sed sort uniq

The command reads the Apache log file and reports each requested virtual host along with its number of requests.
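
The same counting can be done in a single awk invocation; a minimal sketch with identical field logic:

awk '{gsub(/\[/,"",$6); split($6,a,"/"); print a[1]}' /etc/httpd/logs/access.log | sort | uniq -c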

sed -e '/PATTERN/ s/^;//' -i test.txt
2014-10-13 13:37:53
User: suyashjain
Functions: sed
Tags: sed

This sed command searches all lines of test.txt for PATTERN (a placeholder for whatever address/regex you want to match) and removes the leading comment symbol ";" from matching lines. You can adapt it for other purposes as well.
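
A quick demonstration of the effect on stdin (the sample lines are made up):

printf ';enabled=true\nother line\n' | sed '/enabled/ s/^;//'

Only the line matching the address loses its leading ";".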

psql -U quassel quassel -c "SELECT message FROM backlog ORDER BY time DESC LIMIT 1000;" | grep my-query
2014-10-12 19:53:06
User: Tatsh
Functions: grep

Replace the psql credentials if necessary, and the my-query part with your query.
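
Alternatively, the filtering can be pushed into SQL itself; a sketch with the same placeholder query:

psql -U quassel quassel -c "SELECT message FROM backlog WHERE message LIKE '%my-query%' ORDER BY time DESC LIMIT 1000;"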

curl -s http://pages.cs.wisc.edu/~ballard/bofh/bofhserver.pl |grep 'is:' |awk 'BEGIN { FS=">"; } { print $10; }'
2014-10-10 21:17:33
User: toj
Functions: awk grep
Tags: curl BOFH

Sure, it's dirty, but it's quick, it only displays the excuse, and it works.
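
The grep stage can be folded into awk, keeping the same field logic; a sketch:

curl -s http://pages.cs.wisc.edu/~ballard/bofh/bofhserver.pl | awk -F'>' '/is:/ { print $10 }'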

ip addr show enp3s0 | awk '/inet[^6]/{print $2}' | awk -F'/' '{print $1}'
for f in */*.ape; do avconv -i "$f" "${f%.ape}.flac"; done
2014-10-10 12:33:00
User: qdrizh

Converts all Monkey's Audio files below the current directory to FLAC.

For only current directory, use `for f in *.ape; do avconv -i "$f" "${f%.ape}.flac"; done`

To remove APE files afterward, use `rm */*.ape`
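
A safer variant sketch deletes each APE file only if its conversion succeeded: `for f in */*.ape; do avconv -i "$f" "${f%.ape}.flac" && rm "$f"; done`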

mtr www.google.com
firefox $(grep -i '^URL=' file.url | cut -b 5-)
2014-10-08 05:56:27
User: nachos117
Functions: cut grep

This command uses grep to read the shortcut (which in the above example is file.url) and filter out all but the one important line: the one containing the website URL, plus some extra characters that need to be removed (for example, URL=http://example.com). The cut command then strips the URL= from the beginning. The output is passed to Firefox, which should interpret it as a web URL to be opened. Of course, you can replace Firefox with any other browser. Tested in bash and sh.
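
To stay browser-agnostic on Linux desktops, xdg-open can stand in for Firefox; a sketch:

xdg-open "$(grep -i '^URL=' file.url | cut -b 5-)"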

egrep -wi --color 'warning|error|critical'
alias lp="echo -n \"some text to copy\" | pbcopy; sleep 120 && echo -n \"done\" | pbcopy &"
2014-10-05 19:43:49
User: wsams
Functions: alias
Tags: alias pbcopy

This alias is useful if you need to use some text often. Executing the alias copies the text into your clipboard and then overwrites it with "done" after 120 seconds (adjust the sleep value as needed).
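
Since an alias hard-codes the text, a small shell function makes both the text and the delay arguments; a sketch assuming macOS pbcopy (the name lp is just as arbitrary here):

lp() { printf '%s' "$1" | pbcopy; { sleep "${2:-120}" && printf 'done' | pbcopy; } & }

Usage: lp "some text to copy" 60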

eog someimg.jpg
url=`curl http://proxybay.info/ | awk -F'href="|" |">|</' '{for(i=2;i<=NF;i=i+4) print $i,$(i+2)}' | grep follow|sed 's/^.\{19\}//'|shuf -n 1` && firefox "$url"
2014-10-04 19:08:13
User: dunryc
Functions: awk grep sed

Polls the Pirate Bay mirrors list, chooses a random site and opens it for you in Firefox.