What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

Get involved!

You can sign-in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).

May 19, 2015 - A Look At The New Commandlinefu
I've put together a short writeup on what kind of newness you can expect from the next iteration of clfu. Check it out here.
March 2, 2015 - New Management
I'm Jon, I'll be maintaining and improving clfu. Thanks to David for building such a great resource!

Psst. Open beta.

Wow, didn't really expect you to read this far down. The latest iteration of the site is in open beta. It's a gentle open beta, not in prime time just yet. It's being hosted over at UpGuard (link) and you are more than welcome to give it a shot. A couple of things:

  • » The open beta is running a copy of the database that will not carry over to the final version. Don't post anything you don't mind losing.
  • » If you wish to use your user account, you will probably need to reset your password.
Your feedback is appreciated via the form on the beta page. Thanks! -Jon & CLFU Team

Terminal - Commands using wget - 262 results
wget --spider -v http://www.server.com/path/file.ext
wget `lynx -dump http://www.ebow.com/ebowtube.php | grep .flv$ | sed 's/[[:blank:]]\+[[:digit:]]\+\. //g'`
2009-08-02 14:09:53
User: spaceyjase
Functions: grep sed wget

I wanted all the 'hidden' .flv files from the http link in the command line; wget seemed appropriate, fed with output from lynx, grepping the .flv files, which are then normalised via sed (to remove the numeric bullet). Similar to the 'Grab mp3 files' fu. Replace the link with your own, and the grep arg with something more interesting ;) See here for something along the same lines...


Hope you find it useful! Improvements welcome, naturally.
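
As a hedged example of swapping the grep pattern, the same pipeline could be pointed at a page of MP3 links instead (the URL and extension below are placeholders, untested):

wget `lynx -dump http://www.example.com/page.html | grep '\.mp3$' | sed 's/[[:blank:]]\+[[:digit:]]\+\. //g'`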

sudo wget -c "http://nmap.org/dist/nmap-5.00.tar.bz2" && bzip2 -cd nmap-5.00.tar.bz2 | tar xvf - && cd nmap-5.00 && ./configure && make && sudo make install
2009-07-26 11:36:53
User: hemanth
Functions: bzip2 cd make sudo tar wget

Just copy and paste the code in your terminal.

Note: sudo apt-get for Debian versions; change as per your requirements.

Source: www.h3manth.com
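
If the note above refers to installing build prerequisites with apt-get, a hedged example would be (the package names are an assumption, not from the original post):

sudo apt-get install build-essential libssl-dev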

wget -q -O - 'URL/full?orderby=starttime&singleevents=true&start-min=2009-06-01&start-max=2009-07-31' | perl -lane '@m=$_=~m/<title type=.text.>(.+?)</g;@a=$_=~m/startTime=.(2009.+?)T/g;shift @m;for ($i=0;$i<@m;$i++){ print $m[$i].",".$a[$i];}';
2009-07-23 14:48:54
Functions: perl wget

substitute the URL with your private/public XML url from calendar sharing settings

substitute the dates YYYY-mm-dd

adjust the perl parsing part for your needs

wget <URL> -O- | wget -i -
wget -r --wait=5 --quota=5000m --tries=3 --directory-prefix=/home/erin/Documents/erins_webpages --limit-rate=20k --level=1 -k -p -erobots=off -np -N --exclude-domains=del.icio.us,doubleclick.net -F -i ./delicious-20090629.htm
2009-07-02 01:46:21
User: bbelt16ag
Functions: wget

Just an alternative using a saved HTML file of all of my bookmarks. Works well, although it takes a while.

wget -q --user=<username> --password=<password> 'https://updates.opendns.com/nic/update?hostname=your_opendns_hostname&myip=your_ip' -O -
2009-06-22 18:08:42
User: Alanceil
Functions: wget

Intended for dynamic ip OpenDNS users, this command will update your OpenDNS network IP.

For getting your IP, you can use one of the many one-liners here on commandlinefu.


I use this in a script which is run by kppp after it has successfully connected to my ISP:



IP="`curl -s http://checkip.dyndns.org/ | grep -o '[[:digit:].]\+'`"


if [ "$IP" == "" ] ; then echo 'Not online.' ; exit 1


wget -q --user=topsecret --password="`echo $PW | xxd -ps -r`" 'https://updates.opendns.com/nic/update?hostname=myhostname&myip='"$IP" -O -

/etc/init.d/ntp-client restart &



PS: DynDNS should use a similar method; if you know the URL, please post a comment. (Something with members.dyndns.org, if I recall correctly.)
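
For what it's worth, here is a hedged sketch of the DynDNS equivalent, assuming the members.dyndns.org endpoint accepts the same hostname/myip parameters (credentials and hostname are placeholders):

wget -q --user=dyndns_user --password=dyndns_pass 'https://members.dyndns.org/nic/update?hostname=myhostname.dyndns.org&myip='"$IP" -O -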

wget -qO- "$URL" | htmldoc --webpage -f "$URL".pdf - ; xpdf "$URL".pdf &
wget -H -r -nv --level=1 -k -p -erobots=off -np -N --exclude-domains=del.icio.us,doubleclick.net --exclude-directories=
wget -q -O- http://www.gutenberg.org/dirs/etext96/cprfd10.txt | sed '1,419d' | tr "\n" " " | tr " " "\n" | perl -lpe 's/\W//g;$_=lc($_)' | grep "^[a-z]" | awk 'length > 1' | sort | uniq -c | awk '{print $2"\t"$1}'
2009-05-04 16:00:39
User: alperyilmaz
Functions: awk grep perl sed sort tr uniq wget

This command might not be useful for most of us; I just wanted to share it to show the power of the command line.

Download the plain-text version of the novel David Copperfield from Project Gutenberg and then generate a single column of words, after which the occurrences of each word are counted by the sort | uniq -c combination.

This command removes numbers and single characters from the count. I'm sure you can write a shorter version.
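
One possible shorter variant (an untested sketch that keeps the same header-skipping sed) uses tr -cs to split on anything that is not a letter:

wget -q -O- http://www.gutenberg.org/dirs/etext96/cprfd10.txt | sed '1,419d' | tr -cs 'A-Za-z' '\n' | tr 'A-Z' 'a-z' | awk 'length > 1' | sort | uniq -c | awk '{print $2"\t"$1}'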

wget -q -O - "$@" <url>
wget --server-response --spider http://www.example.com/
2009-03-31 18:49:14
User: penpen
Functions: wget

Let me suggest using wget for obtaining the HTTP header only as a last resort, because it generates considerable textual overhead. The first ellipsis of the sample output stands for

Spider mode enabled. Check if remote file exists.

--2009-03-31 20:42:46-- http://www.example.com/

Resolving www.example.com...

Connecting to www.example.com||:80... connected.

HTTP request sent, awaiting response...

and the second one stands for

Length: 438 [text/html]

Remote file exists and could contain further links,

but recursion is disabled -- not retrieving.
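
If curl is available, a HEAD request yields the same headers with far less noise (a minimal alternative, not part of the original post):

curl -s -I http://www.example.com/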

wget --http-user=YourUsername --http-password=YourPassword http://YourWebsiteUrl:2082/getbackup/backup-YourWebsiteUrl-`date +"%-m-%d-%Y"`.tar.gz
2009-03-31 17:50:41
User: nadavkav
Functions: wget

This will connect to your hosted website service through the cPanel interface and use its backup tool to back up and download the entire website locally.

(Do not forget to replace YourUsername, YourPassword and YourWebsiteUrl for it to work.)

wget -c -t 1 --load-cookies ~/.cookies/rapidshare <URL>
2009-03-28 09:13:35
User: cammarin
Functions: wget

This is the download part (the companion command below saves the login cookie first).

NOTE: the '-c' option seems not to work very well; the download sometimes gets stuck at 99% even though wget itself finishes with no problem, and it may restart after completing, in which case you can simply cancel it. I don't know if it is a wget or a Rapidshare glitch, since I don't have these problems with Megaupload, for example.

UPDATE: as pointed out by roebek, the restart glitch can be solved by the "-t 1" option. Thanks a lot.

wget --save-cookies ~/.cookies/rapidshare --post-data "login=USERNAME&password=PASSWORD" -O - https://ssl.rapidshare.com/cgi-bin/premiumzone.cgi > /dev/null
2009-03-28 09:12:02
User: cammarin
Functions: wget

In order to do that, first you need to save a cookie file with your account info; this command does it (you may need to create the '.cookies' dir beforehand, as shown below). Also, you need to check the "Direct downloads" option on the Premium Zone >> Settings tab.

You need to do this only once (as long as you keep the file and your Rapidshare Premium account).
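
If the '.cookies' directory does not exist yet, create it first and keep it private, since it will hold your session cookie:

mkdir -p ~/.cookies && chmod 700 ~/.cookies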

wget -qO- whatismyip.org
wget -q http://xyz.gpg -O- | sudo apt-key add -
2009-03-25 12:18:36
User: gnuyoga
Functions: sudo wget
Tags: wget apt-key

When we add a new package source to aptitude (the Debian package manager), we need to add its GPG key as well, otherwise it will show a warning/error about the missing key.
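
To confirm the key was imported, you can list the keys apt currently trusts:

sudo apt-key list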

wget -r ftp://user:[email protected]
2009-03-09 19:39:30
User: movaxes
Functions: wget
Tags: wget

If the username includes an @, you can use this one:

wget -r --user=username_here --password=pass_here ftp://ftp.example.com
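
Alternatively, the @ inside the username can usually be percent-encoded as %40 directly in the URL (a hedged sketch with placeholder credentials):

wget -r 'ftp://user%40example.com:[email protected]/'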

wget -S -O/dev/null "INSERT_URL_HERE" 2>&1 | grep Server
2009-03-09 06:54:54
User: asmoore82
Functions: grep wget

the good:

Server: Apache/2.2.8 (Ubuntu) PHP/5.2.4-2ubuntu5.4 with Suhosin-Patch

the bad:

Server: Microsoft-IIS/6.0

and the ugly:

Server: Apache/2.2.10 (Win32) mod_ssl/2.2.10 OpenSSL/0.9.8i PHP/5.2.6

wget -c -v -S -T 100 --tries=0 `curl -s http://ms1.espectador.com/podcast/espectador/la_venganza_sera_terrible.xml | grep -v xml | grep link | sed 's/<[^>]*>//g'`
2009-03-04 13:12:28
User: fmdlc
Functions: grep link sed wget

This downloads a complete audio podcast.

vlc --one-instance --playlist-enqueue -q $(while read netcast; do wget -q $netcast -O - |grep enclosure | tr '\r' '\n' | tr \' \" | sed -n 's/.*url="\([^"]*\)".*/\1/p'|head -n1; done <netcast.txt)
2009-03-03 04:26:01
User: tomwsmf
Functions: read sed tr wget

This is a quick line to stream in the latest offerings from your favourite netcasts/podcasts. You will need a file named netcast.txt in the directory you run this from. This file should have one, and only one, of your netcast/podcast URLs per line.

When run, the line grabs the offering at the top of each netcast/podcast stack and sends it over, quietly, to vlc.

Since I move around computers during the day, I wanted an easy way to listen to my daily dose of news and such without having to worry about downloading to whatever machine I am on. This is just a quick grab and stream of what's current.

Future plans: have the list of netcasts be read from the web, possibly from an RSS feed or such. I use Google Reader, so there might be a way to use it as the source so as not to have to muck with multiple lists.
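
For the "read the list from the web" idea, one hedged sketch replaces the local file with a fetched one (the list URL is a placeholder):

vlc --one-instance --playlist-enqueue -q $(wget -q -O - http://example.com/netcasts.txt | while read netcast; do wget -q "$netcast" -O - | grep enclosure | tr '\r' '\n' | tr \' \" | sed -n 's/.*url="\([^"]*\)".*/\1/p' | head -n1; done)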

alias wordpress='mkdir wordpress && cd wordpress && wget http://wordpress.org/latest.tar.gz && tar -xvzf latest.tar.gz && mv wordpress/* . && rm -rf latest.tar.gz wordpress && cp wp-config-sample.php wp-config.php'
wget -r -l1 -H -t1 -nd -N -np -A.mp3 -erobots=off -i ~/sourceurls.txt
2009-02-19 03:52:47
User: tomhiggins
Functions: wget

This was taken from http://www.veen.com/jeff/archives/000573.html. The line will grab all the mp3 files from the URLs listed in the text file sourceurls.txt (one URL per line). A much more complete breakdown of the line can be found at the website mentioned above.

wget -qO - http://infiltrated.net/blacklisted|awk '!/#|[a-z]/&&/./{print "iptables -A INPUT -s "$1" -j DROP"}'
2009-02-18 16:08:23
User: sil
Functions: wget

Blacklisted is a compiled list of all known dirty hosts (botnets, spammers, bruteforcers, etc.), which is updated on an hourly basis. This command will fetch the list and create the rules for you; if you want them applied automatically, append |sh to the end of the command line. It is more practical to block everything and allow in specifics; however, many can't or don't do this, which is where this command comes in handy. For those using ipfw, a quick fix would be {print "add deny ip from "$1" to any"}. Posted in the sample output are the top two entries. Be advised that the blacklisted file itself filters out RFC1918 addresses (10.x.x.x, 172.16-31.x.x, 192.168.x.x); even so, it is advisable to check/parse the list before you implement the rules, as shown below.
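
Following that advice, a hedged three-step version downloads the list, lets you review it, and only then feeds the generated rules to the shell:

wget -qO - http://infiltrated.net/blacklisted > /tmp/blacklisted
less /tmp/blacklisted
awk '!/#|[a-z]/&&/./{print "iptables -A INPUT -s "$1" -j DROP"}' /tmp/blacklisted | sh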