
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).



News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try and put an end to the spamming, new commands require moderation before they will appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.

Commands using wget - 248 results
wget --auth-no-challenge --server-response -O- $url 2>&1 | grep "Cookie" | sed "s/^ Set-//g" > cookie.txt; wget --auth-no-challenge --server-response --http-user="user" --http-password="pw" --header="$(cat cookie.txt)" -O- $url
2010-12-01 11:24:35
User: glaudiston
Functions: grep sed wget
1

I have a server with a PHP script requiring basic authentication, like this:

<?php
header('WWW-Authenticate: Basic realm="do auth"');
header('HTTP/1.0 401 Unauthorized');
...
?>

And basic authentication in wget did not work:

wget --auth-no-challenge --http-user="username" --http-password="password" -O- "http://url"

wget --keep-session-cookies --save-cookies=cookies.txt --load-cookies=cookies.txt --http-user="username" --http-password="password" -O- "http://url"

I always received "401 Authorization failed".

The saved cookie was always empty.

With my approach, I first receive the header from the server and save the cookie, then resend the session cookie along with the authentication data.
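
Broken out into separate steps, the same approach looks roughly like this (the URL, user and password are placeholders):

# Sketch of the two-step approach; $url, user and pw are placeholders.
url="http://example.com/protected.php"
# Step 1: capture the Set-Cookie header from the response (written to stderr by --server-response) and rewrite it into a "Cookie: ..." request header.
wget --auth-no-challenge --server-response -O- "$url" 2>&1 | grep "Cookie" | sed "s/^ Set-//g" > cookie.txt
# Step 2: repeat the request, sending the captured session cookie together with the basic-auth credentials.
wget --auth-no-challenge --http-user="user" --http-password="pw" --header="$(cat cookie.txt)" -O- "$url"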

cat video.ogg | nc -l -p 4232 & wget http://users.bshellz.net/~bazza/?nombre=name -O - & sleep 10; mplayer http://users.bshellz.net/~bazza/datos/name.ogg
wget -q -O- --header="Accept-Encoding: gzip" <url> | gunzip > out.html
2010-11-27 22:14:42
User: ashish_0x90
Functions: gunzip wget
2

Get a gzip-compressed web page using wget.

Caution: the command will fail if the website doesn't return gzip-encoded content, though most websites support gzip nowadays.
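
If the server might ignore the Accept-Encoding header, a rough fallback sketch (the file names and $url are just examples) is to check whether the payload really is gzip before decompressing:

# Request gzip, but only gunzip when the response actually is gzip data.
wget -q -O- --header="Accept-Encoding: gzip" "$url" > page.raw
if file page.raw | grep -qi gzip; then gunzip -c page.raw > out.html; else mv page.raw out.html; fi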

sudo apt-add-repository 'deb http://archive.offensive-security.com pwnsauce main microverse macroverse restricted universe multiverse' && wget -q http://archive.offensive-security.com/backtrack.gpg -O- | sudo apt-key add -
2010-11-16 18:23:48
User: kzh
Functions: sudo wget
Tags: Debian
3

Add the BackTrack repositories to your Debian-based GNU/Linux distribution. Thanks to http://it-john.com/home/technology/linux-technology/add-back-track-4-repo-to-ubuntu/

wget `youtube-dl -g 'http://www.youtube.com/watch?v=-S3O9qi2E2U'` -O - | tee -a parachute-ending.flv | mplayer -cache 8192 -
2010-10-28 13:51:59
User: artagnon
Functions: tee wget
0

Watch a video while it's downloading. It's additionally saved to the disk for later viewing.

wget -O xkcd_$(date +%y-%m-%d).png `lynx --dump http://xkcd.com/|grep png`; eog xkcd_$(date +%y-%m-%d).png
diff <(wget -q -O - URL1) <(wget -q -O - URL2)
svn co http://simile.mit.edu/repository/crowbar/trunk && cd ./trunk/xulapp/ && xulrunner --install-app && Xvfb :1 && DISPLAY=:1 xulrunner application.ini 2>/dev/null 1>/dev/null && wget -O- "127.0.0.1:10000/&url=http://www.facebook.com"
2010-10-16 05:12:11
User: argv
Functions: cd wget
-1

Some other options (see the sketch below):

&delay=1000

&mode=links

Much more is possible with piggybank as the scraper.

Works well with your favourite curses or non-curses HTTP clients.
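
For example, appending those options to the same local crowbar request (the query-string layout is assumed to match the command above):

wget -O- "127.0.0.1:10000/&url=http://www.facebook.com&delay=1000&mode=links"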

ip2loc() { wget -qO - www.ip2location.com/$1 | grep "<span id=\"dgLookup__ctl2_lblICountry\">" | sed 's/<[^>]*>//g; s/^[\t]*//; s/&quot;/"/g; s/&lt;/</g; s/&gt;/>/g; s/&amp;/\&/g'; }
2010-10-13 00:19:35
User: bkuri
Functions: grep sed wget
0

Grabs the ip2location site and removes everything but the span tag containing the country value. Place it inside your .bashrc or .bash_aliases file.
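
Usage sketch (the address is just an example):

ip2loc 8.8.8.8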

wget -qO - http://i18n.counter.li.org/ | grep 'users registered' | sed 's/.*\<font size=7\>//g' | tr '\>' ' ' | sed 's/<br.*//g' | tr ' ' '\0'
wget -O - 'https://USERNAMEHERE:PASSWORDHERE@mail.google.com/mail/feed/atom' --no-check-certificate
2010-09-26 14:47:13
User: PLA
Functions: wget
-3

I use this command in my Conky script to display the number of messages in my Gmail inbox and to list the from: and subject: fields.
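
A rough sketch of pulling just the unread-message count out of that feed (the <fullcount> element name is assumed from the feed's format; credentials are placeholders):

wget -q -O - 'https://USERNAMEHERE:PASSWORDHERE@mail.google.com/mail/feed/atom' --no-check-certificate | grep -o '<fullcount>[0-9]*</fullcount>' | grep -o '[0-9]*'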

wget -q $(lynx --dump 'http://geekandpoke.typepad.com/' | grep '\/.a\/' | grep '\-pi' | head -n 1 | awk '{print $2}') -O geekandpoke.jpg
wget -O- -q http://www.nomachine.com/download-package.php?Prod_Id=2067 | sed -n -e 'H;${x;s/\n/ /g;p;}' | sed -e "s/[Hh][Rr][Ee][Ff]=\"/\n/g" | cut -d "\"" -f1 | sort -u | grep deb$
wget -k -r -l 5 http://gentoo-install.com
2010-09-03 01:42:50
User: fecub
Functions: wget
1

Download a website to a depth of 5 levels and browse it offline!

-k -> --convert-links (rewrite links for offline browsing)

-r -> recursive download

-l 5 -> recursion depth of 5

Example:

http://gentoo-install.com

:-)
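
A slightly fuller mirroring sketch: -p also pulls page requisites (images, CSS) and -np keeps wget from ascending to parent directories (the host is just the example from above):

wget -r -l 5 -k -p -np http://gentoo-install.com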

translate() { echo $1: $(wget -q -O - 'http://www.google.de/dictionary?source=translation&q='$1'&langpair=en|de' | grep '^<span class="dct-tt">.*</span>$' | sed 's!<span class="dct-tt">\(.*\)</span>!\1, !'); }
2010-09-02 00:08:06
User: fpunktk
Functions: echo grep sed wget
1

The Google API gives you only one translation, which is sometimes insufficient. This function gives you all translations, so you can choose which one fits best.
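
Usage sketch (the word is just an example):

translate house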

wget http://forums.dropbox.com && wget $(cat index.html|grep "Latest Forum Build"|cut -d"\"" -f2) && wget $(cat topic.php*|grep "Linux x86:"|cut -d"\"" -f2|sort -r|head -n1) && rm -rf ~/.dropbox* && rm index.html *.php* && tar zxvf dropbox-*.tar.gz -C ~/
xkcd() { wget -qO- http://xkcd.com/ | sed -n 's#^<img src="\(http://imgs.[^"]\+\)"\s\+title="\(.\+\?\)"\salt.\+$#eog "\1"\necho '"'\2'#p" | bash ; }
2010-08-25 15:44:31
User: John_W
Functions: bash sed wget
0

This function displays the latest comic from xkcd.com. One of the best things about xkcd is the title text when you hover over the comic, so this function also displays that after you close the comic.

To get a random xkcd comic use the following:

xkcdrandom() { wget -qO- http://dynamic.xkcd.com/comic/random | sed -n 's#^<img src="\(http://imgs.[^"]\+\)"\s\+title="\(.\+\?\)"\salt.\+$#eog "\1"\necho '"'\2'#p" | bash; }

These are just a bit shorter than the ones eigthmillion wrote; however, his version didn't work as expected on my laptop for some reason (I got the title tag first), so these build a command which is executed by bash.

wget -k $URL
2010-08-21 17:39:53
User: minnmass
Functions: wget
Tags: wget
-2

The "-k" flag will tell wget to convert links for local browsing; it works with mirroring (ie with "-r") or single-file downloads.

wget -qO - www.commandlinefu.com/commands/random | grep "<div class=\"command\">" | sed 's/<[^>]*>//g; s/^[ \t]*//; s/&quot;/"/g; s/&lt;/</g; s/&gt;/>/g; s/&amp;/\&/g'
2010-08-12 23:58:24
User: smop
Functions: grep sed wget
Tags: random
1

Retrieves the HTML from a random commandlinefu page, then finds the commands on the page and prints them.

Alternatively, pipe to bash (add "| bash" to the end) to execute the command (very risky).

Edit: had to adjust to properly display the portion that replaces HTML characters (e.g. &quot; -> ").
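
A more cautious sketch than piping straight to bash: show the command first and only run it after confirmation.

cmd=$(wget -qO - www.commandlinefu.com/commands/random | grep "<div class=\"command\">" | sed 's/<[^>]*>//g; s/^[ \t]*//; s/&quot;/"/g; s/&lt;/</g; s/&gt;/>/g; s/&amp;/\&/g'); echo "$cmd"; read -p "Run it? [y/N] " answer && [ "$answer" = "y" ] && eval "$cmd"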

wget -qO- ifconfig.me/ip
2010-08-05 12:04:43
User: glaudiston
Functions: wget
Tags: ip address
3

Alternative to

curl ifconfig.me

for those who don't have curl.

wget --quiet -O - checkip.dyndns.org | sed -e 's/[^:]*: //' -e 's/<.*$//'
2010-08-01 13:36:08
User: berkes
Functions: sed wget
Tags: ip address
-1

Wgets "whatismyip" from checkip.dyndns.org and filters out the actual IP-adress. Usefull when you quickly need to find the outward facting IP-address of your current location.

wget http://www.whatismyip.org --quiet -O - | cat
wget -qO - http://www.google.com | tee >(md5sum) > /tmp/index.html
2010-07-23 06:29:29
User: jianingy
Functions: tee wget
0

Other options:

* replace md5sum with sha1sum for a SHA1 checksum

* replace '>' with '| tar zx' to extract a tarball (see the sketch below)
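
A sketch of the tarball variant described above (the URL and checksum file name are placeholders):

wget -qO - http://example.com/archive.tar.gz | tee >(md5sum > archive.md5) | tar zx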

wget --load-cookies <cookie-file> -c -i <list-of-urls>
wget randomfunfacts.com -O - 2>/dev/null|grep \<strong\>|sed "s;^.*<i>\(.*\)</i>.*$;\1;"|cowsay -f tux