What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…).

May 19, 2015 - A Look At The New Commandlinefu
I've put together a short writeup on what kind of newness you can expect from the next iteration of clfu. Check it out here.
March 2, 2015 - New Management
I'm Jon, I'll be maintaining and improving clfu. Thanks to David for building such a great resource!

Commands using wget - 256 results
wget -k $URL
2010-08-21 17:39:53
User: minnmass
Functions: wget
Tags: wget

The "-k" flag will tell wget to convert links for local browsing; it works with mirroring (ie with "-r") or single-file downloads.

wget -qO - www.commandlinefu.com/commands/random | grep "<div class=\"command\">" | sed 's/<[^>]*>//g; s/^[ \t]*//; s/&quot;/"/g; s/&lt;/</g; s/&gt;/>/g; s/&amp;/\&/g'
2010-08-12 23:58:24
User: smop
Functions: grep sed wget
Tags: random

retrieves the html from a random command line fu page, then finds commands on the page and prints them

alternatively, pipe to bash (add "| bash" to the end) to execute the command (very risky)

edit: had to adjust to properly display the portion that replaces HTML characters (e.g. &quot; -> ")
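
For reference, the risky pipe-to-bash variant mentioned above would be:

wget -qO - www.commandlinefu.com/commands/random | grep "<div class=\"command\">" | sed 's/<[^>]*>//g; s/^[ \t]*//; s/&quot;/"/g; s/&lt;/</g; s/&gt;/>/g; s/&amp;/\&/g' | bash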

wget -qO- ifconfig.me/ip
2010-08-05 12:04:43
User: glaudiston
Functions: wget
Tags: ip address

alternative to

curl ifconfig.me

for those that don't have curl

wget --quiet -O - checkip.dyndns.org | sed -e 's/[^:]*: //' -e 's/<.*$//'
2010-08-01 13:36:08
User: berkes
Functions: sed wget
Tags: ip address

Wgets "whatismyip" from checkip.dyndns.org and filters out the actual IP-adress. Usefull when you quickly need to find the outward facting IP-address of your current location.

wget http://www.whatismyip.org --quiet -O - | cat
wget -qO - http://www.google.com | tee >(md5sum) > /tmp/index.html
2010-07-23 06:29:29
User: jianingy
Functions: tee wget

other options:

* replace md5sum with sha1sum for SHA1 checksum

* replace '>' with '| tar zx' for extracting tarball
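
Combining both suggestions, a sketch that prints a SHA1 checksum while extracting a tarball (the URL is a placeholder):

wget -qO - http://example.com/archive.tar.gz | tee >(sha1sum) | tar zx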

wget --load-cookies <cookie-file> -c -i <list-of-urls>
wget randomfunfacts.com -O - 2>/dev/null|grep \<strong\>|sed "s;^.*<i>\(.*\)</i>.*$;\1;"|cowsay -f tux
wget -O - http://www.commandlinefu.com/commands/browse/rss 2>/dev/null | awk '/\s*<title/ {z=match($0, /CDATA\[([^\]]*)\]/, b);print b[1]} /\s*<description/ {c=match($0, /code>(.*)<\/code>/, d);print d[1]} ' | grep -v "^$"
2010-06-29 16:22:03
User: nikunj
Functions: awk grep wget
Tags: awk grep meta

A quick variation on the latest-commands list with the blank lines skipped. This is faster to read.

wget -qO- icanhazip.com
2010-06-24 03:49:14
Functions: wget

Curl is not installed by default on many common distros anymore. wget always is :)

wget -qO - "http://ajax.googleapis.com/ajax/services/language/translate?langpair=|zh-cn&v=1.0&q=`xsel`" |cut -d \" -f 6
wget -qO - "http://ajax.googleapis.com/ajax/services/language/translate?langpair=|en&v=1.0&q=`xsel`" |cut -d \" -f 6
2010-06-04 17:20:17
User: fain182
Functions: cut wget

Uses the Google API to translate. You can change the languages by modifying the "langpair=|en" parameter; the format is input language|output language, and leaving the input language empty lets Google detect it.
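
For example, a sketch that translates a fixed word from English to French instead of reading the X selection via xsel:

wget -qO - "http://ajax.googleapis.com/ajax/services/language/translate?langpair=en|fr&v=1.0&q=hello" | cut -d \" -f 6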

wget -qO - sometrusted.web.site/tmp/somecommand | sh
2010-06-01 01:25:21
User: UnixSage
Functions: wget

A simple way to run complex shell scripts on remote machines, for instance via ssh when you have to run the same process on several hundred hosts. There is no security here, so you have to trust the server that is serving this script.
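
A sketch of the several-hosts case (the hostnames are placeholders):

for host in host1 host2 host3; do ssh $host 'wget -qO - sometrusted.web.site/tmp/somecommand | sh'; done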

cd /some/empty/folder/website_diffs/sitename && wget -N http://domain.com/ 2>&1 |grep -q "o newer" || printf "Sites web page appears to have updated.\n\nSuggest you check it out.\n\n"|mail -s "Sites page updated." david@email.com
2010-05-09 07:28:42
User: DaveQB
Functions: cd grep mail printf wget

A cronjob command line to email someone when a website's homepage is updated.
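
As a crontab entry, it might look like this (the daily 8am schedule is an assumption):

0 8 * * * cd /some/empty/folder/website_diffs/sitename && wget -N http://domain.com/ 2>&1 | grep -q "o newer" || printf "Sites web page appears to have updated.\n\nSuggest you check it out.\n\n" | mail -s "Sites page updated." david@email.com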

translate() { lng1="$1";lng2="$2";shift;shift; wget -qO- "http://ajax.googleapis.com/ajax/services/language/translate?v=1.0&q=${@// /+}&langpair=$lng1|$lng2" | sed 's/.*"translatedText":"\([^"]*\)".*}/\1\n/'; }
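
A usage sketch for the translate function above, translating a phrase from English to Spanish:

translate en es "hello world"
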
kaffeine $(wget -qO- "http://questions-pour-un-champion.france3.fr/emission/index-fr.php?page=video&type_video=quotidiennes&video_courante=$(date +%Y%m%d)" | grep -o "mms.*wmv" | uniq)
2010-04-29 17:59:06
User: fbone
Functions: grep wget

kaffeine could be replaced by any player able to read an mms stream.
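
For instance, a sketch using mplayer (assuming it was built with mms support):

mplayer $(wget -qO- "http://questions-pour-un-champion.france3.fr/emission/index-fr.php?page=video&type_video=quotidiennes&video_courante=$(date +%Y%m%d)" | grep -o "mms.*wmv" | uniq)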

mv ubuntu-10.04-rc-desktop-amd64.iso ubuntu-10.04-desktop-amd64.iso; i=http://releases.ubuntu.com/10.04/ubuntu-10.04-desktop-amd64.iso.zsync; while true; do if wget $i; then zsync $i; date; break; else sleep 30; fi; done
2010-04-29 15:49:43
Functions: mv sleep wget

Need to have the RC ISO pre-downloaded before running the command.

while true; do if wget http://releases.ubuntu.com/10.04/ubuntu-10.04-desktop-i386.iso.torrent; then ktorrent --silent ubuntu-10.04-desktop-i386.iso.torrent ; date; break; else sleep 5m; fi; done
2010-04-29 13:22:54
User: ppaschka
Functions: sleep wget

Tested with 9.10 release. Choose whatever torrent client you prefer.
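
For instance, a sketch of the same loop with transmission-cli in place of ktorrent (assuming transmission is installed):

while true; do if wget http://releases.ubuntu.com/10.04/ubuntu-10.04-desktop-i386.iso.torrent; then transmission-cli ubuntu-10.04-desktop-i386.iso.torrent; date; break; else sleep 5m; fi; done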

wget randomfunfacts.com -O - 2>/dev/null | grep \<strong\> | sed "s;^.*<i>\(.*\)</i>.*$;\1;" | while read FUNFACT; do notify-send -t $((1000+300*`echo -n $FUNFACT | wc -w`)) -i gtk-dialog-info "RandomFunFact" "$FUNFACT"; done
2010-04-02 09:43:32
User: mtron
Functions: grep read sed wc wget

extension to tali713's random fact generator. It takes the output & sends it to notify-osd. Display time is proportional to the length of the fact.

wget randomfunfacts.com -O - 2>/dev/null | grep \<strong\> | sed "s;^.*<i>\(.*\)</i>.*$;\1;"
2010-03-30 23:49:30
User: tali713
Functions: grep sed wget

Without knowing how the site will be designed in the future, this may eventually stop working, but it still serves as a simple, straightforward starting point.

This uses the observation that the only item marked as strong on the page is the single logical line that includes the italicized fact.

If future revisions of the page fail, or fail intermittently, one may simply alter the above to read:

wget randomfunfacts.com -O - 2>/dev/null | tee lastfact | grep \<strong\> | sed "s;^.*<i>\(.*\)</i>.*$;\1;"

The file lastfact can then be examined whenever the command fails.

trickle -d 60 wget http://very.big/file
2010-03-29 06:55:30
Functions: wget

Trickle is a voluntary, cooperative bandwidth shaper. It works entirely in userland and is very easy to use.

The simplest application is to limit the bandwidth usage of programs.
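
For example, to cap upload bandwidth as well as download (rates are in KB/s; the values are placeholders):

trickle -u 10 -d 60 wget http://very.big/file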

wget -r -np -nd -A.pdf --user *** --password *** http://www.domain.tld/courses/***/download/
if [ "$(ping -q -c1 google.com)" ];then wget -mnd -q http://www.google.com/intl/en_ALL/images/logo.gif ;fi
2010-03-23 04:15:03
User: alf
Functions: wget

Bash script to test if a server is up; you can use this before wget'ing a file to make sure a blank one isn't downloaded.
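
A variant sketch that tests ping's exit status instead of its output, since ping can print statistics on stdout even when the host does not reply:

if ping -q -c1 google.com >/dev/null 2>&1; then wget -mnd -q http://www.google.com/intl/en_ALL/images/logo.gif; fi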

currency_convert() { wget -qO- "http://www.google.com/finance/converter?a=$1&from=$2&to=$3&hl=es" | sed '/res/!d;s/<[^>]*>//g'; }
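
A usage sketch for the currency_convert function above, converting 100 US dollars to euros:

currency_convert 100 USD EUR
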
for i in be bg cz de es fi fr hu it lv lu at pl pt ro sk si ; do echo -n "$i " ; wget -q -O - http://www.expansys.$i/d.aspx?i=196165 | grep price | sed "s/.*<p id='price'><strong>&euro; \([0-9]*[,.][0-9]*\).*/\1/g"; done
2010-03-18 15:13:20
User: betsubetsu
Functions: at bg echo grep sed wget

You think Expansys in all these countries will sell the HTC Desire for the same price? Well, you'll be surprised. Most of them will be sold at 499.99 EUR but the cheapest can be found in Germany and the most expensive, in Belgium.