
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are separate Twitter accounts for commands that get a minimum of 3 or 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).



News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try to put an end to the spamming, new commands now require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.
Commands using wget - 237 results
wget -qO - snubster.com|sed -n '65p'|awk 'gsub(/<span><br>.*/,"")&&1'|perl -p -e 's:myScroller1.addItem\("<span class=atHeaderOrange>::g;s:</span> <span class=snubFontSmall>::g;s:&quot;:":g;s:^:\n:g;s:$:\n:'
2009-02-18 15:05:13
User: sil
Functions: wget
Votes: -2

I've got this posted in one of my .bash_profiles for humor whenever I log in.

wget --random-wait -r -p -e robots=off -U mozilla http://www.example.com
2009-02-17 22:53:07
User: starchox
Functions: wget
Votes: 150

The -p parameter tells wget to download all page requisites, including images.

-e robots=off tells wget not to obey the robots.txt file.

-U mozilla sets the User-Agent header, so the request looks like it comes from a browser rather than from wget.

--random-wait lets wget choose a random number of seconds to wait between requests, which helps avoid getting blacklisted.

Other useful wget parameters (all combined in the sketch below):

--limit-rate=20k limits the download rate to 20 KB/s.

-b runs wget in the background, so it continues after you log out.

-o $HOME/wget_log.txt logs the output to a file.
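
As a rough sketch, the options above can be combined into a single polite mirroring command (www.example.com is a placeholder host):

wget -b -o $HOME/wget_log.txt --limit-rate=20k --random-wait -r -p -e robots=off -U mozilla http://www.example.com

With -b the command returns immediately and progress goes to the log file, so check $HOME/wget_log.txt to follow the download.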

wget --recursive --page-requisites --convert-links www.moyagraphix.co.za
2009-02-16 10:57:44
User: RonaldConco
Functions: wget
Votes: 16

Very nice command when you want to download a site locally to your machine, including images, CSS and JavaScript.
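
If the site is large, it may be worth bounding the recursion; a sketch of the same command with the depth capped (assuming two levels is enough):

wget --recursive --level=2 --page-requisites --convert-links www.moyagraphix.co.za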

wget http://search.twitter.com/trends.json -O - --quiet | ruby -rubygems -e 'require "json";require "yaml"; puts YAML.dump(JSON.parse($stdin.gets))'
wget -p --convert-links http://www.foo.com
2009-02-13 19:02:08
User: damniel
Functions: wget
Votes: 8

The example will create a directory called "www.foo.com" containing the downloaded page and its requisites, with links converted to point at the local copies.

Caveats: @import rules in CSS files will not be converted.
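
A quick way to find the stylesheets that still need fixing by hand is to search the mirror for @import rules (a sketch; assumes the mirror was saved under www.foo.com):

grep -r --include="*.css" "@import" www.foo.com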

wget -qO - "http://www.tarball.com/tarball.gz" | tar zxvf -
wget -qO - http://myip.dk/ | egrep -m1 -o '[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}'
seq -f"ftp://ftp.vim.org/pub/vim/patches/7.1/7.1.%03g" 176 240 | xargs -I {} wget -c {};
2009-02-06 03:19:06
User: liupeng
Functions: seq wget xargs
Votes: 3

Seq allows you to define printf-like formatting, specified with -f. %03g tells seq to pad each number with zeros to three digits, and the range runs from 176 to 240.
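
A quick illustration of the formatting on a smaller range (a sketch of the same idea):

seq -f "patch-7.1.%03g" 1 3

This prints patch-7.1.001, patch-7.1.002 and patch-7.1.003, one per line.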

echo -n "search> ";read QUERY && wget -O - `wget -O - -U "Mozilla/5.0" "http://images.google.com/images?q=${QUERY}" 2>/dev/null |sed -e 's/","http/\n","http/g' |awk -F \" '{print $3}' |grep -i http: |head -1` > "$QUERY"
2009-02-05 19:50:53
User: wwest4
Functions: echo wget
Votes: 2

Prompts for a search term and then pulls down the first result from Google Images.

wget -O - http://www.commandlinefu.com/commands/browse/rss 2>/dev/null | awk '/\s*<title/ {z=match($0, /CDATA\[([^\]]*)\]/, b);print b[1]} /\s*<description/ {c=match($0, /code>(.*)<\/code>/, d);print d[1]"\n"} '
URL=www.example.com && wget -rq --spider --force-html "http://$URL" && find $URL -type d > url-list.txt && rm -rf $URL
2009-01-27 17:59:08
User: root
Functions: find rm wget
Votes: 1

This spiders the given site without downloading the HTML content. The resulting directory structure is then parsed to output a list of the URLs to url-list.txt. Note that this can take a long time to run, and you may want to throttle the spidering so as to play nicely.
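
As a sketch, wget's own --wait option provides simple throttling (one second between requests here; adjust to taste):

URL=www.example.com && wget -rq --wait=1 --spider --force-html "http://$URL" && find $URL -type d > url-list.txt && rm -rf $URL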

wget -r -l1 --no-parent -nH -nd -P/tmp -A".gif,.jpg" http://example.com/images
2009-01-27 17:31:22
User: root
Functions: wget
Votes: 33

This recursively downloads all images from a given website to your /tmp directory. The -nH and -nd switches stop wget from recreating the remote hostname and directory hierarchy locally, so the files land directly in /tmp.
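
To pick up other image formats as well, the accept list can simply be extended; a sketch assuming PNGs are wanted too:

wget -r -l1 --no-parent -nH -nd -P/tmp -A".gif,.jpg,.png" http://example.com/images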