Commands using wget (286)

  • Like the original command, but the -f flag allows this one to succeed even if the website returns uncompressed data. From gzip(1) on the -f flag: "If the input data is not in a format recognized by gzip, and if the option --stdout is also given, copy the input data without change to the standard output: let zcat behave as cat."


    1
    wget -q -O- --header="Accept-Encoding: gzip" <url> | gzip -cdf > out.html
    tempusername · 2014-11-29 20:42:21 8
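
    A quick way to see that -f fallback in action (a minimal sketch, not part of the original entry):

        printf 'hello\n' | gzip -c | gzip -cdf   # compressed input comes out decompressed
        printf 'hello\n' | gzip -cdf             # plain input is copied through unchanged
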
  • Let's give Flatcap credit for this elegant solution, instead of leaving it hidden as a comment. Tested on RHEL6 and it works. Nice and clean.


    1
    curl -s https://access.redhat.com/documentation/en-US/Red_Hat_Enterprise_Linux/ | grep -o '[^"]*Linux/7/pdf[^"]*' | xargs -I{} wget https://access.redhat.com{}
    SuperFly · 2015-05-11 11:57:20 11
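
    The same scrape-hrefs-then-fetch pattern, generalized; $url and the .pdf filter are placeholders to adapt, not part of the original command:

        # assumes the hrefs are absolute URLs; prepend a base URL in xargs otherwise
        curl -s "$url" | grep -o 'href="[^"]*\.pdf"' | cut -d '"' -f 2 | xargs -I{} wget "{}"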

  • 1
    wget -O - -o /dev/null -q --user=$user --password=$pass "http://$ip/ADV_home2.htm" | awk -r '/Internet Port/, /Domain/ {if ($0 ~ /([[:digit:]]+\.){3}[[:digit:]]+/ && ($3 !~ /^>(0|255)/)) {match($3, /([[:digit:]]+\.){3}[[:digit:]]+/, ar); print ar[0]; }}'
    phranz · 2015-07-09 22:55:57 9
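
    If gawk's match(..., ar) extension is unavailable, a hedged simpler take on the same extraction using grep -Eo (same placeholder credentials and router page):

        wget -qO- -o /dev/null --user=$user --password=$pass "http://$ip/ADV_home2.htm" \
          | grep -Eo '([0-9]{1,3}\.){3}[0-9]{1,3}' | grep -vE '^(0|255)\.' | head -n 1
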
  • This will download and install the latest version of the OpenStore on the Ubuntu phone. This store includes unconfined applications such as TweakGeek and the Ubuntu Touch Tweak Tool. You can see the install instructions here: https://open.uappexplorer.com/docs#install


    1
    wget https://open.uappexplorer.com/api/download/openstore.openstore-team/openstore.*_*_armhf.click && pkcon install-local --allow-untrusted openstore.*_*_armhf.click
    bugmenot · 2016-02-04 14:24:46 16

  • 1
    wget --quiet -O - 'https://raw.githubusercontent.com/rahult/books/master/well_grounded_rubyist/threads/rps.rb' | ruby -c
    swarzynski · 2016-02-18 11:14:55 14
  • Neither of the others worked for me. This does.


    1
    curl http://url/rss | grep -o '<enclosure url="[^"]*' | grep -o '[^"]*$' | xargs wget -c
    dakira · 2016-05-29 12:07:21 21
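
    A sketch of the same job done with xmlstarlet (used elsewhere on this page) instead of grepping raw XML, which also survives attribute reordering:

        curl -s http://url/rss | xmlstarlet sel -t -m '//enclosure' -v '@url' -n | xargs wget -c
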
  • Download the source tarball of the latest release of the GitLab Docker container (sameersbn/docker-gitlab)


    1
    wget -qO- 'https://github.com'$(curl -s 'https://github.com'$(curl -s https://github.com/sameersbn/docker-gitlab/releases | grep -m 1 -o '<a.*[0-9\.]</a>' | cut -d '"' -f 2) | grep -o '<a.* rel="nofollow">' | grep 'tar.gz' | cut -d '"' -f 2)
    BigZ · 2016-08-23 21:36:57 14
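
    A hedged alternative that asks the GitHub releases API for the latest release instead of scraping the HTML (output filename is an example):

        curl -s https://api.github.com/repos/sameersbn/docker-gitlab/releases/latest \
          | grep -o '"tarball_url": *"[^"]*"' | cut -d '"' -f 4 | xargs wget -qO gitlab-latest.tar.gz
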

  • 1
    cat url.list | parallel -j 8 wget -O {#}.html {}
    arthurwayne · 2018-12-22 08:14:06 33
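
    Here {#} is GNU parallel's job sequence number and {} the input line, so each URL in url.list lands in 1.html, 2.html, and so on. A rough equivalent without parallel, using xargs -P (no sequence numbers, so filenames come from the URLs themselves):

        xargs -P 8 -n 1 wget -q < url.list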

  • 0
    wget -H -r -nv --level=1 -k -p -erobots=off -np -N --exclude-domains=del.icio.us,doubleclick.net --exclude-directories=
    bbelt16ag · 2009-05-18 18:05:19 4
  • Substitute the URL with your private/public XML URL from the calendar sharing settings, substitute the dates (YYYY-mm-dd), and adjust the perl parsing part for your needs.


    0
    wget -q -O - 'URL/full?orderby=starttime&singleevents=true&start-min=2009-06-01&start-max=2009-07-31' | perl -lane '@m=$_=~m/<title type=.text.>(.+?)</g;@a=$_=~m/startTime=.(2009.+?)T/g;shift @m;for ($i=0;$i<@m;$i++){ print $m[$i].",".$a[$i];}';
    unixmonkey4704 · 2009-07-23 14:48:54 4
  • This lengthy, cryptic line prints the latest top 10 commandlinefu.com posts without their summaries. To print their respective summaries as well, use the following (even bigger) command line: wget -qO - http://www.commandlinefu.com/feed/tenup | xmlstarlet sel -T -t -o '<doc>' -n -t -m rss/channel/item -o '<item>' -n -o '<title>' -v title -o '</title>' -n -o '<description>' -v description -o '</description>' -n -o '</item>' -n -t -o '</doc>' | xmlstarlet sel -T -t -m doc/item -v description/code -n -v title -n -n It is recommended to put this line in a shell script so it can be run easily, as I do myself. You could also use the following URL to browse the top 3 commands: wget -qO - http://www.commandlinefu.com/feed/threeup | xmlstarlet ... ...or all the others: wget -qO - http://feeds2.feedburner.com/Command-line-fu | xmlstarlet ... PS: You need to install "xmlstarlet" to run it. It is found in the Debian APT repositories (apt-get install xmlstarlet) or at http://xmlstar.sourceforge.net/.


    0
    wget -qO - http://www.commandlinefu.com/feed/tenup | xmlstarlet sel -T -t -o '<x>' -n -t -m rss/channel/item -o '<y>' -n -v description -o '</y>' -n -t -o '</x>' | xmlstarlet sel -T -t -m x/y -v code -n
    fsilveira · 2009-08-14 02:44:00 3
  • Can be used to help perform some SEO optimizations.


    0
    wget -q -O- PAGE_URL | grep -o 'WORD_OR_STRING' | wc -w
    evalinux · 2009-08-17 13:08:46 4
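
    Note that wc -w miscounts when the matched string contains spaces; counting matched lines is safer, since grep -o prints one match per line:

        wget -q -O- PAGE_URL | grep -o 'WORD_OR_STRING' | wc -l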
  • This will download all the phracks! Enjoy!


    0
    for ((i=1; i<67; i++)) do wget http://www.phrack.org/archives/tgz/phrack${i}.tar.gz -q; done
    Abiden · 2009-08-20 23:27:01 6
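
    A shorter equivalent with bash brace expansion, since wget accepts multiple URLs (issues 1 through 66, as in the original loop):

        wget -q http://www.phrack.org/archives/tgz/phrack{1..66}.tar.gz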
  • I don't know if the --spider option works to execute a script, but it might be worth trying. Note that the Drupal project uses the following in a cron job: wget -O - -q http://localhost/drupal/cron.php. The output is sent to standard output so it can be logged by cron.


    0
    wget -q --spider http://server/cgi/script
    ashawley · 2009-09-11 05:33:48 3
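
    A sketch of the Drupal-style crontab entry the note above refers to (the hourly schedule and path are examples):

        # m h dom mon dow  command -- fetch cron.php hourly and let cron capture the output
        0 * * * * wget -O - -q http://localhost/drupal/cron.php
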

  • 0
    wget -qO - http://www.sputnick-area.net/ip;echo
    cfajohnson · 2009-11-20 23:10:31 4
  • ABBA would be more entertaining if they sang this.


    0
    wget -O - -q http://www.azlyrics.com/lyrics/abba/takeachanceonme.html | sed -e 's/[cC]hance/dump/g' > ~/tdom.htm && firefox ~/tdom.htm
    tighe · 2009-12-04 22:56:00 5
  • You only need to install the ImageMagick package. Displays an xkcd comic with its title and saves it in the /tmp directory. If you prefer to view the newest xkcd, use this command: wget -q http://xkcd.com/ -O-| sed -n '/<img src="http:\/\/imgs.xkcd.com\/comics/{s/.*\(http:.*\)" t.*/\1/;p}' | awk '{system ("wget -q " $1 " -O- | display -title $(basename " $1") -write /tmp/$(basename " $1")");}'


    0
    wget -q http://dynamic.xkcd.com/comic/random/ -O-| sed -n '/<img src="http:\/\/imgs.xkcd.com\/comics/{s/.*\(http:.*\)" t.*/\1/;p}' | awk '{system ("wget -q " $1 " -O- | display -title $(basename " $1") -write /tmp/$(basename " $1")");}'
    laugg · 2009-12-09 13:41:25 7
  • This is a minimalistic version of the ubiquitous Google definition screen scraper. This version was designed not only to run fast, but to work using BusyBox. BusyBox is a collection of basic Unix tools that have been compiled into a single binary to save space on tiny installations of Unix. For example, although my phone doesn't have perl or the GNU utilities, it does have BusyBox's stripped-down versions of wget, tr, and sed. It turns out that those tools suffice for many tasks. Known bugs: this script does not handle HTML entities at all. I don't think there's an easy way to do that within BusyBox, but I'd love to see it if someone could do it. Also, this script can only define a single word, not phrases. (Well, you could if you typed in %20, but that'd be gross.) Lastly, this script does not show the URL where definitions were found. Given the randomness of the Net, that last bit of information is often key.


    0
    wget -q -U busybox -O- "http://www.google.com/search?ie=UTF8&q=define%3A$1" | tr '<' '\n' | sed -n 's/^li>\(.*\)/\1\n/p'
    hackerb9 · 2010-02-01 13:01:47 9
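
    The core trick, isolated: splitting the HTML on '<' puts every tag at the start of its own line, where plain sed can anchor on it; this toy example runs under BusyBox too:

        echo '<ul><li>first</li><li>second</li></ul>' | tr '<' '\n' | sed -n 's/^li>\(.*\)/\1/p'
        # prints: first
        #         second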
  • This is a simple command with which you can run complex shell scripts on remote machines, e.g. via ssh, for instance if you had to run the same process on several hundred hosts. There is no security, so you have to trust the server that is sourcing this script.


    0
    wget -qO - sometrusted.web.site/tmp/somecommand | sh
    UnixSage · 2010-06-01 01:25:21 3
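
    A slightly safer sketch: fetch to a file, check it against a checksum you pinned in advance, and only then run it (EXPECTED_SHA256 is a placeholder):

        wget -qO /tmp/somecommand sometrusted.web.site/tmp/somecommand
        echo 'EXPECTED_SHA256  /tmp/somecommand' | sha256sum -c - && sh /tmp/somecommand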
  • Other options: replace md5sum with sha1sum for a SHA1 checksum, or replace '>' with '| tar zx' to extract a tarball instead of saving it.


    0
    wget -qO - http://www.google.com | tee >(md5sum) > /tmp/index.html
    jianingy · 2010-07-23 06:29:29 4
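
    The tarball variant from the note above, written out; md5sum is redirected to stderr so its checksum line doesn't get mixed into the stream tar reads (the URL is a placeholder):

        wget -qO - http://example.com/archive.tar.gz | tee >(md5sum >&2) | tar zx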
  • This function displays the latest comic from xkcd.com. One of the best things about xkcd is the title text when you hover over the comic, so this function also displays that after you close the comic. To get a random xkcd comic, use the following: xkcdrandom() { wget -qO- http://dynamic.xkcd.com/comic/random | sed -n 's#^<img src="\(http://imgs.[^"]\+\)"\s\+title="\(.\+\?\)"\salt.\+$#eog "\1"\necho '"'\2'#p" | bash; } These are just a bit shorter than the ones eigthmillion wrote; however, his version didn't work as expected on my laptop for some reason (I got the title tag first), so these build a command which is executed by bash.


    0
    xkcd() { wget -qO- http://xkcd.com/ | sed -n 's#^<img src="\(http://imgs.[^"]\+\)"\s\+title="\(.\+\?\)"\salt.\+$#eog "\1"\necho '"'\2'#p" | bash ; }
    John_W · 2010-08-25 15:44:31 6

  • 0
    wget -q $(lynx --dump 'http://geekandpoke.typepad.com/' | grep '\/.a\/' | grep '\-pi' | head -n 1 | awk '{print $2}') -O geekandpoke.jpg
    tersmitten · 2010-09-07 12:15:36 3
  • Grabs the ip2location site and removes everything but the span tag containing the country value. Place it inside your .bashrc or .bash_aliases file.


    0
    ip2loc() { wget -qO - www.ip2location.com/$1 | grep "<span id=\"dgLookup__ctl2_lblICountry\">" | sed 's/<[^>]*>//g; s/^[\t]*//; s/&quot;/"/g; s/&lt;/</g; s/&gt;/>/g; s/&amp;/\&/g'; }
    bkuri · 2010-10-13 00:19:35 4
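
    Usage, once the function is in your .bashrc or .bash_aliases and the file has been re-sourced:

        ip2loc 8.8.8.8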
  • Watch a video while it's downloading. It's additionally saved to the disk for later viewing.


    0
    wget `youtube-dl -g 'http://www.youtube.com/watch?v=-S3O9qi2E2U'` -O - | tee -a parachute-ending.flv | mplayer -cache 8192 -
    artagnon · 2010-10-28 13:51:59 3

  • 0
    cd /usr/src ; wget http://www.rarlab.com/rar/unrarsrc-4.0.2.tar.gz ; tar xvfz unrarsrc-4.0.2.tar.gz ; cd unrar ; ln -s makefile.unix Makefile ; make clean ; make ; make install
    yababay · 2010-12-09 10:35:28 6
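
    The same recipe with the version factored into a shell variable, so a future bump only touches one place (4.0.2 is simply the version from the original line):

        v=4.0.2; cd /usr/src && wget "http://www.rarlab.com/rar/unrarsrc-$v.tar.gz" \
          && tar xzf "unrarsrc-$v.tar.gz" && cd unrar && ln -sf makefile.unix Makefile \
          && make clean && make && make install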
