Commands tagged wget (102)

  • This will download and install the latest version of the OpenStore on an Ubuntu phone. The store includes unconfined applications such as TweakGeek and the Ubuntu Touch Tweak Tool. Installation instructions: https://open.uappexplorer.com/docs#install


    1
    wget https://open.uappexplorer.com/api/download/openstore.openstore-team/openstore.*_*_armhf.click && pkcon install-local --allow-untrusted openstore.*_*_armhf.click
    bugmenot · 2016-02-04 14:24:46 16
  • Neither of the other commands worked for me. This does: it pulls the enclosure URLs out of the RSS feed and hands them to wget.


    1
    curl http://url/rss | grep -o '<enclosure url="[^"]*' | grep -o '[^"]*$' | xargs wget -c
    dakira · 2016-05-29 12:07:21 21
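
    Unrolled, the pipeline above does this (the feed URL is a placeholder; substitute your own):

        curl http://url/rss |                   # fetch the feed
          grep -o '<enclosure url="[^"]*' |     # isolate each enclosure tag up to its URL
          grep -o '[^"]*$' |                    # keep only the URL itself
          xargs wget -c                         # download, resuming partial files
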
  • Download the latest release of the sameersbn/docker-gitlab project from GitHub


    1
    wget -qO- 'https://github.com'$(curl -s 'https://github.com'$(curl -s https://github.com/sameersbn/docker-gitlab/releases | grep -m 1 -o '<a.*[0-9\.]</a>' | cut -d '"' -f 2) | grep -o '<a.* rel="nofollow">' | grep 'tar.gz' | cut -d '"' -f 2)
    BigZ · 2016-08-23 21:36:57 14
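
    Reading it inside-out as three stages may help; a sketch, assuming GitHub's release-page markup still matches what the grep patterns expect:

        # 1. find the link to the newest release on the releases page
        rel=$(curl -s https://github.com/sameersbn/docker-gitlab/releases | grep -m 1 -o '<a.*[0-9\.]</a>' | cut -d '"' -f 2)
        # 2. open that release and extract the tar.gz download link
        tgz=$(curl -s "https://github.com$rel" | grep -o '<a.* rel="nofollow">' | grep 'tar.gz' | cut -d '"' -f 2)
        # 3. fetch the tarball quietly to stdout
        wget -qO- "https://github.com$tgz"
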

  • Download every URL in url.list with eight parallel wget jobs, saving each page under its job number.


    1
    cat url.list | parallel -j 8 wget -O {#}.html {}
    arthurwayne · 2018-12-22 08:14:06 33
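
    In GNU parallel, {} expands to the input line (here the URL) and {#} to the job sequence number, so pages land in 1.html, 2.html, and so on. The cat is unnecessary; parallel can read the file itself:

        parallel -j 8 wget -O {#}.html {} < url.list
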
  • Directly download all mp3 files of the desired podcast


    1
    curl http://radiofrance-podcast.net/podcast09/rss_14726.xml | grep -Eo "(http|https)://[a-zA-Z0-9./?=_%:-]*mp3" | sort -u | xargs wget
    pascalvaucheret · 2021-08-09 13:40:26 174
  • Substitute the URL with your private/public XML URL from the calendar sharing settings, substitute the dates (YYYY-mm-dd), and adjust the Perl parsing for your needs.


    0
    wget -q -O - 'URL/full?orderby=starttime&singleevents=true&start-min=2009-06-01&start-max=2009-07-31' | perl -lane '@m=$_=~m/<title type=.text.>(.+?)</g;@a=$_=~m/startTime=.(2009.+?)T/g;shift @m;for ($i=0;$i<@m;$i++){ print $m[$i].",".$a[$i];}';
    unixmonkey4704 · 2009-07-23 14:48:54 4
  • I don't know if the --spider option works to execute a script, but it might be worth trying. Note that the Drupal project uses the following in a cron job: wget -O - -q http://localhost/drupal/cron.php. The output is sent to standard output so it can be logged by cron.


    0
    wget -q --spider http://server/cgi/script
    ashawley · 2009-09-11 05:33:48 3
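
    For the cron use case described above, a sketch of the crontab entry (the Drupal path is the one quoted; adjust for your site):

        # run the site's cron hook every hour; wget's output goes to cron's mail/log
        0 * * * * wget -O - -q http://localhost/drupal/cron.php
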
  • Mostly for Norwegians, but easily adaptable to other TLDs. Very handy if you are brainstorming for a new domain name; it will only display the available ones. You can usually do this better with dig, but if you don't have dig, or the TLD only has an online service to check with, this will be useful.


    0
    check_dns_no() { for i in "$@" ; do if wget -O - -q "http://www.norid.no/domenenavnbaser/whois/?query=$i.no" | grep -q "no match" ; then echo "$i.no available" ; fi ; sleep 1 ; done ; }
    xeor · 2009-09-30 21:17:33 8
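
    The same function unrolled with comments, behaviour unchanged:

        check_dns_no() {
          for i in "$@"; do
            # Norid's whois page contains "no match" when the .no domain is unregistered
            if wget -O - -q "http://www.norid.no/domenenavnbaser/whois/?query=$i.no" | grep -q "no match"; then
              echo "$i.no available"
            fi
            sleep 1   # be polite to the whois service
          done
        }
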
  • Only the ImageMagick package needs to be installed. Displays an xkcd comic with its title and saves it in the /tmp directory. If you prefer to view the newest xkcd, use this command: wget -q http://xkcd.com/ -O-| sed -n '/<img src="http:\/\/imgs.xkcd.com\/comics/{s/.*\(http:.*\)" t.*/\1/;p}' | awk '{system ("wget -q " $1 " -O- | display -title $(basename " $1") -write /tmp/$(basename " $1")");}'


    0
    wget -q http://dynamic.xkcd.com/comic/random/ -O-| sed -n '/<img src="http:\/\/imgs.xkcd.com\/comics/{s/.*\(http:.*\)" t.*/\1/;p}' | awk '{system ("wget -q " $1 " -O- | display -title $(basename " $1") -write /tmp/$(basename " $1")");}'
    laugg · 2009-12-09 13:41:25 7
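
    Split into two steps, a sketch of what the one-liner does (display(1) is ImageMagick's viewer):

        # pull the image URL out of a random comic page
        img=$(wget -q http://dynamic.xkcd.com/comic/random/ -O- | sed -n '/<img src="http:\/\/imgs.xkcd.com\/comics/{s/.*\(http:.*\)" t.*/\1/;p}')
        # view it titled by its filename and keep a copy under /tmp
        wget -q "$img" -O- | display -title "$(basename "$img")" -write "/tmp/$(basename "$img")"
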
  • This is a minimalistic version of the ubiquitous Google definition screen scraper. This version was designed not only to run fast, but to work using BusyBox. BusyBox is a collection of basic Unix tools that have been compiled into a single binary to save space on tiny installations of Unix. For example, although my phone doesn't have perl or the GNU utilities, it does have BusyBox's stripped-down versions of wget, tr, and sed. It turns out that those tools suffice for many tasks. Known bugs: this script does not handle HTML entities at all. I don't think there's an easy way to do that within BusyBox, but I'd love to see it if someone could do it. Also, this script can only define a single word, not phrases. (Well, you could if you typed in %20, but that'd be gross.) Lastly, this script does not show the URL where definitions were found. Given the randomness of the Net, that last bit of information is often key.


    0
    wget -q -U busybox -O- "http://www.google.com/search?ie=UTF8&q=define%3A$1" | tr '<' '\n' | sed -n 's/^li>\(.*\)/\1\n/p'
    hackerb9 · 2010-02-01 13:01:47 9
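
    The whole trick is in the last two stages: tr splits the HTML on '<' so every tag starts a new line, and sed keeps only the lines that began with an <li> tag, i.e. the definitions:

        wget -q -U busybox -O- "http://www.google.com/search?ie=UTF8&q=define%3A$1" |
          tr '<' '\n' |                     # one tag per line
          sed -n 's/^li>\(.*\)/\1\n/p'      # print only the li> lines (the definitions)
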
  • yt2mp3(){ for j in `seq 1 301`;do i=`curl -s gdata.youtube.com/feeds/api/users/$1/uploads\?start-index=$j\&max-results=1|grep -o "watch[^&]*"`;ffmpeg -i `wget youtube.com/$i -qO-|grep -o 'url_map"[^,]*'|sed -n '1{s_.*|__;s_\\\__g;p}'` -vn -ab 128k "`youtube-dl -e ${i#*=}`.mp3";done;} This squeezes the monster (and nifty ☺) command from 7776 down from 531 characters to 284, but I don't see a way to get it below 255. This is definitely a kludge!


    0
    Command in description (Your command is too long - please keep it to less than 255 characters)
    __ · 2011-02-03 08:25:42 5
  • On a machine behind a firewall, it's possible to pass the proxy server address in as a prefix to wget to avoid having to set it as an environment variable first.


    0
    http_proxy=<proxy.server:port> wget <url>
    rdc · 2011-03-30 13:06:19 3
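
    To proxy a whole session rather than a single command, export the variable first (wget also honours https_proxy and ftp_proxy); the host and port below are placeholders:

        export http_proxy=proxy.example.com:3128
        wget http://example.com/file.tar.gz
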
  • This will recursively visit all linked URLs, starting from the specified one. It won't save anything locally, and it will produce a detailed log. Useful for finding broken links on your site. It ignores robots.txt, so only use it on a site you own!


    0
    wget --spider -o wget.log -e robots=off --wait 1 -r -p http://www.example.com
    lele · 2011-04-05 13:42:14 4
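
    When the crawl finishes, the broken links can be fished out of the log; a sketch, since the exact wording varies between wget versions:

        grep -B 2 '404 Not Found' wget.log    # show each 404 with the two lines of context above it
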

  • Download a numbered image series (01 to 50, zero-padded by seq -w), resuming any partial files.


    0
    for i in `seq -w 1 50`; do wget --continue http://commandline.org.uk/images/posts/animal/$i.jpg; done
    totti · 2011-08-19 20:06:16 5

  • Download every JPEG linked from a page: fetch the HTML (1-second timeout), extract the unique image URLs, and feed them to wget on stdin (-i-), also with a 1-second timeout.


    0
    curl -sm1 http://www.website.com/ | grep -o 'http://[^"]*jpg' | sort -u | wget -qT1 -i-
    kev · 2011-09-10 19:21:13 4
  • Recursively download all files of a certain type down to two levels, ignoring directory structure and local duplicates. Usage: wgetall mp3 http://example.com/download/


    0
    wgetall () { wget -r -l2 -nd -Nc -A ".$1" "$2" ; }
    peterRepeater · 2011-09-28 09:43:25 3
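
    Flag by flag: -r recurse, -l2 at most two levels deep, -nd don't recreate the directory tree, -N skip files that aren't newer, -c resume partial downloads, -A accept only the given suffix. Usage:

        wgetall mp3 http://example.com/download/    # every *.mp3 within two link levels
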
  • This shell function uses wget(1) to show what site a shortened URL is pointing to, even if there are many nested shortened URLs. This is a great way to test whether a shortened URL is sending you to a malicious site, or somewhere nasty that you don't want to visit, without actually following it. The sample output is from: expandurl http://t.co/LDWqmtDM


    0
    expandurl() { wget -S "$1" 2>&1 | grep ^Location; }
    atoponce · 2011-10-18 18:50:54 11
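
    A sketch of a variant that prints only the final destination instead of every redirect hop:

        expandurl() { wget -S -O /dev/null "$1" 2>&1 | grep 'Location:' | tail -n 1; }
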
  • Gets the IP and sticks it into the middle-mouse-click buffer


    0
    echo -n $(curl -Ss http://icanhazip.com) | xclip
    red_five · 2012-02-17 16:58:40 3
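
    Plain xclip fills the primary (middle-click) selection; for the regular Ctrl-V clipboard, name the selection explicitly:

        curl -Ss http://icanhazip.com | tr -d '\n' | xclip -selection clipboard
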
  • Tries to avoid the fragile nature of screen scrapers by looking for the user's input in the output, as opposed to markup or headers on the web site.


    0
    function ip-where { wget -qO- -U Mozilla http://www.ip-adress.com/ip_tracer/$1 | html2text -nobs -style pretty | sed -n /^$1/,/^$/p;}
    tox2ik · 2012-10-22 21:39:53 5
  • Updated for the new version of the M-W web page (M-W seems not to use Cougar any more, so the older commands no longer work), using Xidel to parse the page with an HTML parser instead of regexes. Example usage: pronounce onomatopoetic. I'm not sure how well Xidel works with binary streams (although it seems to work great in tests), so using wget to download the actual wav file might be safer, i.e.: pronounce(){ wget -qO- $(xidel "http://www.m-w.com/dictionary/$*" -f "replace(css('.au')[1]/@onclick,\".*'([^']+)', *'([^']+)'.*\", '/audio.php?file=\$1&word=\$2')" -e 'css("embed")[1]/@src') | aplay -q;} Xidel is not a standard CLI tool and has to be downloaded from xidel.sourceforge.net


    0
    pronounce(){ xidel "http://www.m-w.com/dictionary/$*" -f "replace(css('.au')[1]/@onclick,\".*'([^']+)', *'([^']+)'.*\", '/audio.php?file=\$1&word=\$2')" -f 'css("embed")[1]/@src' --download - | aplay -q;}
    BeniBela · 2013-04-18 13:03:16 4
  • On Linux, substitute pbpaste with `xsel --clipboard --output` or `xclip -selection clipboard -o` (untested).


    0
    pbpaste | xargs wget
    loopkid · 2013-08-11 23:12:10 6
  • This script can be used to download enclosed files from an RSS feed. For example, it can be used to download mp3 files from a podcast's RSS feed.


    0
    wget -q -O- http://example-podcast-feed.com/rss | grep -o "<enclosure[ -~][^>]*" | grep -o "http://[ -~][^\"]*" | xargs wget -c
    talha131 · 2013-09-24 12:38:08 21
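
    The trailing wget -c is what makes this safe to re-run: interrupted episodes are resumed rather than restarted. For feeds that serve enclosures over HTTPS, widen the second pattern, e.g.:

        wget -q -O- http://example-podcast-feed.com/rss | grep -o "<enclosure[ -~][^>]*" | grep -o "https*://[ -~][^\"]*" | xargs wget -c
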

  • Scrape the last BTC-USD trade price from bitinfocharts.com: the first grep isolates the lastTrade markup, the second keeps only the decimal number.


    0
    wget -q -O- http://bitinfocharts.com/markets/btc-e/btc-usd.html |grep -o -P 'lastTrade">([0-9]{1,})(.){0,1}[0-9]{0,}' |grep -o -P '(\d)+(\.){0,1}(\d)*' |head -n 1
    peter1337 · 2014-01-25 23:40:00 20
  • This is the command line I use to get my IP address in order to update my ZoneEdit account. Full script on my blog: http://akim.sissaoui.com/linux-attitude/script-de-mise-a-jour-ddns-zoneedit-com-en-bashsh/


    0
    wget --no-check-certificate -q checkip.dyndns.org -O index.html && cat index.html | cut -d ' ' -f 6 | cut -d '<' -f 1
    Superkikim · 2014-05-12 07:10:29 8
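
    A sketch that skips the temporary file by extracting the first dotted quad directly from the page:

        wget --no-check-certificate -qO- checkip.dyndns.org | grep -oE '[0-9]{1,3}(\.[0-9]{1,3}){3}' | head -n 1
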
  • Just pulls a quote for each day and displays it in a notification bubble. Or you can change it a bit and just have it run in the terminal: wget -q -O "quote" https://www.goodreads.com/quotes_of_the_day;echo "Quote of the Day";cat quote | grep '&ldquo;\|/author/show' | sed -e 's/<[a-zA-Z\/][^>]*>//g' | sed 's/&ldquo;//g' | sed 's/&rdquo;//g'; rm -f quote


    0
    wget -q -O "quote" https://www.goodreads.com/quotes_of_the_day;notify-send "$(echo "Quote of the Day";cat quote | grep '&ldquo;\|/author/show' | sed -e 's/<[a-zA-Z\/][^>]*>//g' | sed 's/&ldquo;//g' | sed 's/&rdquo;//g')"; rm -f quote
    nowhereman88 · 2014-06-15 03:17:19 61