Commands using wget (286)

  • A function for retrieving and displaying a list of synonyms for a German word or phrase.


    0
    desyno(){ wget -q -O- https://www.openthesaurus.de/synonyme/search\?q\="$*"\&format\=text/xml | sed 's/>/>\n/g' | grep "<term term=" | cut -d \' -f 2 | paste -s -d , | sed 's/,/, /g' | fold -s -w $(tput cols); }
    lordtoran · 2019-02-09 05:06:42 32
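
    A minimal usage sketch, assuming the desyno function above has been sourced into the current shell (the query words are purely illustrative):

        desyno Haus
        # "$*" joins all arguments into a single query, so multi-word phrases work too:
        desyno auf Anhieb
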
  • Get the newest kernel version by parsing the most bleeding-edge Makefile possible. Useful for doing things like writing live ebuilds and/or self-updating PKGBUILDs for testing purposes. Breakdown:
    * wget -qO - https://raw.githubusercontent.com/torvalds/linux/master/Makefile — retrieve the Makefile and pipe it to stdout
    * head -n5 — only the first 5 lines are relevant; that's where all the version variables are
    * grep -E '\ \=\ [0-9]{1,}' — version variables always have an equals sign followed by a number
    * cut -d' ' -f3 — extract the individual numbers from the version variables
    * tr '\n' '.' — replace newlines with periods
    * sed -e "s/\.$//" — remove the trailing period


    0
    wget -qO - https://raw.githubusercontent.com/torvalds/linux/master/Makefile | head -n5 | grep -E '\ \=\ [0-9]{1,}' | cut -d' ' -f3 | tr '\n' '.' | sed -e "s/\.$//"
    realkstrawn93 · 2021-04-27 17:12:05 419
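
    For context, the top of the kernel Makefile that this pipeline parses looks roughly like this (the version numbers here are illustrative, not current):

        # SPDX-License-Identifier: GPL-2.0
        VERSION = 5
        PATCHLEVEL = 12
        SUBLEVEL = 0
        EXTRAVERSION =

    With that input the pipeline would print 5.12.0; EXTRAVERSION is skipped because it carries no numeric value for grep to match.
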
  • Just an alternative using a saved HTML file of all of my bookmarks. Works well, although it takes a while.


    -1
    wget -r --wait=5 --quota=5000m --tries=3 --directory-prefix=/home/erin/Documents/erins_webpages --limit-rate=20k --level=1 -k -p -erobots=off -np -N --exclude-domains=del.icio.us,doubleclick.net -F -i ./delicious-20090629.htm
    bbelt16ag · 2009-07-02 01:46:21 7

  • -1
    wget -O - http://checkip.dyndns.org|sed 's/[^0-9.]//g'
    thundernode · 2009-08-06 12:47:32 5
  • From the Hong Kong Observatory WAP site ;)


    -1
    wget -q -O - 'http://wap.weather.gov.hk/' | sed -r 's/<[^>]+>//g;/^UV/q' | grep -v '^$'
    twfcc · 2009-09-25 02:21:05 6
  • "get Hong Kong weather infomation from HK Observatory From Hong Kong Observatory wap site ;)" other one showed alot of blank lines for me Show Sample Output


    -1
    wget -q -O - 'http://wap.weather.gov.hk/' | sed -r 's/<[^>]+>//g;/^UV/q' | tail -n4
    dakunesu · 2009-09-25 02:36:46 3
  • Combines a few repetitive tasks when compiling source code. Especially useful when a hyphen in a filename breaks tab completion.
    1.) wget source.tar.gz
    2.) tar xzvf source.tar.gz
    3.) cd source
    4.) ls
    From there you can run ./configure, make, etc. (a usage sketch follows below).


    -1
    wtzc () { wget "$@"; foo=`echo "$@" | sed 's:.*/::'`; tar xzvf $foo; blah=`echo $foo | sed 's:.*/::'`; bar=`echo $blah | sed -e 's/\(.*\)\..*/\1/' -e 's/\(.*\)\..*/\1/'`; cd $bar; ls; }
    oshazard · 2010-01-17 11:25:47 3
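
    A usage sketch, assuming the wtzc function above is defined in the current shell, the URL is hypothetical, and the tarball unpacks into a directory of the same name (the usual convention):

        wtzc http://example.com/some-project-1.2.3.tar.gz
        # downloads the archive, extracts it, strips the .tar.gz suffix,
        # cd's into some-project-1.2.3 and lists its contents
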
  • Bash script to test whether a server is up; you can use this before wget'ing a file to make sure a blank one isn't downloaded.


    -1
    if [ "$(ping -q -c1 google.com)" ];then wget -mnd -q http://www.google.com/intl/en_ALL/images/logo.gif ;fi
    alf · 2010-03-23 04:15:03 10
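
    A related sketch: wget's own --spider option can probe the URL itself before downloading, which avoids relying on ICMP ping being allowed (same URL as in the entry above):

        if wget -q --spider http://www.google.com/intl/en_ALL/images/logo.gif; then wget -mnd -q http://www.google.com/intl/en_ALL/images/logo.gif; fi
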

  • -1
    wget -qO - "http://ajax.googleapis.com/ajax/services/language/translate?langpair=|zh-cn&v=1.0&q=`xsel`" |cut -d \" -f 6
    unixmonkey10186 · 2010-06-05 18:49:59 4

  • -1
    wget --load-cookies <cookie-file> -c -i <list-of-urls>
    alriode · 2010-07-12 02:35:21 3
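
    The same command with the placeholders filled in by hypothetical file names: cookies.txt is a Netscape-format cookie file exported from a browser, urls.txt holds one URL per line, and -c resumes partial downloads:

        wget --load-cookies cookies.txt -c -i urls.txt
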
  • Wgets "whatismyip" from checkip.dyndns.org and filters out the actual IP-adress. Usefull when you quickly need to find the outward facting IP-address of your current location. Show Sample Output


    -1
    wget --quiet -O - checkip.dyndns.org | sed -e 's/[^:]*: //' -e 's/<.*$//'
    berkes · 2010-08-01 13:36:08 3
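
    A small usage sketch capturing the result in a shell variable (relies on the same checkip.dyndns.org response format the 2010 entry above assumes):

        myip=$(wget --quiet -O - checkip.dyndns.org | sed -e 's/[^:]*: //' -e 's/<.*$//')
        echo "External IP: $myip"
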

  • -1
    wget -qO - http://i18n.counter.li.org/ | grep 'users registered' | sed 's/.*\<font size=7\>//g' | tr '\>' ' ' | sed 's/<br.*//g' | tr ' ' '\0'
    hunterm · 2010-10-07 03:19:17 4
  • Some other options: &delay=1000, &mode=links. Much more is possible with Piggy Bank as a scraper. Works well with your favourite curses or non-curses HTTP clients.


    -1
    svn co http://simile.mit.edu/repository/crowbar/trunk&& cd ./trunk/xulapp/ xulrunner --install-app && Xvfb :1 && DISPLAY=:1 xulrunner application.ini 2>/dev/null 1>/dev/null && wget -O- "127.0.0.1:10000/&url=http://www.facebook.com"
    argv · 2010-10-16 05:12:11 3
  • Just added viewing the image with the eog viewer.


    -1
    wget -O xkcd_$(date +%y-%m-%d).png `lynx --dump http://xkcd.com/|grep png`; eog xkcd_$(date +%y-%m-%d).png
    theanalyst · 2010-10-27 13:42:55 3
  • A simple script for streaming a movie over the network.


    -1
    cat video.ogg | nc -l -p 4232 & wget http://users.bshellz.net/~bazza/?nombre=name -O - & sleep 10; mplayer http://users.bshellz.net/~bazza/datos/name.ogg
    el_bazza · 2010-11-29 03:34:31 5
  • Substitute that 724349691704 with a UPC of a CD you have at hand, and (hopefully) this one-liner should return the $Artist - $Title, querying discogs.com. Yes, I know, all that head/tail/grep crap can be improved with a single sed command; feel free to send "patches" :D (a rough sketch along those lines follows below). Enjoy!


    -1
    wget http://www.discogs.com/search?q=724349691704 -O foobar &> /dev/null ; grep \/release\/ foobar | head -2 | tail -1 | sed -e 's/^<div>.*>\(.*\)<\/a><\/div>/\1/' ; rm foobar
    TetsuyO · 2011-01-30 23:34:54 3
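
    The rough sketch mentioned above: the grep and the extraction fold into one sed, and a second sed picks the second match (equivalent to head -2 | tail -1), avoiding the temporary file. It assumes the same 2011-era discogs.com markup as the original, so treat it as illustrative rather than something that still works today:

        wget -qO- 'http://www.discogs.com/search?q=724349691704' | sed -n '/\/release\//s/^<div>.*>\(.*\)<\/a><\/div>/\1/p' | sed -n '2p'
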
  • I wanted to play a song from the shell and get the shell back; I also don't want to store the file if it is not needed. Edit: not sure if I need to mention it... killall vlc to stop it.


    -1
    wget http://somesite.com/somestream.pls; cvlc somestream.pls&sleep 5; rm somestream.pls*
    tomjrace · 2011-08-04 19:24:18 3
  • This uses wget instead of curl


    -1
    wget -q -O - http://www.perl.org/get.html | grep -m1 '\.tar\.gz' | sed 's/.*perl-//; s/\.tar\.gz.*//'
    dbbolton · 2011-08-19 23:38:10 3

  • -1
    wget ifconfig.me/ip -q -O -
    DamirX · 2011-11-30 08:30:35 3
  • Grabs the current weather in your area (or their best guess of your area). Change the query to your zip code/location (e.g. google.com/search?q=weather+jakarta,+india) to get the weather somewhere else. Change google.com to google.ca or google.co.uk for metric.


    -1
    wget -qO- -U '' 'google.com/search?q=weather' | grep -oP '(-)?\d{1,3}\xB0[FC]'
    slaufer · 2012-02-28 22:27:38 362
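
    A usage sketch for a specific location, as described above (the query string is illustrative; the scrape depends on Google's 2012-era result markup and the \xB0 degree sign, so it may not match today's HTML):

        wget -qO- -U '' 'google.com/search?q=weather+jakarta' | grep -oP '(-)?\d{1,3}\xB0[FC]'
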

  • -1
    for i in $(wget -O- -U "" "http://wallbase.cc/random/23/e..." --quiet|grep wallpaper/|grep -oe 'http://wallbase.cc[^"]*'); do wget $(wget -O- -U "" $i --quiet|grep -oe 'http://[^"]*\.jpg');done
    mama21mama · 2012-08-19 11:06:30 20
  • This command should be copy-pasted on Windows, but a very similar one will work on Linux. It uses wget and sed.


    -1
    wget --no-check-certificate https://code.google.com/p/msysgit/downloads/list -O - 2>nul | sed -n "0,/.*\(\/\/msysgit.googlecode.com\/files\/Git-.*\.exe\).*/s//http:\1/p" | wget -i - -O Git-Latest.exe
    michfield · 2012-11-14 08:17:50 8
  • Sweep and download all MP3s (in French) of "Rendez-vous avec X" (Meeting with Mr. X) from French public radio, from 1997 (http://rendezvousavecmrx.free.fr/audio/mr_x_1997_01_04.mp3) to 2015 (http://rendezvousavecmrx.free.fr/audio/mr_x_2015_06_20.mp3). The date ranges are generated by bash brace expansion, illustrated below the entry.


    -1
    wget http://rendezvousavecmrx.free.fr/audio/mr_x_{1997..2015}_{01..12}_{01..31}.mp3
    pascalvaucheret · 2015-08-13 21:34:50 12
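
    The date ranges rely on bash brace expansion, which produces the Cartesian product of all three ranges (including dates that never existed; wget simply receives a 404 for those). A small illustration with narrower ranges:

        echo mr_x_{1997..1998}_{01..02}_{01..02}.mp3
        # prints all 8 combinations on one line, from mr_x_1997_01_01.mp3 through mr_x_1998_02_02.mp3
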
  • I've got this posted in one of my .bash_profiles for humor whenever I log in.


    -2
    wget -qO - snubster.com|sed -n '65p'|awk 'gsub(/<span><br>.*/,"")&&1'|perl -p -e 's:myScroller1.addItem\("<span class=atHeaderOrange>::g;s:</span> <span class=snubFontSmall>::g;s:&quot;:":g;s:^:\n:g;s:$:\n:'
    sil · 2009-02-18 15:05:13 6

  • -2
    wget http://checkip.dyndns.org && clear && echo && echo My IP && egrep -o '([[:digit:]]{1,3}\.){3}[[:digit:]]{1,3}' index.html && echo && rm index.html
    onkelchentobi · 2009-08-07 21:21:59 4