Commands using wget (286)

  • A function for retrieving and displaying a list of synonyms for a German word or phrase.


    0
    desyno(){ wget -q -O- https://www.openthesaurus.de/synonyme/search\?q\="$*"\&format\=text/xml | sed 's/>/>\n/g' | grep "<term term=" | cut -d \' -f 2 | paste -s -d , | sed 's/,/, /g' | fold -s -w $(tput cols); }
    lordtoran · 2019-02-09 05:06:42 32
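
    A variant sketch against the same openthesaurus.de API, using its JSON endpoint instead of XML. This assumes the format=application/json parameter and the synsets/terms response layout, and requires jq; the function name desyno2 is made up:
    desyno2(){ wget -q -O- "https://www.openthesaurus.de/synonyme/search?q=$*&format=application/json" | jq -r '[.synsets[].terms[].term] | join(", ")' | fold -s -w $(tput cols); }
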
  • Get the newest kernel version by parsing the most bleeding-edge Makefile possible. Useful for things like writing live ebuilds and/or self-updating PKGBUILDs for testing purposes. Breakdown:
    * wget -qO - https://raw.githubusercontent.com/torvalds/linux/master/Makefile — retrieve the Makefile and pipe it to stdout
    * head -n5 — only the first 5 lines are relevant; that's where all the version variables are
    * grep -E '\ \=\ [0-9]{1,}' — version variables always have an equals sign followed by a number
    * cut -d' ' -f3 — extract the individual numbers from the version variables
    * tr '\n' '.' — replace newlines with periods
    * sed -e "s/\.$//" — remove the trailing period


    0
    wget -qO - https://raw.githubusercontent.com/torvalds/linux/master/Makefile | head -n5 | grep -E '\ \=\ [0-9]{1,}' | cut -d' ' -f3 | tr '\n' '.' | sed -e "s/\.$//"
    realkstrawn93 · 2021-04-27 17:12:05 419
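
    A minimal sketch of the same idea as a single awk pass, assuming the Makefile keeps its VERSION/PATCHLEVEL/SUBLEVEL layout in the first five lines; like the original, it drops a non-numeric EXTRAVERSION. The function name kernelver is made up:
    kernelver(){ wget -qO - https://raw.githubusercontent.com/torvalds/linux/master/Makefile | awk 'NR<=5 && /^(VERSION|PATCHLEVEL|SUBLEVEL) =/{v[++n]=$3} END{printf "%s.%s.%s\n", v[1], v[2], v[3]}'; }
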
  • Just an alternative using a saved HTML file of all of my bookmarks. Works well, although it takes a while.


    -1
    wget -r --wait=5 --quota=5000m --tries=3 --directory-prefix=/home/erin/Documents/erins_webpages --limit-rate=20k --level=1 -k -p -erobots=off -np -N --exclude-domains=del.icio.us,doubleclick.net -F -i ./delicious-20090629.htm
    bbelt16ag · 2009-07-02 01:46:21 7
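    The same command with GNU wget's long options spelled out one per line for readability (the paths and input file are the original poster's):
    wget --recursive --level=1 --no-parent \
         --wait=5 --limit-rate=20k --tries=3 --quota=5000m \
         --timestamping --convert-links --page-requisites \
         -e robots=off \
         --exclude-domains=del.icio.us,doubleclick.net \
         --force-html --input-file=./delicious-20090629.htm \
         --directory-prefix=/home/erin/Documents/erins_webpages
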

  • -1
    wget -O - http://checkip.dyndns.org|sed 's/[^0-9.]//g'
    thundernode · 2009-08-06 12:47:32 5
  • Gets the weather from the Hong Kong Observatory WAP site ;)


    -1
    wget -q -O - 'http://wap.weather.gov.hk/' | sed -r 's/<[^>]+>//g;/^UV/q' | grep -v '^$'
    twfcc · 2009-09-25 02:21:05 6
  • "get Hong Kong weather infomation from HK Observatory From Hong Kong Observatory wap site ;)" other one showed alot of blank lines for me Show Sample Output


    -1
    wget -q -O - 'http://wap.weather.gov.hk/' | sed -r 's/<[^>]+>//g;/^UV/q' | tail -n4
    dakunesu · 2009-09-25 02:36:46 3
  • Combines a few repetitive tasks when compiling source code. Especially useful when a hyphen in a filename breaks tab completion.
    1.) wget source.tar.gz
    2.) tar xzvf source.tar.gz
    3.) cd source
    4.) ls
    From there you can run ./configure, make, etc.


    -1
    wtzc () { wget "$@"; foo=$(echo "$@" | sed 's:.*/::'); tar xzvf "$foo"; bar=$(echo "$foo" | sed -e 's/\(.*\)\..*/\1/' -e 's/\(.*\)\..*/\1/'); cd "$bar"; ls; }
    oshazard · 2010-01-17 11:25:47 3
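
    A related sketch that reads the top-level directory out of the archive itself instead of guessing it from the filename, so it also works when the tarball extracts to a differently named directory. It assumes a gzipped tarball URL as the only argument; the function name wtz is made up:
    wtz () { wget "$1" || return; foo=${1##*/}; tar xzvf "$foo" && cd "$(tar tzf "$foo" | head -n1 | cut -d/ -f1)" && ls; }
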
  • Bash script to test if a server is up; you can use this before wget'ing a file to make sure a blank one isn't downloaded.


    -1
    if [ "$(ping -q -c1 google.com)" ];then wget -mnd -q http://www.google.com/intl/en_ALL/images/logo.gif ;fi
    alf · 2010-03-23 04:15:03 10
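
    An alternative sketch that skips the ping and lets wget itself do the check, using --spider to test the URL before mirroring it (same example URL as above):
    if wget -q --spider http://www.google.com/intl/en_ALL/images/logo.gif; then wget -mnd -q http://www.google.com/intl/en_ALL/images/logo.gif; fi
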

  • -1
    wget -qO - "http://ajax.googleapis.com/ajax/services/language/translate?langpair=|zh-cn&v=1.0&q=`xsel`" |cut -d \" -f 6
    unixmonkey10186 · 2010-06-05 18:49:59 4

  • -1
    wget --load-cookies <cookie-file> -c -i <list-of-urls>
    alriode · 2010-07-12 02:35:21 3
  • Wgets "whatismyip" from checkip.dyndns.org and filters out the actual IP-adress. Usefull when you quickly need to find the outward facting IP-address of your current location. Show Sample Output


    -1
    wget --quiet -O - checkip.dyndns.org | sed -e 's/[^:]*: //' -e 's/<.*$//'
    berkes · 2010-08-01 13:36:08 3

  • -1
    wget -qO - http://i18n.counter.li.org/ | grep 'users registered' | sed 's/.*\<font size=7\>//g' | tr '\>' ' ' | sed 's/<br.*//g' | tr ' ' '\0'
    hunterm · 2010-10-07 03:19:17 4
  • Some other options: &delay=1000, &mode=links. Much more is possible with Piggy Bank as a scraper. Works well with your favourite curses or non-curses HTTP clients.


    -1
    svn co http://simile.mit.edu/repository/crowbar/trunk && cd ./trunk/xulapp/ && xulrunner --install-app . && Xvfb :1 & DISPLAY=:1 xulrunner application.ini 2>/dev/null 1>/dev/null && wget -O- "127.0.0.1:10000/&url=http://www.facebook.com"
    argv · 2010-10-16 05:12:11 3
  • Just added viewing with the eog image viewer.


    -1
    wget -O xkcd_$(date +%y-%m-%d).png `lynx --dump http://xkcd.com/|grep png`; eog xkcd_$(date +%y-%m-%d).png
    theanalyst · 2010-10-27 13:42:55 3
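
    A less fragile sketch that asks xkcd's JSON interface for the image URL instead of grepping lynx output — assuming the https://xkcd.com/info.0.json endpoint and its "img" field, which xkcd documents:
    wget -O xkcd_$(date +%y-%m-%d).png "$(wget -qO- https://xkcd.com/info.0.json | grep -oP '"img": *"\K[^"]+')"; eog xkcd_$(date +%y-%m-%d).png
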
  • A simple script for streaming a movie.


    -1
    cat video.ogg | nc -l -p 4232 & wget http://users.bshellz.net/~bazza/?nombre=name -O - & sleep 10; mplayer http://users.bshellz.net/~bazza/datos/name.ogg
    el_bazza · 2010-11-29 03:34:31 5
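
    The core trick reduced to a sketch without the intermediate web page (192.0.2.1 stands in for the serving machine's address): nc serves the file on a port, and the player on the other end reads the stream from stdin:
    nc -l -p 4232 < video.ogg            # on the serving machine
    nc 192.0.2.1 4232 | mplayer -        # on the viewing machine
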
  • Substitute that 724349691704 with a UPC of a CD you have at hand, and (hopefully) this one-liner should return the $Artist - $Title, querying discogs.com. Yes, I know, all that head/tail/grep crap can be improved with a single sed command; feel free to send "patches" :D Enjoy!


    -1
    wget http://www.discogs.com/search?q=724349691704 -O foobar &> /dev/null ; grep \/release\/ foobar | head -2 | tail -1 | sed -e 's/^<div>.*>\(.*\)<\/a><\/div>/\1/' ; rm foobar
    TetsuyO · 2011-01-30 23:34:54 3
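
    One attempt at the requested single-sed "patch", assuming the same 2011-era markup the original pipeline targets: it prints the second /release/ line, transformed, then quits, using the hold space to remember whether a match was already seen (no temp file, no rm):
    wget -qO- 'http://www.discogs.com/search?q=724349691704' | sed -n '/\/release\//{x;/./{x;s/^<div>.*>\(.*\)<\/a><\/div>/\1/p;q};x;h}'
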
  • I wanted to play a song from the shell and get the shell back; I also don't want to store the file if it is not needed. Edit: not sure if I need to mention it... killall vlc to stop it.


    -1
    wget http://somesite.com/somestream.pls; cvlc somestream.pls & sleep 5; rm somestream.pls*
    tomjrace · 2011-08-04 19:24:18 3
  • This uses wget instead of curl


    -1
    wget -q -O - http://www.perl.org/get.html | grep -m1 '\.tar\.gz' | sed 's/.*perl-//; s/\.tar\.gz.*//'
    dbbolton · 2011-08-19 23:38:10 3

  • -1
    wget ifconfig.me/ip -q -O -
    DamirX · 2011-11-30 08:30:35 3
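
    Several of the entries above query a single what's-my-IP service; here is a small sketch with a fallback in case one of them is down. The function name myip is made up, and both services' output formats are assumed unchanged:
    myip(){ wget -qO - ifconfig.me/ip 2>/dev/null || wget -qO - checkip.dyndns.org | sed 's/[^0-9.]//g'; }
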
  • Grabs the current weather in your area (or their best guess of your area). Change the query to your zip code/location (e.g. google.com/search?q=weather+jakarta,+indonesia) to get weather somewhere else. Change google.com to google.ca or google.co.uk for metric.


    -1
    wget -qO- -U '' 'google.com/search?q=weather' | grep -oP '(-)?\d{1,3}\xB0[FC]'
    slaufer · 2012-02-28 22:27:38 362
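
    For example, to get the weather for a specific place in metric, per the notes above (the location is just an illustration):
    wget -qO- -U '' 'google.co.uk/search?q=weather+jakarta,+indonesia' | grep -oP '(-)?\d{1,3}\xB0[FC]'
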

  • -1
    for i in $(wget -O- -U "" "http://wallbase.cc/random/23/e..." --quiet|grep wallpaper/|grep -oe 'http://wallbase.cc[^"]*'); do wget $(wget -O- -U "" $i --quiet|grep -oe 'http://[^"]*\.jpg');done
    mama21mama · 2012-08-19 11:06:30 20
  • This command should be copy-pasted in Windows, but a very similar one will work on Linux. It uses wget and sed.


    -1
    wget --no-check-certificate https://code.google.com/p/msysgit/downloads/list -O - 2>nul | sed -n "0,/.*\(\/\/msysgit.googlecode.com\/files\/Git-.*\.exe\).*/s//http:\1/p" | wget -i - -O Git-Latest.exe
    michfield · 2012-11-14 08:17:50 8
  • Sweeps and downloads all the MP3s (in French) of "Rendez-vous avec X" (Meet with Mr. X) from French public radio, from 1997 (http://rendezvousavecmrx.free.fr/audio/mr_x_1997_01_04.mp3) to 2015 (http://rendezvousavecmrx.free.fr/audio/mr_x_2015_06_20.mp3). The brace expansion generates every year/month/day combination; dates with no episode simply return 404 and are skipped.


    -1
    wget http://rendezvousavecmrx.free.fr/audio/mr_x_{1997..2015}_{01..12}_{01..31}.mp3
    pascalvaucheret · 2015-08-13 21:34:50 12
  • I've got this posted in one of my .bash_profiles for humor whenever I log in.


    -2
    wget -qO - snubster.com|sed -n '65p'|awk 'gsub(/<span><br>.*/,"")&&1'|perl -p -e 's:myScroller1.addItem\("<span class=atHeaderOrange>::g;s:</span> <span class=snubFontSmall>::g;s:&quot;:":g;s:^:\n:g;s:$:\n:'
    sil · 2009-02-18 15:05:13 6

  • -2
    wget http://checkip.dyndns.org && clear && echo && echo My IP && egrep -o '([[:digit:]]{1,3}\.){3}[[:digit:]]{1,3}' index.html && echo && rm index.html
    onkelchentobi · 2009-08-07 21:21:59 4