Commands using wget (286)

  • wget -qO - "http://www.google.com/dictionary/json?callback=dict_api.callbacks.id100&q=steering+wheel&sl=en&tl=en&restrict=pr,de&client=te" does the actual Google Dictionary query and returns a JSON string wrapped in a fancy callback tag. sed 's/dict_api\.callbacks.id100.//' removes the beginning of the tag, and sed 's/,200,null)//' removes its end. There are also some special characters which could cause problems with some JSON parsers, so if you get errors, this is probably the cause (sed is your friend). I also like to trim the "webDefinitions" part, because it (sometimes) contains misleading information: sed 's/\,\"webDefinitions.*//' (but remember to append a "}" at the end, because otherwise the JSON string will be invalid). The output also contains links to MP3 files with pronunciation. As of now, this is only usable for English; if you choose a language other than English, you will only get webDefinitions (which are crap).


    wget -qO - "http://www.google.com/dictionary/json?callback=dict_api.callbacks.id100&q=steering+wheel&sl=en&tl=en&restrict=pr,de&client=te" | sed 's/dict_api\.callbacks.id100.//' | sed 's/,200,null)//'
    sairon · 2011-03-08 15:00:39 16
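
    As a sketch, the full cleanup the description walks through can be chained into one pipeline; the final sed both trims the webDefinitions part and re-closes the JSON with "}":

      wget -qO - "http://www.google.com/dictionary/json?callback=dict_api.callbacks.id100&q=steering+wheel&sl=en&tl=en&restrict=pr,de&client=te" \
        | sed 's/dict_api\.callbacks\.id100.//' \
        | sed 's/,200,null)//' \
        | sed 's/,"webDefinitions.*/}/'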
  • Put your link in place of URLtoCheck to check whether the remote file exists.


    wget -O/dev/null -q URLtoCheck && echo exists || echo not exist
    xeonproject · 2011-04-07 20:55:33 3
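
    Wrapped as a tiny reusable function, as a sketch (url_exists is a hypothetical name, not part of the original):

      url_exists() { wget -O /dev/null -q "$1" && echo exists || echo "does not exist"; }
      url_exists http://example.com/index.html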
  • This example command fetches the 'example.com' webpage and then fetches and saves all PDF files listed (linked to) on that webpage. [*Note: of course there are no PDFs on example.com; this is just an example.]


    curl -s http://example.com | grep -o -P "<a.*href.*>" | grep -o "http.*\.pdf" | xargs -d"\n" -n1 wget -c
    b_t · 2011-06-09 14:42:46 5
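
    An alternative sketch that lets wget extract the links itself, assuming the PDFs are linked directly from the page (-A restricts what gets saved, -nd flattens the directory tree):

      wget -r -l 1 -nd -A pdf http://example.com/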

  • NAME=`wget --quiet URL -O - | grep util-vserver | tail -n 1 | sed 's|</a>.*||;s/.*>//'`; wget URL$NAME;
    WMP · 2011-07-17 13:01:20 4
  • --mirror : turn on options suitable for mirroring. -p : download all files that are necessary to properly display a given HTML page. --convert-links : after the download, convert the links in the document for local viewing. -P ./LOCAL-DIR : save all the files and directories to the specified directory.


    wget --mirror -p --convert-links -P ./<LOCAL-DIR> <WEBSITE-URL>
    tkembo · 2011-08-18 08:27:28 6
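
    A concrete (hypothetical) invocation, mirroring example.com into ./example.com-mirror:

      wget --mirror -p --convert-links -P ./example.com-mirror http://example.com/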
  • Look up an MD5 hash in md5-decrypter.com's dictionary (MD5 is a one-way hash, so this is a reverse lookup, not true decryption). Replace 1cb251ec0d568de6a929b520c4aed8d1 with the MD5 string you want to look up.


    wget -qO - --post-data "data[Row][cripted]=1cb251ec0d568de6a929b520c4aed8d1" http://md5-decrypter.com/ | grep -A1 "Decrypted text" | tail -n1 | cut -d '"' -f3 | sed 's/>//g; s/<\/b//g'
    samhagin · 2011-10-13 03:48:54 3
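
    A sketch of the full round trip, hashing a known string locally first (assumes the site's form field and result markup are unchanged):

      hash=$(echo -n "password" | md5sum | cut -d' ' -f1)
      wget -qO - --post-data "data[Row][cripted]=$hash" http://md5-decrypter.com/ | grep -A1 "Decrypted text" | tail -n1 | cut -d '"' -f3 | sed 's/>//g; s/<\/b//g'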

  • wget -A mp3,mpg,mpeg,avi -r -l 3 http://www.site.com/
    kev · 2011-11-09 10:06:07 3
  • Make your own MP3s from YouTube videos.


    url="put_url_here"; audio=$(youtube-dl -s -e "$url"); wget -q -O - `youtube-dl -g "$url"` | ffmpeg -i - -f mp3 -vn -acodec libmp3lame - > "$audio.mp3"
    o0110o · 2011-11-15 19:09:52 4
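
    If your youtube-dl build supports it, a simpler sketch lets youtube-dl drive the extraction itself (ffmpeg must be on the PATH; the URL is a placeholder as above):

      youtube-dl -x --audio-format mp3 "put_url_here"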

  • wget -O chart.png 'http://chart.googleapis.com/chart?chs=250x100&chd=t:60,40&cht=p3&chl=Hello|World'
    kev · 2011-12-10 18:03:16 59

  • wget -nd -r -l 2 -A jpg,jpeg,png,gif http://website-url.com
    unixmonkey26318 · 2012-01-27 11:06:50 8
  • A simple script to download all the MegaTokyo strips, from the first to the last one.


    for i in $(seq 1 `curl http://megatokyo.com 2>/dev/null|grep current|cut -f6 -d\"`);do wget http://megatokyo.com/`curl http://megatokyo.com/strip/${i} 2>/dev/null|grep src=\"strips\/|cut -f4 -d\"`;done
    akira88 · 2012-03-04 22:52:36 9
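
    The same loop unrolled with intermediate variables, as a readability sketch (assumes MegaTokyo's markup still exposes the current strip number and the strips/ image paths):

      last=$(curl -s http://megatokyo.com | grep current | cut -f6 -d\")
      for i in $(seq 1 "$last"); do
        img=$(curl -s "http://megatokyo.com/strip/${i}" | grep 'src="strips/' | cut -f4 -d\")
        wget "http://megatokyo.com/${img}"
      done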

  • wget -m -k -K -E http://url/of/web/site
    joedistro · 2012-03-19 20:22:05 3

  • cat urls.txt | wget -i- -T 10 -t 3 --waitretry 1
    kev · 2012-05-14 06:41:14 9

  • while pgrep wget || sudo shutdown -P now; do sleep 1m; done
    kev · 2012-05-20 17:49:56 7
  • In this example, where the user's GPG keyring has a password, the user will be interactively prompted for the keyring password. If the keyring has no password, it works the same, sans the prompt, which makes it suitable for cron jobs. ~/.gnupg/passwd/http-auth.gpg is the encrypted HTTP auth password for this particular wget use case. This approach has many use cases. Example bash functions: function http_auth_pass() { gpg2 --decrypt ~/.gnupg/passwd/http-auth.gpg 2>/dev/null; } function decrypt_pass() { gpg2 --decrypt ~/.gnupg/passwd/"$1" 2>/dev/null; }


    wget --input-file=~/downloads.txt --user="$USER" --password="$(gpg2 --decrypt ~/.gnupg/passwd/http-auth.gpg 2>/dev/null)"
    kyle0r · 2012-12-13 00:14:55 6
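
    Combined with the example http_auth_pass function from the description, the command reads as a sketch like this:

      wget --input-file=~/downloads.txt --user="$USER" --password="$(http_auth_pass)"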
  • If you have to deal with MS SharePoint, which is (rarely, let's hope) used in certain corporate environments: this uses Cntlm. For single files, just use cURL -- its NTLM authentication works quite well. An example /etc/cntlm.conf:
    Username  account
    Domain    domain
    Password  ############
    Proxy     10.20.30.40   # IP of the SharePoint site
    NoProxy   *
    Listen    3128


    http_proxy=http://127.0.0.1:3128 wget --http-user='domain\account' --http-password='###' -p -r -l 8 --no-remove-listing -P . 'http://sp.corp.com/teams/Team/Shared%20Documents/Forms/AllItems.aspx?RootFolder=%2fteams%2fTeam%2fShared%20Documents%2fFolder'
    mhs · 2012-12-26 09:03:55 4
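
    For a single file, a hedged sketch of the cURL/NTLM route mentioned above (the document URL is hypothetical):

      curl --ntlm -u 'domain\account' -O 'http://sp.corp.com/teams/Team/Shared%20Documents/report.docx'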
  • alias speedtest='wget --output-document=/dev/null http://speedtest.wdc01.softlayer.com/downloads/test500.zip'


    wget --output-document=/dev/null http://speedtest.wdc01.softlayer.com/downloads/test500.zip
    opexxx · 2013-03-15 13:25:07 6
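
    To get a rough throughput number, a sketch that simply times the transfer (assuming the file is the 500 MB payload its name suggests, throughput is roughly 500 MB divided by the elapsed seconds):

      time wget --output-document=/dev/null http://speedtest.wdc01.softlayer.com/downloads/test500.zip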
  • Need to find a Mageia Linux mirror server providing Mageia 4 via rsync? Modify the "url=" string for the version you want. This shows i586, which is the 32-bit version. If you want the 64-bit version, use: url=http://mirrors.mageia.org/api/mageia.4.x86_64.list; wget -q ${url} -O - | grep rsync:


    url=http://mirrors.mageia.org/api/mageia.4.i586.list; wget -q ${url} -O - | grep rsync:
    mpb · 2013-05-20 16:19:05 7
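
    With the version and architecture pulled out as variables, a sketch covering both cases in the description:

      ver=4; arch=i586   # or arch=x86_64 for the 64-bit list
      url="http://mirrors.mageia.org/api/mageia.${ver}.${arch}.list"
      wget -q "${url}" -O - | grep rsync: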
  • First (and only) argument should be a 4chan thread URL.


    function 4chandl () { wget -e robots=off -nvcdp -t 0 -Hkrl 0 -I \*/src/ -P . "$1"; }
    89r · 2013-07-28 11:29:53 6
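
    Usage sketch (the board and thread number are placeholders):

      4chandl "http://boards.4chan.org/b/res/123456789"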
  • Returns your external IP address to the command line using only wget.


    wget http://ipecho.net/plain -O - -q ; echo
    JonathanFisher · 2013-10-02 21:18:40 9
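
    Captured into a variable for use in scripts, as a minimal sketch:

      ip=$(wget http://ipecho.net/plain -O - -q); echo "external IP: $ip"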
  • Download the latest NVIDIA GeForce x64 Windows 7/8 driver from NVIDIA's website. Pulls the latest download version (which includes beta). This is the "English" version; the following variant adds a 'sed' step to replace "english" with "international" if needed. You can also replace the starting subdomain with "eu.", "uk." and others. Enjoy this one liner! 1 character under the max :)
    wget "us.download.nvidia.com$(wget -qO- "$(wget -qO- "nvidia.com/Download/processFind.aspx?psid=95&pfid=695&osid=19&lid=1&lang=en-us" | awk '/driverResults.aspx/ {print $4}' | cut -d "'" -f2 | head -n 1)" | awk '/url=/ {print $2}' | sed -e "s/english/international/" | cut -d '=' -f3 | cut -d '&' -f1)"


    wget "us.download.nvidia.com$(wget -qO- "$(wget -qO- "nvidia.com/Download/processFind.aspx?psid=95&pfid=695&osid=19&lid=1&lang=en-us"|awk '/driverResults.aspx/ {print $4}'|cut -d "'" -f2|head -n 1)"|awk '/url=/ {print $2}'|cut -d '=' -f3|cut -d '&' -f1)"
    lowjax · 2013-11-21 03:04:59 11

  • read -p "Please enter the 4chan url: " url; wget -qO - "$url" | egrep -o '//i.4cdn.org/[a-z0-9]+/src/([0-9]*).(jpg|png|gif)' | nl -s https: | cut -c7- | uniq | wget -nc -i - --random-wait
    unixmonkey73764 · 2014-03-09 05:56:14 6

  • wget -r -P ./dl/ -A jpg,jpeg http://captivates.com
    ferdous · 2014-06-14 17:28:32 7
  • No need to parse an HTML page; the website gives us a txt file :)


    wget -qO- http://whatthecommit.com/index.txt | cowsay
    optyler · 2014-08-26 18:56:06 8
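
    One natural use, as a sketch: feed it straight to git as a commit message:

      git commit -m "$(wget -qO- http://whatthecommit.com/index.txt)"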
  • If the latest version has already been downloaded, it will not be downloaded again.


    wget -N --content-disposition http://www.adminer.org/latest.php
    rickyok · 2014-09-12 07:52:45 8
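
    Because -N only fetches when the remote copy is newer, this is cron-friendly; a hypothetical crontab entry (the /opt/adminer target directory is an assumption):

      0 3 * * * wget -qN --content-disposition -P /opt/adminer http://www.adminer.org/latest.php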