Commands using wget (286)

  • wget -qO - "http://www.google.com/dictionary/json?callback=dict_api.callbacks.id100&q=steering+wheel&sl=en&tl=en&restrict=pr,de&client=te" this does the actual google dictionary query, returns a JSON string encapsulated in some fancy tag sed 's/dict_api\.callbacks.id100.//' here we remove the tag beginning sed 's/,200,null)//' and here the tag end There are also some special characters which could cause problems with some JSON parsers, so if you get some errors, this is probably the case (sed is your friend). I laso like to trim the "webDefinitions" part, because it (sometimes) contains misleading information. sed 's/\,\"webDefinitions.*//' (but remember to append a "}" at the end, because the JSON string will be invalid) The output also contains links to mp3 files with pronounciation. As of now, this is only usable in the English language. If you choose other than English, you will only get webDefinitions (which are crap).


    1
    wget -qO - "http://www.google.com/dictionary/json?callback=dict_api.callbacks.id100&q=steering+wheel&sl=en&tl=en&restrict=pr,de&client=te" | sed 's/dict_api\.callbacks.id100.//' | sed 's/,200,null)//'
    sairon · 2011-03-08 15:00:39 17
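
    Putting the whole cleanup described above into one pipeline (a sketch only; the extra sed steps are the ones from the description, with the last one both trimming "webDefinitions" and appending the closing "}" in a single step):

    wget -qO - "http://www.google.com/dictionary/json?callback=dict_api.callbacks.id100&q=steering+wheel&sl=en&tl=en&restrict=pr,de&client=te" | sed 's/dict_api\.callbacks.id100.//' | sed 's/,200,null)//' | sed 's/\,\"webDefinitions.*/}/'
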
  • Replace URLtoCheck below with your link to check whether the remote file exists (see also the --spider variant noted after this entry).


    1
    wget -O/dev/null -q URLtoCheck && echo exists || echo not exist
    xeonproject · 2011-04-07 20:55:33 4
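
    A related check that avoids downloading the response body at all (my own variant, not part of the original entry) uses wget's --spider mode:

    wget -q --spider URLtoCheck && echo exists || echo not exist
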
  • This example command fetches the 'example.com' webpage and then fetches and saves all PDF files linked to on that webpage. [Note: of course there are no PDFs on example.com; this is just an example.] A wget-only alternative is sketched after this entry.


    1
    curl -s http://example.com | grep -o -P "<a.*href.*>" | grep -o "http.*.pdf" | xargs -d"\n" -n1 wget -c
    b_t · 2011-06-09 14:42:46 6
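
    A wget-only sketch of the same idea (an alternative added here for comparison, assuming the PDFs are plain links on the page and live on the same host): let wget follow links one level deep and keep only PDFs, without creating a directory tree.

    wget -r -l 1 -nd -A pdf http://example.com/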

  • 1
    NAME=`wget --quiet URL -O - | grep util-vserver | tail -n 1 | sed 's|</a>.*||;s/.*>//'`; wget URL$NAME
    WMP · 2011-07-17 13:01:20 5
  • --mirror : turn on options suitable for mirroring. -p : download all files that are necessary to properly display a given HTML page. --convert-links : after the download, convert the links in the document for local viewing. -P ./LOCAL-DIR : save all the files and directories to the specified directory. A concrete example follows this entry.


    1
    wget --mirror -p --convert-links -P ./<LOCAL-DIR> <WEBSITE-URL>
    tkembo · 2011-08-18 08:27:28 7
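
    The same command with the placeholders filled in, using a hypothetical site and output directory purely for illustration:

    wget --mirror -p --convert-links -P ./example.com-mirror https://example.com/
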
  • Decrypt an MD5 hash: replace 1cb251ec0d568de6a929b520c4aed8d1 with the MD5 string you want to decrypt.


    1
    wget -qO - --post-data "data[Row][cripted]=1cb251ec0d568de6a929b520c4aed8d1" http://md5-decrypter.com/ | grep -A1 "Decrypted text" | tail -n1 | cut -d '"' -f3 | sed 's/>//g; s/<\/b//g'
    samhagin · 2011-10-13 03:48:54 4

  • 1
    wget -A mp3,mpg,mpeg,avi -r -l 3 http://www.site.com/
    kev · 2011-11-09 10:06:07 4
  • Make your own MP3s from YouTube videos.


    1
    url="put_url_here";audio=$(youtube-dl -s -e $url);wget -q -O - `youtube-dl -g $url`| ffmpeg -i - -f mp3 -vn -acodec libmp3lame - > "$audio.mp3"
    o0110o · 2011-11-15 19:09:52 5

  • 1
    wget -O chart.png 'http://chart.googleapis.com/chart?chs=250x100&chd=t:60,40&cht=p3&chl=Hello|World'
    kev · 2011-12-10 18:03:16 61

  • 1
    wget -nd -r -l 2 -A jpg,jpeg,png,gif http://website-url.com
    unixmonkey26318 · 2012-01-27 11:06:50 9
  • A simple script to download all the MegaTokyo strips, from the first to the last one. A more readable multi-line version follows this entry.


    1
    for i in $(seq 1 `curl http://megatokyo.com 2>/dev/null|grep current|cut -f6 -d\"`);do wget http://megatokyo.com/`curl http://megatokyo.com/strip/${i} 2>/dev/null|grep src=\"strips\/|cut -f4 -d\"`;done
    akira88 · 2012-03-04 22:52:36 12
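
    The same logic spread over a few lines for readability (a sketch intended to behave like the one-liner above):

    # number of the current (latest) strip, taken from the front page
    last=$(curl -s http://megatokyo.com | grep current | cut -f6 -d'"')
    # fetch the image referenced by every strip page from 1 to $last
    for i in $(seq 1 "$last"); do
        img=$(curl -s "http://megatokyo.com/strip/${i}" | grep 'src="strips/' | cut -f4 -d'"')
        wget "http://megatokyo.com/${img}"
    done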

  • 1
    wget -m -k -K -E http://url/of/web/site
    joedistro · 2012-03-19 20:22:05 4

  • 1
    cat urls.txt | wget -i- -T 10 -t 3 --waitretry 1
    kev · 2012-05-14 06:41:14 10

  • 1
    while pgrep wget || sudo shutdown -P now; do sleep 1m; done
    kev · 2012-05-20 17:49:56 9
  • In this example, if the user's gpg keyring has a password, the user will be interactively prompted for it; if the keyring has no password, the command works the same, just without the prompt, which makes it suitable for cron jobs. ~/.gnupg/passwd/http-auth.gpg is the encrypted HTTP auth password for this particular wget use case. This approach has many use cases. Example bash functions: function http_auth_pass() { gpg2 --decrypt ~/.gnupg/passwd/http-auth.gpg 2>/dev/null; } function decrypt_pass() { gpg2 --decrypt ~/.gnupg/passwd/"$1" 2>/dev/null; } A setup sketch follows this entry.


    1
    wget --input-file=~/downloads.txt --user="$USER" --password="$(gpg2 --decrypt ~/.gnupg/passwd/http-auth.gpg 2>/dev/null)"
    kyle0r · 2012-12-13 00:14:55 7
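
    A one-time setup sketch for the encrypted password file used above (the key ID and the example password are placeholders assumed for illustration; the helper functions are the ones from the description):

    mkdir -p ~/.gnupg/passwd
    echo -n 'my-http-password' | gpg2 --encrypt --recipient KEYID -o ~/.gnupg/passwd/http-auth.gpg
    function http_auth_pass() { gpg2 --decrypt ~/.gnupg/passwd/http-auth.gpg 2>/dev/null; }
    function decrypt_pass() { gpg2 --decrypt ~/.gnupg/passwd/"$1" 2>/dev/null; }
    wget --input-file=~/downloads.txt --user="$USER" --password="$(http_auth_pass)"
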
  • For when you have to deal with MS SharePoint, which is (rarely, let's hope) used in certain corporate environments. This uses Cntlm. For single files, just use cURL -- its NTLM authentication works quite well. The relevant /etc/cntlm.conf settings are: Username account, Domain domain, Password ############, Proxy 10.20.30.40 (IP of the SharePoint site), NoProxy *, Listen 3128. A formatted version of this config follows the entry.


    1
    http_proxy=http://127.0.0.1:3128 wget --http-user='domain\account' --http-password='###' -p -r -l 8 --no-remove-listing -P . 'http://sp.corp.com/teams/Team/Shared%20Documents/Forms/AllItems.aspx?RootFolder=%2fteams%2fTeam%2fShared%20Documents%2fFolder'
    mhs · 2012-12-26 09:03:55 8
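
    The same /etc/cntlm.conf settings from the description, written out as they would appear in the file (placeholder values, not real credentials):

    # /etc/cntlm.conf
    Username  account
    Domain    domain
    Password  ############
    # Proxy is the IP of the SharePoint site
    Proxy     10.20.30.40
    NoProxy   *
    Listen    3128
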
  • alias speedtest='wget --output-document=/dev/null http://speedtest.wdc01.softlayer.com/downloads/test500.zip'


    1
    wget --output-document=/dev/null http://speedtest.wdc01.softlayer.com/downloads/test500.zip
    opexxx · 2013-03-15 13:25:07 8
  • Need to find a Mageia Linux mirror server providing Mageia 4 via rsync? Modify the "url=" string for the version you want. This shows i586, which is the 32-bit version. If you want the 64-bit version, use: url=http://mirrors.mageia.org/api/mageia.4.x86_64.list; wget -q ${url} -O - | grep rsync:


    1
    url=http://mirrors.mageia.org/api/mageia.4.i586.list; wget -q ${url} -O - | grep rsync:
    mpb · 2013-05-20 16:19:05 8
  • First (and only) argument should be a 4chan thread URL (a usage example follows this entry).


    1
    function 4chandl () { wget -e robots=off -nvcdp -t 0 -Hkrl 0 -I \*/src/ -P . "$1"; }
    89r · 2013-07-28 11:29:53 7
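
    Usage is simply the function followed by a thread URL (the URL below is a made-up example):

    4chandl "http://boards.4chan.org/wg/res/123456.html"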
  • Returns your external IP address to the command line using only wget.


    1
    wget http://ipecho.net/plain -O - -q ; echo
    JonathanFisher · 2013-10-02 21:18:40 10
  • Download the latest NVIDIA GeForce x64 Windows 7/8 driver from NVIDIA's website. Pulls the latest download version (which includes betas). This is the "English" version. The following variant includes a 'sed' step to replace "english" with "international" if needed; you can also replace the starting subdomain with "eu.", "uk." and others. Enjoy this one-liner! 1 character under the max :) wget "us.download.nvidia.com$(wget -qO- "$(wget -qO- "nvidia.com/Download/processFind.aspx?psid=95&pfid=695&osid=19&lid=1&lang=en-us" | awk '/driverResults.aspx/ {print $4}' | cut -d "'" -f2 | head -n 1)" | awk '/url=/ {print $2}' | sed -e "s/english/international/" | cut -d '=' -f3 | cut -d '&' -f1)"


    1
    wget "us.download.nvidia.com$(wget -qO- "$(wget -qO- "nvidia.com/Download/processFind.aspx?psid=95&pfid=695&osid=19&lid=1&lang=en-us"|awk '/driverResults.aspx/ {print $4}'|cut -d "'" -f2|head -n 1)"|awk '/url=/ {print $2}'|cut -d '=' -f3|cut -d '&' -f1)"
    lowjax · 2013-11-21 03:04:59 13

  • 1
    read -p "Please enter the 4chan url: "|egrep '//i.4cdn.org/[a-z0-9]+/src/([0-9]*).(jpg|png|gif)' - -o|nl -s https:|cut -c7-|uniq|wget -nc -i - --random-wait
    unixmonkey73764 · 2014-03-09 05:56:14 7

  • 1
    wget -r -P ./dl/ -A jpg,jpeg http://captivates.com
    ferdous · 2014-06-14 17:28:32 8
  • No need to parse an HTML page; the website gives us a txt file :)


    1
    wget -qO- http://whatthecommit.com/index.txt | cowsay
    optyler · 2014-08-26 18:56:06 9
  • If the latest version has already been downloaded, it will not be downloaded again.


    1
    wget -N --content-disposition http://www.adminer.org/latest.php
    rickyok · 2014-09-12 07:52:45 9
