Commands using wget (286)

  • This performs the actual Google Dictionary query and returns a JSON string wrapped in a JSONP callback tag. The first sed, 's/dict_api\.callbacks.id100.//', removes the beginning of the tag, and the second, 's/,200,null)//', removes its end. The output may also contain special characters that trip up some JSON parsers; if you get errors, this is probably the cause (sed is your friend). I also like to trim the "webDefinitions" part, because it (sometimes) contains misleading information: sed 's/\,\"webDefinitions.*//' (but remember to append a "}" at the end, or the JSON string will be invalid). The output also contains links to MP3 files with pronunciation. As of now this is only usable for English; if you choose a language other than English, you will only get webDefinitions (which are crap).


    wget -qO - "http://www.google.com/dictionary/json?callback=dict_api.callbacks.id100&q=steering+wheel&sl=en&tl=en&restrict=pr,de&client=te" | sed 's/dict_api\.callbacks.id100.//' | sed 's/,200,null)//'
    sairon · 2011-03-08 15:00:39 16
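The JSONP-stripping step above can be isolated. A minimal sketch, assuming a sample response in place of the live API (which has since been discontinued); the callback name matches the command above:

```shell
# Hedged sketch: strip JSONP padding so the payload parses as plain JSON.
# The sed idiom is general; only the callback name is API-specific.
strip_jsonp() {
  sed -e 's/^dict_api\.callbacks\.id100(//' -e 's/,200,null)$//'
}

# Sample input standing in for the live response:
printf '%s\n' 'dict_api.callbacks.id100({"query":"steering wheel"},200,null)' | strip_jsonp
```

This prints the bare JSON object, ready for a JSON parser.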
  • Replace URL with the link you want to check; the command reports whether the remote file exists.


    wget -O/dev/null -q URLtoCheck && echo exists || echo not exist
    xeonproject · 2011-04-07 20:55:33 3
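The check above can be wrapped in a reusable helper. A sketch, assuming wget's --spider mode (which checks the URL without downloading the body); url_exists and report are hypothetical names:

```shell
# Hedged sketch: existence check as a function.
url_exists() {
  wget -q --spider "$1" && echo "exists" || echo "not exist"
}

# The &&/|| idiom generalizes to any command's exit status:
report() { "$@" >/dev/null 2>&1 && echo "exists" || echo "not exist"; }

report true    # prints: exists
report false   # prints: not exist
```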
  • This example fetches the 'example.com' webpage, then fetches and saves every PDF file linked from it. (Note: there are of course no PDFs on example.com; it is just an example.)


    curl -s http://example.com | grep -o -P "<a.*href.*>" | grep -o "http.*.pdf" | xargs -d"\n" -n1 wget -c
    b_t · 2011-06-09 14:42:46 5
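One caveat with the pattern above: the greedy "http.*.pdf" can swallow everything between the first "http" and the last ".pdf" on a line, merging two links into one match. Restricting the match to non-quote characters is safer. A sketch, with sample HTML standing in for the fetched page:

```shell
# Hedged sketch: non-greedy-ish link extraction via a character class.
html='<a href="http://example.com/a.pdf">A</a> <a href="http://example.com/b.pdf">B</a>'

# [^"]* stops at the closing quote, so each href is matched separately:
printf '%s\n' "$html" | grep -o 'http[^"]*\.pdf'
```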

  • NAME=`wget --quiet URL -O - | grep util-vserver | tail -n 1 | sed 's|</a>.*||;s/.*>//'`; wget "URL$NAME";
    WMP · 2011-07-17 13:01:20 4
  • --mirror : turn on options suitable for mirroring. -p : download all files that are necessary to properly display a given HTML page. --convert-links : after the download, convert the links in the documents for local viewing. -P ./LOCAL-DIR : save all the files and directories to the specified directory.


    wget --mirror -p --convert-links -P ./<LOCAL-DIR> <WEBSITE-URL>
    tkembo · 2011-08-18 08:27:28 6
  • Look up an MD5 hash (MD5 cannot actually be decrypted; this queries a reverse-lookup table). Replace 1cb251ec0d568de6a929b520c4aed8d1 with the MD5 string you want to look up.


    wget -qO - --post-data "data[Row][cripted]=1cb251ec0d568de6a929b520c4aed8d1" http://md5-decrypter.com/ | grep -A1 "Decrypted text" | tail -n1 | cut -d '"' -f3 | sed 's/>//g; s/<\/b//g'
    samhagin · 2011-10-13 03:48:54 3

  • wget -A mp3,mpg,mpeg,avi -r -l 3 http://www.site.com/
    kev · 2011-11-09 10:06:07 3
  • Make your own MP3s from YouTube videos.


    url="put_url_here"; audio=$(youtube-dl -s -e "$url"); wget -q -O - "$(youtube-dl -g "$url")" | ffmpeg -i - -f mp3 -vn -acodec libmp3lame - > "$audio.mp3"
    o0110o · 2011-11-15 19:09:52 4

  • wget -O chart.png 'http://chart.googleapis.com/chart?chs=250x100&chd=t:60,40&cht=p3&chl=Hello|World'
    kev · 2011-12-10 18:03:16 59
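The chart URL above packs its parameters (size, data, chart type, labels) into the query string, and characters like spaces and '|' are not strictly URL-safe. A minimal pure-bash percent-encoder, as a sketch (urlencode is a hypothetical helper name; the Image Charts API itself has since been deprecated by Google):

```shell
# Hedged sketch: percent-encode a query-string value in pure bash.
urlencode() {
  local s=$1 out='' c i
  for (( i = 0; i < ${#s}; i++ )); do
    c=${s:i:1}
    case $c in
      [a-zA-Z0-9.~_-]) out+=$c ;;                  # unreserved chars pass through
      *) printf -v c '%%%02X' "'$c"; out+=$c ;;    # everything else -> %XX
    esac
  done
  printf '%s\n' "$out"
}

urlencode 'Hello|World'
# → Hello%7CWorld
```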

  • wget -nd -r -l 2 -A jpg,jpeg,png,gif http://website-url.com
    unixmonkey26318 · 2012-01-27 11:06:50 8
  • A simple script to download all the MegaTokyo strips, from the first to the last one.


    for i in $(seq 1 `curl http://megatokyo.com 2>/dev/null|grep current|cut -f6 -d\"`);do wget http://megatokyo.com/`curl http://megatokyo.com/strip/${i} 2>/dev/null|grep src=\"strips\/|cut -f4 -d\"`;done
    akira88 · 2012-03-04 22:52:36 9
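The key parsing step in the loop above uses cut with a double-quote delimiter to pull the src attribute out of the <img> tag. A sketch of just that step, with a sample line standing in for the fetched page (the live page needs -f4 because its markup puts the value in a later quoted field):

```shell
# Hedged sketch: split on '"' and take the field holding the src value.
line='<img src="strips/0001.gif" alt="strip">'
printf '%s\n' "$line" | cut -f2 -d'"'
# → strips/0001.gif
```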

  • wget -m -k -K -E http://url/of/web/site
    joedistro · 2012-03-19 20:22:05 3

  • cat urls.txt | wget -i- -T 10 -t 3 --waitretry 1
    kev · 2012-05-14 06:41:14 9

  • while pgrep wget || sudo shutdown -P now; do sleep 1m; done
    kev · 2012-05-20 17:49:56 7
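The loop above sleeps as long as pgrep finds a wget process; the first time pgrep fails, the || branch fires shutdown. The same wait-then-act shape, sketched against a single PID instead of a process name (kill -0 only tests that the process exists, it sends no signal; wait_for_pid is a hypothetical name):

```shell
# Hedged sketch: block until a given PID exits, then act.
wait_for_pid() {
  while kill -0 "$1" 2>/dev/null; do
    sleep 0.2
  done
}

sleep 0.5 &                 # stand-in for a long-running download
wait_for_pid "$!"
echo "downloads finished"   # here the original runs: sudo shutdown -P now
```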
  • In this example, where the user's gpg keyring has a password, the user will be interactively prompted for the keyring password; if the keyring has no password, the same command works without the prompt, making it suitable for cron jobs. ~/.gnupg/passwd/http-auth.gpg is the encrypted HTTP auth password for this particular wget use case. This approach has many use cases. Example bash functions: function http_auth_pass() { gpg2 --decrypt ~/.gnupg/passwd/http-auth.gpg 2>/dev/null; } and function decrypt_pass() { gpg2 --decrypt ~/.gnupg/passwd/"$1" 2>/dev/null; }


    wget --input-file=~/downloads.txt --user="$USER" --password="$(gpg2 --decrypt ~/.gnupg/passwd/http-auth.gpg 2>/dev/null)"
    kyle0r · 2012-12-13 00:14:55 6
  • If you have to deal with MS SharePoint, which is (rarely, let's hope) used in e.g. certain corporate environments: this uses Cntlm. For single files, just use cURL -- its NTLM authentication works quite well. Example /etc/cntlm.conf:
    Username account
    Domain domain
    Password ############
    Proxy 10.20.30.40 (IP of the SharePoint site)
    NoProxy *
    Listen 3128


    http_proxy=http://127.0.0.1:3128 wget --http-user='domain\account' --http-password='###' -p -r -l 8 --no-remove-listing -P . 'http://sp.corp.com/teams/Team/Shared%20Documents/Forms/AllItems.aspx?RootFolder=%2fteams%2fTeam%2fShared%20Documents%2fFolder'
    mhs · 2012-12-26 09:03:55 4
  • alias speedtest='wget --output-document=/dev/null http://speedtest.wdc01.softlayer.com/downloads/test500.zip'


    wget --output-document=/dev/null http://speedtest.wdc01.softlayer.com/downloads/test500.zip
    opexxx · 2013-03-15 13:25:07 6
  • Need to find a Mageia Linux mirror server providing Mageia 4 via rsync? Modify the "url=" string for the version you want. This example shows i586, the 32-bit version. For the 64-bit version use: url=http://mirrors.mageia.org/api/mageia.4.x86_64.list; wget -q ${url} -O - | grep rsync:


    url=http://mirrors.mageia.org/api/mageia.4.i586.list; wget -q ${url} -O - | grep rsync:
    mpb · 2013-05-20 16:19:05 7
  • First (and only) argument should be a 4chan thread URL.


    function 4chandl () { wget -e robots=off -nvcdp -t 0 -Hkrl 0 -I \*/src/ -P . "$1"; }
    89r · 2013-07-28 11:29:53 6
  • Returns your external IP address to the command line using only wget.


    wget http://ipecho.net/plain -O - -q ; echo
    JonathanFisher · 2013-10-02 21:18:40 9
  • Download the latest NVIDIA GeForce x64 Windows 7/8 driver from NVIDIA's website. Pulls the latest download version (which includes betas). This is the "English" version; the following variant adds a sed step to replace "english" with "international" if needed. You can also replace the leading subdomain with "eu.", "uk." and others. Enjoy this one-liner! One character under the max :) wget "us.download.nvidia.com$(wget -qO- "$(wget -qO- "nvidia.com/Download/processFind.aspx?psid=95&pfid=695&osid=19&lid=1&lang=en-us" | awk '/driverResults.aspx/ {print $4}' | cut -d "'" -f2 | head -n 1)" | awk '/url=/ {print $2}' | sed -e "s/english/international/" | cut -d '=' -f3 | cut -d '&' -f1)"


    wget "us.download.nvidia.com$(wget -qO- "$(wget -qO- "nvidia.com/Download/processFind.aspx?psid=95&pfid=695&osid=19&lid=1&lang=en-us"|awk '/driverResults.aspx/ {print $4}'|cut -d "'" -f2|head -n 1)"|awk '/url=/ {print $2}'|cut -d '=' -f3|cut -d '&' -f1)"
    lowjax · 2013-11-21 03:04:59 11
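The final parsing step above uses cut twice: first splitting on '=' to isolate the redirect target, then on '&' to drop trailing parameters. A sketch of that step alone, with sample input standing in for the fetched page (the sample is shaped so that -f3 lands on the URL, as in the original):

```shell
# Hedged sketch: isolate a URL embedded in a meta-refresh-style line.
line='content="0;url=http://us.download.nvidia.com/drv.exe&lang=us"'
printf '%s\n' "$line" | cut -d '=' -f3 | cut -d '&' -f1
# → http://us.download.nvidia.com/drv.exe
```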

  • read -p "Please enter the 4chan url: " url; wget -qO- "$url" | egrep -o '//i.4cdn.org/[a-z0-9]+/src/([0-9]*).(jpg|png|gif)' | sed 's|^|https:|' | uniq | wget -nc -i - --random-wait
    unixmonkey73764 · 2014-03-09 05:56:14 6
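The extraction step in the entry above can be tested on its own: the regex pulls protocol-relative image links out of page text, and a scheme must be prepended before wget can fetch them. A sketch on sample text rather than interactive input (sed replaces the original's nl/cut trick for prepending https:):

```shell
# Hedged sketch: extract image URLs and prepend the missing scheme.
text='see //i.4cdn.org/g/src/1234567.jpg and //i.4cdn.org/g/src/89.png'
printf '%s\n' "$text" \
  | grep -Eo '//i\.4cdn\.org/[a-z0-9]+/src/[0-9]+\.(jpg|png|gif)' \
  | sed 's|^|https:|'
```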

  • wget -r -P ./dl/ -A jpg,jpeg http://captivates.com
    ferdous · 2014-06-14 17:28:32 7
  • No need to parse an HTML page; the website gives us a txt file :)


    wget -qO- http://whatthecommit.com/index.txt | cowsay
    optyler · 2014-08-26 18:56:06 8
  • If the latest version has already been downloaded, it will not be downloaded again.


    wget -N --content-disposition http://www.adminer.org/latest.php
    rickyok · 2014-09-12 07:52:45 8

What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.
