Commands tagged curl (194)

  • Shorter regex.

    shout() { curl -s "${1}" | sed -n "/<h1>/s/.*href=\"\([^\"]*\)\".*/\1/p" ;}
    dabom · 2010-10-05 19:15:50 1
  • Just add this function to your .zshrc / .bashrc; by typing "shout *URL*" you get a randomly chosen English word that the service uses as the short form of your URL. You can then go to *output_word* and be redirected. The URL stays valid for 5 minutes. (I've never used sed before, so I'd be glad if someone could straighten out the sed commands and combine them, perhaps also removing the whitespace. If so, I'll update it right away ;) )

    shout () { curl -s "$1" | sed -n 's/\<h1\>/\&/p' | sed 's/<[^>]*>//g;/</N;//b' ;}
    elfreak · 2010-10-04 23:50:54 0
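Both shout variants boil down to the same sed idea: print only the href captured from the `<h1>` line. A minimal sketch against a canned response (the HTML below is made up; the real shortener's markup may differ):

```shell
# Hypothetical shortener response: the short link lives inside an <h1>.
html='<h1><a href="http://example.com/abc">abc</a></h1>'

# -n suppresses default output; the /<h1>/ address restricts the
# substitution to lines containing <h1>, and \1 keeps only the href value.
printf '%s\n' "$html" | sed -n '/<h1>/s/.*href="\([^"]*\)".*/\1/p'
# prints http://example.com/abc
```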
  • Requires curl version >= 7.21; uses ~/.netrc for authorization

    curl -n --ssl-reqd --mail-from "<>" --mail-rcpt "<user@server.tld>" --url smtps:// -T file.txt
    mitry · 2010-10-03 15:44:53 16
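The `-n` flag makes curl look up credentials in `~/.netrc` instead of taking them on the command line. A sketch of the entry format (hostname and credentials here are placeholders, and the example writes to a local file rather than your real ~/.netrc):

```shell
# Sketch of a ~/.netrc entry that `curl -n` would consult for SMTPS auth.
# smtp.example.com / user@example.com / s3cret are placeholders.
cat > netrc.example <<'EOF'
machine smtp.example.com
login user@example.com
password s3cret
EOF
# Credential files should not be world-readable:
chmod 600 netrc.example
```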
  • Shorter and made into a function.

    googl () { curl -s -d "url=${1}" | sed -n "s/.*:\"\([^\"]*\).*/\1\n/p" ;}
    dabom · 2010-10-03 02:52:44 0
  • Use curl and sed to shorten a URL without any other API.

    curl -s -d'&url=URL' | sed -e 's/{"short_url":"//' -e 's/","added_to_history":false}/\n/'
    Soubsoub · 2010-10-01 23:20:08 2
  • Each file in the current folder is uploaded to ImageShack. If the folder contains other filetypes, change `for files in *` to `for files in *.jpg` (to upload ONLY .jpg files). Additionally you can try (results may vary): `for files in *.jpg *.png`. The output URL is wrapped in BB image tags for use in a forum.

    imageshack() { for files in *; do curl -H Expect: -F fileupload="@$files" -F xml=yes -# "" | grep image_link | sed -e 's/<image_link>/[IMG]/g' -e 's/<\/image_link>/[\/IMG]/g'; done; }
    operatinghazard · 2010-10-01 06:50:04 0
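The sed stage of the imageshack function just rewrites ImageShack's `<image_link>` element into BB tags. A sketch against a canned line (the XML here is made up):

```shell
# Hypothetical <image_link> element from the upload response.
line='<image_link>http://img.example.com/pic.jpg</image_link>'

# Two substitutions: opening tag -> [IMG], closing tag -> [/IMG].
printf '%s\n' "$line" |
  sed -e 's/<image_link>/[IMG]/g' -e 's/<\/image_link>/[\/IMG]/g'
# prints [IMG]http://img.example.com/pic.jpg[/IMG]
```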

  • check(){ curl -sI $1 | sed -n 's/Location: *//p';}
    putnamhill · 2010-09-30 12:29:02 1
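`curl -sI` fetches only the response headers, and the sed keeps just the Location value. The same extraction against a simulated header block (no live request; note that real HTTP headers end in `\r`, which a hardened version would also strip):

```shell
# Simulated redirect response headers.
headers='HTTP/1.1 301 Moved Permanently
Location: http://example.com/final
Content-Length: 0'

# Print only lines where "Location: " was substituted away.
printf '%s\n' "$headers" | sed -n 's/Location: *//p'
# prints http://example.com/final
```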
  • We can put this inside a function: fxray() { curl -s "$1" | grep -o '<title>.*</title>' | sed 's/<title>.*--> \(.*\)<\/title>/\1/g'; }; fxray

    curl -s | grep -o '<title>.*</title>' | sed 's/<title>.*--> \(.*\)<\/title>/\1/g'
    karpoke · 2010-09-30 10:25:18 1
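The grep/sed stages can be tried against a canned `<title>`; the `--> ` marker mimics the page this one-liner was written for (the sample text is made up):

```shell
# Made-up page head with the "--> " marker before the interesting part.
html='<head><title>prefix --> actual title</title></head>'

# grep -o isolates the <title> element; sed keeps the text after "--> ".
printf '%s\n' "$html" |
  grep -o '<title>.*</title>' |
  sed 's/<title>.*--> \(.*\)<\/title>/\1/g'
# prints: actual title
```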

  • eog `curl -s | sed -n 's/<h3>Image URL.*: \(.*\)<\/h3>/\1/p'`
    bluesman · 2010-08-31 13:23:21 0
  • KISS. To get a random xkcd comic: xdg-open

    unixmonkey8024 · 2010-08-25 19:14:11 0
  • This function displays the latest xkcd comic. One of the best things about xkcd is the title text when you hover over the comic, so this function also displays that after you close the comic. To get a random comic, use the following: xkcdrandom() { wget -qO- | sed -n 's#^<img src="\(http://imgs.[^"]\+\)"\s\+title="\(.\+\?\)"\salt.\+$#eog "\1"\necho '"'\2'#p" | bash; } These are just a bit shorter than the ones eigthmillion wrote; however, his version didn't work as expected on my laptop for some reason (I got the title tag first), so these build a command which is then executed by bash.

    xkcd() { wget -qO- | sed -n 's#^<img src="\(http://imgs.[^"]\+\)"\s\+title="\(.\+\?\)"\salt.\+$#eog "\1"\necho '"'\2'#p" | bash ; }
    John_W · 2010-08-25 15:44:31 3
  • Shorter version with curl and awk

    eog `curl '' | awk -F "ng): |</h" '/embedding/{print $2}'`
    dog · 2010-08-25 14:04:30 0
  • Output the html from xkcd's index.html, filter out the html tags, and then view it in gwenview.

    gwenview `wget -O - | grep 'png' | grep '<img src="' | sed s/title=\".*//g | sed 's/.png\"/.png/g' | sed 's/<img src=\"//g'`
    hunterm · 2010-08-24 22:21:51 1
  • Splitting on tags in awk is a handy way to parse html.

    curl -sL '' | awk -F'</?[^>]+>' '/"command"/{print $2}'
    putnamhill · 2010-08-13 11:42:42 0
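The trick above sets awk's field separator to a regex matching "any HTML tag", so the text between tags lands in plain fields. A sketch against a made-up line (the real page's markup, and therefore the right field number, may differ):

```shell
# Made-up line: text before the first tag becomes $1, the tagged text $2.
line='command: <b>ls -la</b> votes: 5'

# -F takes a regex: "</?[^>]+>" matches opening and closing tags alike.
printf '%s\n' "$line" | awk -F'</?[^>]+>' '/command/{print $2}'
# prints ls -la
```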
  • Gets the latest show from your favorite podcast. Uses curl and xmlstarlet. Make sure you change out the items between brackets.

    curl -L -s `curl -s [] | xmlstarlet sel -t -m "//enclosure[1]" -v "@url" -n | head -n 1` | ssh -t [user]@[host] "mpg123 -"
    denzuko · 2010-07-31 00:17:47 0
  • Ever wanted to stream your favorite podcast across the network? Well, now you can. This command parses the iTunes-enabled podcast feed and streams the latest episode across the network through ssh encryption.

    curl -L -s `curl -s | xmlstarlet sel -t -m "//enclosure[1]" -v "@url" -n | head -n 1` | ssh -t [user]@[host] "mpg123 -"
    denzuko · 2010-07-30 23:20:50 0
  • Aside from curl, one will need the iconv Windows binary, since Windows lacks a native UTF-8 CLI interface. In my case I need a proxy in China, and iconv to convert the GBK status string into UTF-8. GnuWin32 is a good choice, with loads of coreutils natively ported to Windows. `FOR /f` is the solution for passing iconv output to curl.

    FOR /f %%g in ('echo %1 ^| iconv -f gbk -t utf-8') DO curl -x proxy:port -u user:pass -d status=%%g -d source="cURL"
    MeaCulpa · 2010-07-21 04:53:54 0
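The iconv leg of that pipeline can be exercised on a Unix box by round-tripping a UTF-8 string through GBK and back (this assumes an iconv build with GBK tables, which glibc's normally has):

```shell
# UTF-8 -> GBK -> UTF-8 round trip; the string survives unchanged
# as long as every character has a GBK encoding.
msg='hello, 世界'
printf '%s' "$msg" | iconv -f utf-8 -t gbk | iconv -f gbk -t utf-8
# prints hello, 世界
```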
  • Query the Socrata Open Data API used by the White House to find any employee's salary using curl, grep and awk. Change the value of the search parameter (the example uses Axelrod) to the name of any White House staffer to see their annual salary.

    curl -s "" | grep "data\" :" | awk '{ print $17 }'
    mheadd · 2010-07-01 23:54:54 2
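The `awk '{ print $17 }'` stage relies on awk's default behavior of splitting each line on runs of whitespace: `$17` simply means "the 17th token", so it only works while the JSON line keeps a fixed shape. A toy illustration of the same positional extraction (the input line is made up):

```shell
# Default FS splits on whitespace; $3 is just the third token.
printf '%s\n' 'alpha beta gamma delta' | awk '{ print $3 }'
# prints gamma
```

For anything less rigid than a fixed-format line, a real JSON parser is the safer choice.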
  • curl is not installed by default on many common distros anymore. wget always is :)

    wget -qO-
    SuperJediWombat · 2010-06-24 03:49:14 1
  • If curl isn't available, use lynx.

    lynx --dump
    111110100 · 2010-06-16 11:52:50 0
  • With a lolcat favicon if you access it from your browser.

    pykler · 2010-06-14 18:47:11 0

  • curl -s "|en&v=1.0&q=`xsel`" |cut -d \" -f 6
    eneko · 2010-06-11 21:38:26 0
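With `"` as the delimiter, cut turns a one-line JSON response into fields where every even-numbered field is the text between quotes; counting fields tells you which one to keep. A sketch against a made-up translate-style response (the real API's field position may differ):

```shell
# Made-up response: fields split on " are
# 1:{  2:responseData  3:": {"  4:translatedText  5:":"  6:hola  ...
resp='{"responseData": {"translatedText":"hola"}, "status":200}'
printf '%s\n' "$resp" | cut -d \" -f 6
# prints hola
```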
  • Instead of having someone else read you the Digg headlines, have OS X do it. Requires curl + sed + say. This could probably be easily modified to use espeak on Linux.

    IFS=`echo -en "\n\b"`; for i in $(curl | grep '<title>' | sed -e 's#<[^>]*>##g' | tail -n10); do echo $i; echo $i | sed 's/^/Did you hear about /g' | say; sleep 30; done
    echosedawk · 2010-06-07 22:16:19 1
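The `IFS=` assignment is what makes the for loop iterate over whole headlines instead of individual words: setting IFS to newline (the trailing backspace keeps command substitution from stripping the newline away) changes how the unquoted `$(...)` is split. The splitting behavior in isolation:

```shell
# Save IFS, set it to newline+backspace, split two lines, restore it.
old_ifs=$IFS
IFS=$(printf '\n\b')
count=0
for line in $(printf 'first headline\nsecond headline\n'); do
  # Each $line is now a whole line, spaces and all.
  count=$((count + 1))
done
IFS=$old_ifs
echo "$count"
# prints 2
```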
  • In this example we search for 'vim', but vim doesn't have a project on GitHub right now. That's OK; this command still searches every project that has 'vim' in its description (forks, plugins, etc.). To get XML or JSON output, just replace 'yaml' in the URL with 'xml' or 'json'.

    rkulla · 2010-05-30 00:29:03 0
  • In this example 'git' is the user name and the output format is YAML, but you can change this to XML or JSON, e.g.: curl

    rkulla · 2010-05-30 00:18:00 0
