Commands tagged curl (212)

  • curl ifconfig.me/ip -> IP Address, curl ifconfig.me/host -> Remote Host, curl ifconfig.me/ua -> User Agent, curl ifconfig.me/port -> Port. Thanks to http://ifconfig.me/ (a small wrapper function is sketched after this entry).


    276
    curl ifconfig.me
    aajjk · 2010-04-21 13:10:33 81
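
    A minimal wrapper around the endpoints listed above, assuming ifconfig.me still serves them; the function name "myip" is only illustrative:

    myip() { curl -s "ifconfig.me/${1:-ip}"; }   # myip -> IP address, myip ua -> user agent, myip port -> port
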
  • Change Seville to your preferred city (a couple of URL options are sketched after this entry).


    48
    curl wttr.in/seville
    nordri · 2016-08-28 09:43:38 33
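
    wttr.in also accepts query options appended to the URL; assuming the ?format and ?0 options are still supported, these give a one-line report and a current-conditions-only report:

    curl 'wttr.in/Seville?format=3'
    curl 'wttr.in/Seville?0'
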
  • Checks the Gmail ATOM feed for your account, parses it and outputs a list of unread messages. For some reason sed gets stuck on OS X, so here's a Perl version for the Mac: curl -u username:password --silent "https://mail.google.com/mail/feed/atom" | tr -d '\n' | awk -F '<entry>' '{for (i=2; i<=NF; i++) {print $i}}' | perl -pe 's/^<title>(.*)<\/title>.*<name>(.*)<\/name>.*$/$2 - $1/' If you want to see the name of the last person who added a message to the conversation, change the greediness of the operators like this: curl -u username:password --silent "https://mail.google.com/mail/feed/atom" | tr -d '\n' | awk -F '<entry>' '{for (i=2; i<=NF; i++) {print $i}}' | perl -pe 's/^<title>(.*)<\/title>.*?<name>(.*?)<\/name>.*$/$2 - $1/' A variant that reports only the unread count is sketched after this entry.


    47
    curl -u username:password --silent "https://mail.google.com/mail/feed/atom" | tr -d '\n' | awk -F '<entry>' '{for (i=2; i<=NF; i++) {print $i}}' | sed -n "s/<title>\(.*\)<\/title.*name>\(.*\)<\/name>.*/\2 - \1/p"
    postrational · 2009-09-07 21:56:40 46
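
    If all you want is the number of unread messages, the feed exposes a <fullcount> element; this sketch assumes GNU grep with PCRE support, and note that Google now generally requires an app-specific password for this kind of basic authentication:

    curl -u username:password --silent "https://mail.google.com/mail/feed/atom" | grep -oP '(?<=<fullcount>)\d+'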

  • 26
    curl -Is slashdot.org | egrep '^X-(F|B|L)' | cut -d \- -f 2
    icco · 2009-03-23 19:58:10 18
  • Yeah, I know it's been up here a million times, but this service is a really clean and nice one. Nothing but your IP address on it. Actually I was about to write something like this myself, and then noticed this on appspot... ;)


    25
    curl ip.appspot.com
    ktoso · 2009-10-31 21:11:10 12
  • This function displays the latest comic from xkcd.com. One of the best things about xkcd is the title text when you hover over the comic, so this function also displays that after you close the comic. To get a random xkcd comic, I also use the following: xkcdrandom(){ wget -qO- dynamic.xkcd.com/comic/random|tee >(feh $(grep -Po '(?<=")http://imgs[^/]+/comics/[^"]+\.\w{3}'))|grep -Po '(?<=(\w{3})" title=").*(?=" alt)';} A variant for a specific comic number is sketched after this entry.


    24
    xkcd(){ wget -qO- http://xkcd.com/|tee >(feh $(grep -Po '(?<=")http://imgs[^/]+/comics/[^"]+\.\w{3}'))|grep -Po '(?<=(\w{3})" title=").*(?=" alt)';}
    eightmillion · 2009-11-27 09:11:47 22
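
    The same approach can be parameterised for a specific comic number; this is only a sketch, relying on the same page markup (and on feh) that the function above assumes:

    xkcdn(){ wget -qO- "http://xkcd.com/$1/"|tee >(feh $(grep -Po '(?<=")http://imgs[^/]+/comics/[^"]+\.\w{3}'))|grep -Po '(?<=(\w{3})" title=").*(?=" alt)';}
    # usage: xkcdn 149
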
  • This function takes a word or a phrase as arguments and then fetches definitions using Google's "define" syntax. The "nl" and perl portion isn't strictly necessary. It just makes the output a bit more readable, but this also works: define(){ local y="$@";curl -sA"Opera" "http://www.google.com/search?q=define:${y// /+}"|grep -Po '(?<=<li>)[^<]+';} If your version of grep doesn't have perl-compatible regex support, then you can use this version: define(){ local y="$@";curl -sA"Opera" "http://www.google.com/search?q=define:${y// /+}"|grep -Eo '<li>[^<]+'|sed 's/<li>//g'|nl|perl -MHTML::Entities -pe 'decode_entities($_)' 2>/dev/null;}


    18
    define(){ local y="$@";curl -sA"Opera" "http://www.google.com/search?q=define:${y// /+}"|grep -Po '(?<=<li>)[^<]+'|nl|perl -MHTML::Entities -pe 'decode_entities($_)' 2>/dev/null;}
    eightmillion · 2010-01-29 05:01:11 18
  • A bash function might also be useful: dict() { curl dict://dict.org/d:$1; } Or if you want less verbose output: dict() { curl -s dict://dict.org/d:$1 | perl -ne 's/\r//; last if /^\.$/; print if /^151/../^250/'; } Two more DICT URL forms are sketched after this entry.


    16
    curl dict://dict.org/d:something
    HorsePunchKid · 2009-04-10 18:12:37 11
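
    curl also understands the other DICT URL forms: "m:" asks the server for near matches instead of definitions, and a specific database can be named after the word (here "wn" for WordNet, assuming dict.org still offers it):

    curl dict://dict.org/m:something
    curl dict://dict.org/d:something:wn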

  • 16
    check(){ curl -sI $1 | sed -n 's/Location: *//p';}
    putnamhill · 2010-09-30 12:29:02 8
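
    A hedged alternative that lets curl follow the redirect chain itself and print the final URL; the function name is only illustrative:

    finalurl(){ curl -sIL -o /dev/null -w '%{url_effective}\n' "$1";}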

  • 15
    curl -u username -o bookmarks.xml https://api.del.icio.us/v1/posts/all
    avi4now · 2009-04-06 13:54:15 23
  • An improvement on the original (at http://www.commandlinefu.com/commands/view/2872/update-twitter-via-curl), in the sense that you see "from cURL" under your status message instead of just "from API" ;-) Twitter automatically links it to the cURL home page.


    14
    curl -u twitter-username -d status="Hello World, Twitter!" -d source="cURL" http://twitter.com/statuses/update.xml
    MyTechieself · 2009-12-08 14:54:33 8
  • Requires curl version >= 7.21; uses ~/.netrc for authorization (a sample ~/.netrc entry is sketched after this entry).


    14
    curl -n --ssl-reqd --mail-from "<user@gmail.com>" --mail-rcpt "<user@server.tld>" --url smtps://smtp.gmail.com:465 -T file.txt
    mitry · 2010-10-03 15:44:53 20
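
    The -n switch reads credentials from ~/.netrc; a matching entry would look roughly like the line below (Gmail now generally expects an app-specific password), and file.txt must itself be a complete mail message with From:, To: and Subject: headers followed by the body:

    machine smtp.gmail.com login user@gmail.com password your-app-password
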
  • Validates and pretty-prints the JSON content fetched from the URL (a jq equivalent is sketched after this entry).


    13
    curl -s "http://feeds.delicious.com/v2/json?count=5" | python -m json.tool | less -R
    keimlink · 2010-03-24 09:15:12 21
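
    If jq is installed, it can take the place of python -m json.tool (the -C flag keeps the colored output when piping into less -R):

    curl -s "http://feeds.delicious.com/v2/json?count=5" | jq -C . | less -R
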
  • cURL is not installed by default on many common distros anymore, but wget always is :) wget -qO- ifconfig.me/ip


    12
    wget -qO- icanhazip.com
    SuperJediWombat · 2010-06-24 03:49:14 8
  • They are using JSON now (a Python 3 spelling of the embedded snippet follows this entry).


    12
    curl -s http://www.census.gov/popclock/data/population/world | python -c 'import json,sys;obj=json.load(sys.stdin);print obj["world"]["population"]'
    mfr · 2013-07-27 08:00:10 65
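
    The embedded snippet is Python 2; a Python 3 spelling would be roughly:

    curl -s http://www.census.gov/popclock/data/population/world | python3 -c 'import json,sys; print(json.load(sys.stdin)["world"]["population"])'
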
  • Retrieve the current stock price from Yahoo Finance. The output is simply the latest price (which could be delayed). If you want to look up the stock for a different company, replace csco with your symbol. A multi-symbol variant is sketched after this entry.


    11
    curl -s 'http://download.finance.yahoo.com/d/quotes.csv?s=csco&f=l1'
    haivu · 2009-05-04 08:13:59 28
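
    The same CSV interface accepted several symbols at once (s = symbol, l1 = last price); Yahoo has since retired it, so treat this as historical:

    curl -s 'http://download.finance.yahoo.com/d/quotes.csv?s=csco+goog+aapl&f=sl1'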

  • 11
    curl ifconfig.me
    dpoblador · 2010-10-09 08:12:26 5

  • 11
    curl -I http://localhost
    mniskin · 2011-01-02 14:19:30 11
  • Use `zless` to read the content of your *rss.gz file: zless commandlinefu-contribs-backup-2009-08-10-07.40.39.rss.gz


    10
    curl http://www.commandlinefu.com/commands/by/<your username>/rss|gzip ->commandlinefu-contribs-backup-$(date +%Y-%m-%d-%H.%M.%S).rss.gz
    linuxrawkstar · 2009-08-10 12:43:33 38
  • This will log your internet download speed. You can run gnuplot -persist <(echo "plot 'bps' with lines") to get a graph of it. A timestamped variant is sketched after this entry.


    10
    echo $(date +%s) > start-time; URL=http://www.google.com; while true; do echo $(curl -L --w %{speed_download} -o/dev/null -s $URL) >> bps; sleep 10; done &
    matthewbauer · 2009-09-19 21:26:06 13
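
    A variant that records a timestamp with each sample, so the x axis can be plotted as real time (same assumptions as above):

    while true; do echo "$(date +%s) $(curl -L -w %{speed_download} -o /dev/null -s http://www.google.com)" >> bps; sleep 10; done &
    gnuplot -persist <(echo "plot 'bps' using 1:2 with lines")
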
  • Not my script; it belongs to matthewbauer and is used without his permission. This script gives a single line as shown in the sample output. NOTE: I have blanked out the IP address for obvious security reasons, but you will get your own IP if you run the script. Tested working in bash.


    10
    curl -s "http://www.geody.com/geoip.php?ip=$(curl -s icanhazip.com)" | sed '/^IP:/!d;s/<[^>][^>]*>//g'
    getkaizer · 2009-11-04 07:15:02 11

  • 9
    curl -s 'http://checkip.dyndns.org' | sed 's/.*Current IP Address: \([0-9\.]*\).*/\1/g'
    kulor · 2009-08-06 11:54:31 10
  • Gets all kinds of info; ifconfig.me rocks ... for just the IP address you can use ifconfig.me or ifconfig.me/ip


    9
    curl ifconfig.me/all
    pykler · 2012-05-16 18:22:28 8

  • 8
    urls=('www.ubuntu.com' 'google.com'); for i in ${urls[@]}; do http_code=$(curl -I -s $i -w %{http_code}); echo $i status: ${http_code:9:3}; done
    Code_Bleu · 2009-08-17 16:24:26 5
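
    A sketch of the same loop that asks curl for the status code directly instead of slicing the header text:

    urls=('www.ubuntu.com' 'google.com'); for i in "${urls[@]}"; do echo "$i status: $(curl -s -o /dev/null -w '%{http_code}' "$i")"; done
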
  • This function uploads images to http://omploader.org and then prints out the links to the file. Some coloring can also be added to the command with: ompload() { curl -F file1=@"$1" http://omploader.org/upload|awk '/Info:|File:|Thumbnail:|BBCode:/{gsub(/<[^<]*?\/?>/,"");$1=$1;sub(/^/,"\033[0;34m");sub(/:/,"\033[0m:");print}';}


    8
    ompload() { curl -# -F file1=@"$1" http://ompldr.org/upload|awk '/Info:|File:|Thumbnail:|BBCode:/{gsub(/<[^<]*?\/?>/,"");$1=$1;print}';}
    eightmillion · 2009-11-07 20:56:52 5