Commands matching curl (616)


  • 14
    curl -s "https://api.github.com/users/<username>/repos?per_page=1000" | grep git_url | awk '{print $2}' | sed 's/"\(.*\)",/\1/'
    wuseman1 · 2019-11-19 20:31:19 261
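
    A jq-based variant, assuming jq is installed; note that the GitHub API caps per_page at 100, so larger accounts need the page parameter as well:

    # List clone URLs for a user's public repos via jq
    curl -s "https://api.github.com/users/<username>/repos?per_page=100&page=1" | jq -r '.[].git_url'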

  • 13
    URL="http://www.google.com";curl -L --w "$URL\nDNS %{time_namelookup}s conn %{time_connect}s time %{time_total}s\nSpeed %{speed_download}bps Size %{size_download}bytes\n" -o/dev/null -s $URL
    adminix · 2009-08-28 12:30:56 8
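
    A more readable multi-line variant of the same timing probe, assuming a curl new enough to support these --write-out variables (they have been available for a long time):

    # Time the phases of a request; the body is discarded
    URL="http://www.google.com"
    curl -L -s -o /dev/null "$URL" -w "DNS %{time_namelookup}s\nconnect %{time_connect}s\nfirst byte %{time_starttransfer}s\ntotal %{time_total}s\nspeed %{speed_download} bytes/s, size %{size_download} bytes\n"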
  • Validates and pretty-prints the content fetched from the URL.


    13
    curl -s "http://feeds.delicious.com/v2/json?count=5" | python -m json.tool | less -R
    keimlink · 2010-03-24 09:15:12 21
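
    The Delicious feed itself is long gone, but the pretty-printing trick works against any JSON endpoint; a sketch using a GitHub API URL as a stand-in (jq, if installed, does the same job):

    # Pretty-print JSON from any endpoint
    curl -s "https://api.github.com/repos/curl/curl" | python -m json.tool | less -R
    # or, with jq
    curl -s "https://api.github.com/repos/curl/curl" | jq . | less -R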
  • Useful script to back up all your Delicious bookmarks. With Delicious shutting down soon, it could be useful.


    13
    curl --user login:password -o DeliciousBookmarks.xml -O 'https://api.del.icio.us/v1/posts/all'
    nco · 2010-12-17 03:35:09 11
  • Put it in your ~/.bashrc. Usage: google word1 word2 word3... or google '"this search gets quoted"' (a usage sketch follows this entry).


    13
    function google { Q="$@"; GOOG_URL='https://www.google.de/search?tbs=li:1&q='; AGENT="Mozilla/4.0"; stream=$(curl -A "$AGENT" -skLm 10 "${GOOG_URL}${Q//\ /+}" | grep -oP '\/url\?q=.+?&amp' | sed 's|/url?q=||; s|&amp||'); echo -e "${stream//\%/\x}"; }
    michelsberg · 2013-04-05 08:04:15 9
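
    Typical usage once the function is defined in ~/.bashrc (a sketch; the results are printed to the terminal as plain URLs):

    # Reload the shell config, then search; quote a phrase to search it verbatim
    source ~/.bashrc
    google linux kernel tuning
    google '"this search gets quoted"'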
  • I tried it out on my Mac; jot generates the sequence (0, 25, 50, ...). On Linux you can use 'seq' to generate the numbers instead (a seq variant is sketched after this entry). Needs curl installed on the machine, then it rocks. @Satya


    12
    for x in `jot - 0 2400 25`; do curl "http://www.commandlinefu.com/commands/browse/sort-by-votes/plaintext/$x" ; done > commandlinefu.txt
    satyavvd · 2009-07-23 12:04:02 23
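
    The seq-based variant mentioned above, for Linux systems without jot (a sketch; the plaintext pages still step by 25):

    # Same crawl with seq instead of jot
    for x in $(seq 0 25 2400); do curl -s "http://www.commandlinefu.com/commands/browse/sort-by-votes/plaintext/$x"; done > commandlinefu.txt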
  • curl is not installed by default on many common distros anymore, but wget always is :) wget -qO- ifconfig.me/ip


    12
    wget -qO- icanhazip.com
    SuperJediWombat · 2010-06-24 03:49:14 8
  • They are using JSON now (a Python 3 variant is sketched after this entry).


    12
    curl -s http://www.census.gov/popclock/data/population/world | python -c 'import json,sys;obj=json.load(sys.stdin);print obj["world"]["population"]'
    mfr · 2013-07-27 08:00:10 65
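
    The command above uses Python 2 print syntax; a Python 3 variant, assuming the endpoint still returns the same JSON shape:

    # Extract the world population figure with Python 3
    curl -s http://www.census.gov/popclock/data/population/world | python3 -c 'import json,sys; print(json.load(sys.stdin)["world"]["population"])'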
  • Retrieves the current stock price from Yahoo Finance. The output is simply the latest price (which could be delayed). To look up a different company, replace csco with its symbol (see the example after this entry).


    11
    curl -s 'http://download.finance.yahoo.com/d/quotes.csv?s=csco&f=l1'
    haivu · 2009-05-04 08:13:59 28
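
    Example of swapping in a different ticker symbol, as described above (this Yahoo CSV endpoint has since been retired, so treat it as historical):

    # Latest quote for AAPL instead of CSCO
    curl -s 'http://download.finance.yahoo.com/d/quotes.csv?s=aapl&f=l1'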
  • Doesn't require the password on the command line (curl asks for it instead).


    11
    curl -u user -d status="Tweeting from the shell" http://twitter.com/statuses/update.xml
    matthewbauer · 2009-08-05 02:24:01 14
  • The original doesn't work for me, but this does. I'm guessing YouTube updated the video page, which broke the original.


    11
    id="dMH0bHeiRNg";mplayer -fs http://youtube.com/get_video.php?video_id=$id\&t=$(curl -s http://www.youtube.com/watch?v=$id | sed -n 's/.*, "t": "\([^"]*\)", .*/\1/p')
    matthewbauer · 2009-08-13 14:16:01 7

  • 11
    curl -I http://localhost
    mniskin · 2011-01-02 14:19:30 11

  • 11
    curl ifconfig.me
    dpoblador · 2010-10-09 08:12:26 5
  • The key here is the anonymous one; it is good for 50 posts an hour with a maximum number of uploads per day, so it will probably run out. If that happens, you can get a free key at the site (usage examples follow this entry).


    11
    imgur(){ $*|convert label:@- png:-|curl -F "image=@-" -F "key=1913b4ac473c692372d108209958fd15" http://api.imgur.com/2/upload.xml|grep -Eo "<original>(.)*</original>" | grep -Eo "http://i.imgur.com/[^<]*";}
    dzup · 2011-09-23 05:42:58 14
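
    Example usage, assuming ImageMagick's convert is installed and the function above has been sourced (the anonymous key and the v2 imgur API may no longer be accepted):

    # Render a command's output as an image, upload it, and print the i.imgur.com link
    imgur ls -la
    imgur df -h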
  • Pump up the chatter: run this script on a regular basis to listen to your Twitter timeline. This is a rough first cut using several CLI clips I have spotted around. There is no facility to skip things already read to you. This could also easily be put in a loop for a timed onslaught from the chatterverse, though I think it might violate several points of the Geneva Convention. UPDATE: added a loop that only reads the first 6 tweets and does this every 5 minutes.


    10
    while [ 1 ]; do curl -s -u username:password http://twitter.com/statuses/friends_timeline.rss|grep title|sed -ne 's/<\/*title>//gp' | head -n 6 |festival --tts; sleep 300;done
    tomwsmf · 2009-02-20 20:20:21 17
  • Use `zless` to read the content of your *rss.gz file: zless commandlinefu-contribs-backup-2009-08-10-07.40.39.rss.gz


    10
    curl "http://www.commandlinefu.com/commands/by/<your username>/rss" | gzip - > commandlinefu-contribs-backup-$(date +%Y-%m-%d-%H.%M.%S).rss.gz
    linuxrawkstar · 2009-08-10 12:43:33 38

  • 10
    curl http://www.whatismyip.org/
    timothybeeler · 2009-10-26 17:40:56 9
  • Not my script; it belongs to matthewbauer and is used without his permission. This script gives a single line as shown in the sample output. NOTE: I have blanked out the IP address for obvious security reasons, but you will get your own IP if you run the script. Tested working in bash.


    10
    curl -s "http://www.geody.com/geoip.php?ip=$(curl -s icanhazip.com)" | sed '/^IP:/!d;s/<[^>][^>]*>//g'
    getkaizer · 2009-11-04 07:15:02 11
  • This will log your internet download speed. You can run gnuplot -persist <(echo "plot 'bps' with lines") to get a graph of it (a time-axis variant is sketched after this entry).


    10
    echo $(date +%s) > start-time; URL=http://www.google.com; while true; do echo $(curl -L -w %{speed_download} -o /dev/null -s $URL) >> bps; sleep 10; done &
    matthewbauer · 2009-09-19 21:26:06 13
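
    A sketch of plotting the samples against elapsed time rather than sample index, assuming gnuplot is installed and the loop above keeps its 10-second interval:

    # $0 is gnuplot's record index; multiply by the 10-second sampling interval
    gnuplot -persist <(echo "set xlabel 'seconds'; set ylabel 'bytes/sec'; plot 'bps' using (\$0*10):1 with lines")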
  • Requires aria2c, but you could just as easily use wget or anything else (a wget variant is sketched after this entry). A great way to build up a nice font collection for Gimp without wasting a lot of time. :-)


    10
    d="www.dafont.com/alpha.php?";for c in {a..z}; do l=`curl -s "${d}lettre=${c}"|sed -n 's/.*ge=\([0-9]\{2\}\).*/\1/p'`;for((p=1;p<=l;p++));do for u in `curl -s "${d}page=${p}&lettre=${c}"|egrep -o "http\S*.com/dl/\?f=\w*"`;do aria2c "${u}";done;done;done
    lrvick · 2010-05-18 07:38:54 4
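
    The wget substitution suggested above (a sketch; dafont's page markup may have changed since 2010, so the sed and grep patterns are kept as-is):

    # Same crawl, downloading with wget; --content-disposition keeps the real font filenames
    d="www.dafont.com/alpha.php?"; for c in {a..z}; do l=$(curl -s "${d}lettre=${c}" | sed -n 's/.*ge=\([0-9]\{2\}\).*/\1/p'); for ((p=1; p<=l; p++)); do for u in $(curl -s "${d}page=${p}&lettre=${c}" | egrep -o "http\S*.com/dl/\?f=\w*"); do wget --content-disposition "$u"; done; done; done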
  • *** CAREFULLY READ THE NOTES *** THIS DOES NOT WORK "OUT OF THE BOX" *** You'll need a few minutes of careful reading before making your own Twitter feed. In 2010, simple command-line Twitter feed requests all stopped working because Twitter moved to SSL. HTTPS requests for a filtered Twitter stream feed now require a special header called "oauth_header". The benefit is that your stream feed and login info are securely encrypted; the bad news is that an "oauth_header" takes some work to build. Fortunately, four functions, imaginatively named step1, step2, step3 and step4, can be used to build a customized oauth_header for you in a few minutes. Now go look at "step1" to start creating your own oauth_header!


    10
    step1 ; step2 ; step3 ; step4 ; curl -o- --get 'https://stream.twitter.com/1/statuses/filter.json' --header "$oauth_header" --data "follow=$id"
    nixnax · 2012-03-18 21:15:04 6

  • 9
    curl -sd q=Network http://www.commandlinefu.com/search/autocomplete |html2text -width 100
    commandlinefu · 2009-07-09 00:57:28 10
  • Assume that you have a form. In the page source, look for something similar to: input name="rid" type="TEXT" and input name="submit" value="SUBMIT" type="SUBMIT" align="center". Then run the command to get the response as HTML (a sketch follows this entry). More info: www.h3manth.com


    9
    curl -sd 'rid=value&submit=SUBMIT' <URL> > out.html
    unixmonkey4697 · 2009-07-17 03:51:00 4
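
    A sketch against a hypothetical form whose inputs are named rid and submit, as in the note above; --data-urlencode handles values containing spaces or special characters, and the URL and rid value are placeholders:

    # POST the two form fields and save the response as HTML
    curl -s --data-urlencode 'rid=12345' --data-urlencode 'submit=SUBMIT' 'http://example.com/form' > out.html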

  • 9
    curl -s 'http://checkip.dyndns.org' | sed 's/.*Current IP Address: \([0-9\.]*\).*/\1/g'
    kulor · 2009-08-06 11:54:31 10
  • Changes the wallpaper to the latest IR picture of the sun taken by the SOHO satellite. For a smaller size, try: curl http://sohowww.nascom.nasa.gov/data/realtime/eit_195/512/latest.jpg | xli -onroot -fullscreen -xzoom 120 -yzoom 120 -border black stdin. I run it inside kalarm (KDE), updating every 15 minutes (a plain shell refresh loop is sketched after this entry). Needs xli and curl.


    9
    curl http://sohowww.nascom.nasa.gov/data/realtime/eit_195/512/latest.jpg | xli -onroot -fill stdin
    m33600 · 2009-11-12 22:00:19 5
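
    A plain shell refresh loop as an alternative to kalarm (a sketch; needs xli and curl, and 900 seconds matches the 15-minute interval mentioned above):

    # Refresh the root-window wallpaper every 15 minutes
    while true; do curl -s http://sohowww.nascom.nasa.gov/data/realtime/eit_195/512/latest.jpg | xli -onroot -fill stdin; sleep 900; done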