Commands tagged curl (212)

  • Should be very consistent because it's Google. :-)


    8
    curl -s ip.appspot.com
    tuxilicious · 2010-04-04 01:22:59 8

  • 7
    curl http://www.phrack.org/archives/tgz/phrack[1-67].tar.gz -o phrack#1.tar.gz
    amaymon · 2009-08-21 10:14:25 6
  • This bash function uses albumart.org to find the cover for an album. It returns an amazon.com URL to the image. Usage: albumart [artist] [album]. The arguments can be reversed, and if the album name is distinctive enough, it may be possible to omit the artist. The function can be extended with wget to automatically download the matching image, like this: albumart(){ local x y="$@";x=$(awk '/View larger image/{gsub(/^.*largeImagePopup\(.|., .*$/,"");print;exit}' <(curl -s 'http://www.albumart.org/index.php?srchkey='${y// /+}'&itempage=1&newsearch=1&searchindex=Music'));[ -z "$x" ]&&echo "Not found."||wget "$x" -O "${y}.${x##*.}";}


    7
    albumart(){ local y="$@";awk '/View larger image/{gsub(/^.*largeImagePopup\(.|., .*$/,"");print;exit}' <(curl -s 'http://www.albumart.org/index.php?srchkey='${y// /+}'&itempage=1&newsearch=1&searchindex=Music');}
    eightmillion · 2009-11-15 19:54:16 9
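    For example, once the function is defined in your shell (the artist and album below are illustrative placeholders):
    albumart Pink Floyd Animals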

  • 7
    curl -s search.twitter.com | awk -F'</?[^>]+>' '/\/intra\/trend\//{print $2}'
    putnamhill · 2009-12-22 01:01:02 12
  • This version works on a Mac (it avoids grep -P, adding a sed step instead, and invokes /usr/bin/perl by full path in case you have another Perl installed). It still requires the Perl module HTML::Entities; here's how to install it: http://www.perlmonks.org/?node_id=640489


    7
    define(){ local y="$@";curl -sA"Opera" "http://www.google.com/search?q=define:${y// /+}"|grep -Eo '<li>[^<]+'|sed 's/^<li>//g'|nl|/usr/bin/perl -MHTML::Entities -pe 'decode_entities($_)';}
    gthb · 2010-01-30 13:08:03 14
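    If HTML::Entities is missing, a common way to install it is with the cpan client that ships with Perl (a generic invocation, not taken from the linked page):
    cpan HTML::Entities
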
  • This shell function grabs the weather forecast for the next 24 to 48 hours from weatherunderground.com. Replace <YOURZIPORLOCATION> with your zip code or your "city, state" or "city, country"; calling the function without arguments then returns the weather for that location. Calling it with a zip code or place name as an argument returns the weather for that location instead of your default. To add a bit of color formatting to the output, use the following instead: weather(){ curl -s "http://api.wunderground.com/auto/wui/geo/ForecastXML/index.xml?query=${@:-<YOURZIPORLOCATION>}"|perl -ne '/<title>([^<]+)/&&printf "\x1B[0;34m%s\x1B[0m: ",$1;/<fcttext>([^<]+)/&&print $1,"\n"';} Requires: perl, curl


    7
    weather(){ curl -s "http://api.wunderground.com/auto/wui/geo/ForecastXML/index.xml?query=${@:-<YOURZIPORLOCATION>}"|perl -ne '/<title>([^<]+)/&&printf "%s: ",$1;/<fcttext>([^<]+)/&&print $1,"\n"';}
    eightmillion · 2010-02-10 01:23:39 15
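    A sketch of a call once the placeholder is filled in (the zip code below is illustrative):
    weather 90210
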
  • Here is the full function (the one below got truncated), which is much better and works for multiple queries: function cmdfu () { local t=~/cmdfu; until [[ -z $1 ]]; do echo -e "\n# $1 {{{1" >> $t; curl -s "commandlinefu.com/commands/matching/$1/`echo -n $1|base64`/plaintext" | sed '1,2d;s/^#.*/& {{{2/g' | tee -a $t > $t.c; sed -i "s/^# $1 {/# $1 - `grep -c '^#' $t.c` {/" $t; shift; done; vim -u /dev/null -c "set ft=sh fdm=marker fdl=1 noswf" -M $t; rm $t $t.c } It searches commandlinefu for single or multiple queries and displays syntax-highlighted, folded, and numbered results in vim.


    7
    cmdfu(){ local t=~/cmdfu;echo -e "\n# $1 {{{1">>$t;curl -s "commandlinefu.com/commands/matching/$1/`echo -n $1|base64`/plaintext"|sed '1,2d;s/^#.*/& {{{2/g'>$t;vim -u /dev/null -c "set ft=sh fdm=marker fdl=1 noswf" -M $t;rm $t; }
    AskApache · 2012-02-21 05:43:16 11
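    Example invocation (the query term is illustrative):
    cmdfu tar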

  • 7
    curl -C - -o partially_downloaded_file 'www.example.com/path/to/the/file'
    weldabar · 2012-11-05 17:14:16 18
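    With -C -, curl checks how much of the output file already exists and resumes the transfer from that offset, so after an interruption you simply rerun the identical command (URL and file name here are placeholders):
    curl -C - -o big.iso 'http://www.example.com/big.iso'   # interrupted mid-transfer
    curl -C - -o big.iso 'http://www.example.com/big.iso'   # rerun: picks up where it left off
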
  • Share your "now playing" Amarok song on Twitter!


    6
    curl -u <user>:<password> -d status="Amarok, now playing: $(dcop amarok default nowPlaying)" http://twitter.com/statuses/update.json
    caiosba · 2009-06-14 02:42:34 6
  • I took matthewbauer's cool one-liner and rewrote it as a shell function that returns all the suggestions, or outputs "OK" if it doesn't find anything wrong. It should work on ksh, zsh, and bash. Users who don't have tee can leave that part off, like this: spellcheck(){ typeset y=$@;curl -sd "<spellrequest><text>$y</text></spellrequest>" https://google.com/tbproxy/spell|sed -n '/s="[1-9]"/{s/<[^>]*>/ /g;s/\t/ /g;s/ *\(.*\)/Suggestions: \1\n/g;p}';}


    6
    spellcheck(){ typeset y=$@;curl -sd "<spellrequest><text>$y</text></spellrequest>" https://www.google.com/tbproxy/spell|sed -n '/s="[0-9]"/{s/<[^>]*>/ /g;s/\t/ /g;s/ *\(.*\)/Suggestions: \1\n/g;p}'|tee >(grep -Eq '.*'||echo -e "OK");}
    eightmillion · 2010-02-17 08:20:48 17
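    Example invocation (the misspelled phrase is illustrative):
    spellcheck 'teh qick brown fox'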

  • 6
    eog `curl -s http://xkcd.com/ | sed -n 's/<h3>Image URL.*: \(.*\)<\/h3>/\1/p'`
    bluesman · 2010-08-31 13:23:21 5
  • Use curl and sed to shorten a URL via goo.gl, without any other API.


    6
    curl -s -d'&url=URL' http://goo.gl/api/url | sed -e 's/{"short_url":"//' -e 's/","added_to_history":false}/\n/'
    Soubsoub · 2010-10-01 23:20:08 10
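    With a concrete address substituted for URL (the address below is a placeholder):
    curl -s -d'&url=http://www.example.com/' http://goo.gl/api/url | sed -e 's/{"short_url":"//' -e 's/","added_to_history":false}/\n/'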

  • 6
    curl -I -H "Accept-Encoding: gzip,deflate" http://example.org
    totti · 2011-08-16 10:32:01 4

  • 6
    curl -u $USERNAME:$PASSWORD "http://dynupdate.no-ip.com/nic/update?hostname=$HOSTNAME"
    drerik · 2012-08-16 05:45:03 6
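    To keep a dynamic DNS record current, the update can be run periodically from cron; a sketch of a crontab entry (schedule, credentials, and hostname are placeholders):
    */10 * * * * curl -u user:pass "http://dynupdate.no-ip.com/nic/update?hostname=myhost.example.org" >/dev/null 2>&1
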
  • Trace the redirects for a given URL-shortener link.


    6
    curl --silent -I -L shorturl.at/dfIJQ | grep -i location
    aysadk · 2022-09-04 19:31:46 717
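    An alternative that prints only the final destination, using curl's write-out variable (same URL as above):
    curl -sIL -o /dev/null -w '%{url_effective}\n' shorturl.at/dfIJQ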

  • 5
    curl -s http://whatthecommit.com/index.txt | cowsay
    adaven · 2011-03-20 17:30:25 4
  • curl(1) is more portable than wget(1) across Unices, so here is an alternative that does the same thing with greater portability. This shell function uses curl(1) to show what site a shortened URL is pointing to, even if there are many nested shortened URLs. It is a great way to test whether the shortened URL is sending you to a malicious site, or somewhere nasty that you don't want to visit.


    5
    expandurl() { curl -sIL $1 | grep ^Location; }
    atoponce · 2011-10-19 00:56:53 13
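    Usage, taken from the author's sample:
    expandurl http://t.co/LDWqmtDM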

  • 5
    curl -s http://www.census.gov/popclock/data/population/world | awk -F'[:,]' '{print $7}'
    zlemini · 2013-07-28 00:31:30 9
  • Bash process substitution which curls the website 'hashbang.sh' and executes the shell script embedded in the page. This is obviously not the most secure way to run something like this, and we will scold you if you try. The smarter way would be:
    Download locally over SSL:
      curl https://hashbang.sh >> hashbang.sh
    Verify integrity with GPG (if available):
      gpg --recv-keys 0xD2C4C74D8FAA96F5
      gpg --verify hashbang.sh
    Inspect the source code:
      less hashbang.sh
    Run:
      chmod +x hashbang.sh
      ./hashbang.sh


    5
    sh <(curl hashbang.sh)
    lrvick · 2015-03-15 21:02:01 16

  • 4
    curl http://domain.com/file.tar.gz | tar zx
    mkoga · 2009-03-24 04:41:09 6
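    If the URL redirects (as many download links do), -L follows the redirect and -s keeps the progress meter out of the pipe; a variant of the above:
    curl -sL http://domain.com/file.tar.gz | tar zx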

  • 4
    curl --form username=from_twitter --form password=from_twitter --form media=@/path/to/image --form-string "message=tweet" http://twitpic.com/api/uploadAndPost
    baergaj · 2009-04-27 15:57:04 11
  • Identi.ca is an open-source social networking and micro-blogging service. It is based on Laconica, a micro-blogging software package built on the OpenMicroBlogging specification. http://identi.ca/


    4
    curl -u USER:PASS -d status="NEW STATUS" http://identi.ca/api/statuses/update.xml
    unixmonkey3754 · 2009-05-15 19:57:00 82
  • The curl command retrieves the HTML text containing the IP address; the grep command picks the IP address out of that HTML.


    4
    curl -s checkip.dyndns.org | grep -Eo '[0-9\.]+'
    haivu · 2009-05-21 16:12:21 8
  • (Apparently it is too long, so I put it in the sample output; I hope that is OK.) Run the long command (or put it in your .bashrc) shown in the sample output, then run: fbemailscraper YourFBEmail Password Voila! Your contacts' emails will appear. Facebook seems to have gotten rid of the picture encoding of emails and replaced it with a text-based version, making it easy to scrape! Needs curl to run, and it was made pretty quickly, so there might be bugs.


    4
    fbemailscraper YourFBEmail Password
    dabom · 2010-01-31 00:44:35 45
  • Get your external IP address, thanks to http://www.icanhazip.com


    4
    curl -s icanhazip.com
    thelan · 2010-04-03 11:11:34 4




