Commands tagged curl (212)

  • Should be very consistent, because it's Google :-)

    8
    curl -s ip.appspot.com
    tuxilicious · 2010-04-04 01:22:59 8
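
    If ip.appspot.com is ever unreachable, the same idea extends to a wrapper that falls back across several such services (a sketch; the fallback hosts are assumptions, not part of the original command):

    # Try each public IP echo service in turn, five seconds max per attempt
    myip(){ local svc; for svc in ip.appspot.com icanhazip.com checkip.amazonaws.com; do curl -sf --max-time 5 "$svc" && return; done; echo "no IP service reachable" >&2; return 1; }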

  • 7
    curl http://www.phrack.org/archives/tgz/phrack[1-67].tar.gz -o phrack#1.tar.gz
    amaymon · 2009-08-21 10:14:25 6
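
    curl expands the [1-67] range itself and substitutes each value for #1 in the output name, so no shell loop is needed. The same globbing supports zero-padding, step sizes, and brace lists (a sketch with illustrative URLs):

    # Plain numeric range, saved under the remote names
    curl -O 'http://example.com/page[1-10].html'
    # Zero-padded range, every 10th item, named after the glob value
    curl 'http://example.com/img[001-100:10].png' -o 'img#1.png'
    # Brace list; #1 expands to alpha or beta
    curl 'http://example.com/{alpha,beta}/notes.txt' -o '#1-notes.txt'
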
  • This bash function uses albumart.org to find the cover for an album. It returns an amazon.com URL to the image. Usage: albumart [artist] [album]. The arguments can be reversed, and if the album name is distinct enough, it may be possible to omit the artist. The command can be extended with wget to automatically download the matching image, like this:

    albumart(){ local x y="$@";x=$(awk '/View larger image/{gsub(/^.*largeImagePopup\(.|., .*$/,"");print;exit}' <(curl -s 'http://www.albumart.org/index.php?srchkey='${y// /+}'&itempage=1&newsearch=1&searchindex=Music'));[ -z "$x" ]&&echo "Not found."||wget "$x" -O "${y}.${x##*.}";}

    7
    albumart(){ local y="$@";awk '/View larger image/{gsub(/^.*largeImagePopup\(.|., .*$/,"");print;exit}' <(curl -s 'http://www.albumart.org/index.php?srchkey='${y// /+}'&itempage=1&newsearch=1&searchindex=Music');}
    eightmillion · 2009-11-15 19:54:16 11
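
    Typical usage, with artist and album given as separate words (the names are only illustrative):

    albumart pink floyd animals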

  • 7
    curl -s search.twitter.com | awk -F'</?[^>]+>' '/\/intra\/trend\//{print $2}'
    putnamhill · 2009-12-22 01:01:02 12
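
    The -F'</?[^>]+>' trick treats every HTML tag as a field separator, so $2 is the text between the first pair of tags on each matching line. A minimal illustration with a made-up input line:

    echo '<a href="/intra/trend/topic">#Topic</a>' | awk -F'</?[^>]+>' '/\/intra\/trend\//{print $2}'
    # prints: #Topic
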
  • This version works on Mac (it avoids grep -P, adding a sed step instead, and invokes /usr/bin/perl with the full path in case you have another one installed). It still requires the Perl module HTML::Entities; here's how to install it: http://www.perlmonks.org/?node_id=640489

    7
    define(){ local y="$@";curl -sA"Opera" "http://www.google.com/search?q=define:${y// /+}"|grep -Eo '<li>[^<]+'|sed 's/^<li>//g'|nl|/usr/bin/perl -MHTML::Entities -pe 'decode_entities($_)';}
    gthb · 2010-01-30 13:08:03 14
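
    Typical usage (the word is illustrative, and Google's define: results page may change over time):

    define boilerplate
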
  • This shell function grabs the weather forecast for the next 24 to 48 hours from weatherunderground.com. Replace <YOURZIPORLOCATION> with your zip code or your "city, state" or "city, country"; calling the function without any arguments then returns the weather for that default location. Calling the function with a zip code or place name as an argument returns the weather for that location instead. Requires: perl, curl. To add a bit of color formatting to the output, use the following instead:

    weather(){ curl -s "http://api.wunderground.com/auto/wui/geo/ForecastXML/index.xml?query=${@:-<YOURZIPORLOCATION>}"|perl -ne '/<title>([^<]+)/&&printf "\x1B[0;34m%s\x1B[0m: ",$1;/<fcttext>([^<]+)/&&print $1,"\n"';}

    7
    weather(){ curl -s "http://api.wunderground.com/auto/wui/geo/ForecastXML/index.xml?query=${@:-<YOURZIPORLOCATION>}"|perl -ne '/<title>([^<]+)/&&printf "%s: ",$1;/<fcttext>([^<]+)/&&print $1,"\n"';}
    eightmillion · 2010-02-10 01:23:39 16
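
    Typical usage, overriding the default location (the arguments are illustrative):

    weather 90210
    weather "London, UK"
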
  • Here is the full function (the posted version got truncated), which is much better and works for multiple queries. It searches commandlinefu for single or multiple queries and displays syntax-highlighted, folded, and numbered results in vim:

    function cmdfu () { local t=~/cmdfu; until [[ -z $1 ]]; do echo -e "\n# $1 {{{1" >> $t; curl -s "commandlinefu.com/commands/matching/$1/`echo -n $1|base64`/plaintext" | sed '1,2d;s/^#.*/& {{{2/g' | tee -a $t > $t.c; sed -i "s/^# $1 {/# $1 - `grep -c '^#' $t.c` {/" $t; shift; done; vim -u /dev/null -c "set ft=sh fdm=marker fdl=1 noswf" -M $t; rm $t $t.c }

    7
    cmdfu(){ local t=~/cmdfu;echo -e "\n# $1 {{{1">>$t;curl -s "commandlinefu.com/commands/matching/$1/`echo -n $1|base64`/plaintext"|sed '1,2d;s/^#.*/& {{{2/g'>$t;vim -u /dev/null -c "set ft=sh fdm=marker fdl=1 noswf" -M $t;rm $t; }
    AskApache · 2012-02-21 05:43:16 11
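
    Typical usage (hedged on the site's plaintext endpoint staying available); the short version above takes one query, while the full function accepts several:

    cmdfu tar
    cmdfu tar rsync ssh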

  • 7
    curl -C - -o partially_downloaded_file 'www.example.com/path/to/the/file'
    weldabar · 2012-11-05 17:14:16 18
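
    -C - tells curl to inspect the partially downloaded file and continue from where it left off. Combined with --retry, one invocation can ride out transient network failures (a sketch reusing the placeholder URL above):

    # Resume, retrying up to 10 times with 5 seconds between attempts
    curl -C - --retry 10 --retry-delay 5 -o partially_downloaded_file 'www.example.com/path/to/the/file'
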
  • Share your "now playing" Amarok song on Twitter!

    6
    curl -u <user>:<password> -d status="Amarok, now playing: $(dcop amarok default nowPlaying)" http://twitter.com/statuses/update.json
    caiosba · 2009-06-14 02:42:34 6

  • I took matthewbauer's cool one-liner and rewrote it as a shell function that returns all the suggestions, or outputs "OK" if it doesn't find anything wrong. It should work on ksh, zsh, and bash. Users who don't have tee can leave that part off, like this:

    spellcheck(){ typeset y=$@;curl -sd "<spellrequest><text>$y</text></spellrequest>" https://google.com/tbproxy/spell|sed -n '/s="[1-9]"/{s/<[^>]*>/ /g;s/\t/ /g;s/ *\(.*\)/Suggestions: \1\n/g;p}';}

    6
    spellcheck(){ typeset y=$@;curl -sd "<spellrequest><text>$y</text></spellrequest>" https://www.google.com/tbproxy/spell|sed -n '/s="[0-9]"/{s/<[^>]*>/ /g;s/\t/ /g;s/ *\(.*\)/Suggestions: \1\n/g;p}'|tee >(grep -Eq '.*'||echo -e "OK");}
    eightmillion · 2010-02-17 08:20:48 17
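
    Typical usage (the misspellings are illustrative, and the function depends on Google's toolbar spell proxy):

    spellcheck recieve teh package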

  • 6
    eog `curl -s http://xkcd.com/ | sed -n 's/<h3>Image URL.*: \(.*\)<\/h3>/\1/p'`
    bluesman · 2010-08-31 13:23:21 5
  • Use curl and sed to shorten a URL using goo.gl, without any other API.

    6
    curl -s -d'&url=URL' http://goo.gl/api/url | sed -e 's/{"short_url":"//' -e 's/","added_to_history":false}/\n/'
    Soubsoub · 2010-10-01 23:20:08 10

  • 6
    curl -I -H "Accept-Encoding: gzip,deflate" http://example.org
    totti · 2011-08-16 10:32:01 4
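
    -I only shows whether the server advertises gzip in its response headers; to measure the actual saving, compare downloaded byte counts with and without the header (a sketch using curl's %{size_download} write-out variable):

    curl -so /dev/null -w '%{size_download} bytes plain\n' http://example.org
    curl -so /dev/null -H 'Accept-Encoding: gzip,deflate' -w '%{size_download} bytes compressed\n' http://example.org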

  • 6
    curl -u $USERNAME:$PASSWORD "http://dynupdate.no-ip.com/nic/update?hostname=$HOSTNAME"
    drerik · 2012-08-16 05:45:03 6
  • Traces the redirects for a given URL shortener.

    6
    curl --silent -I -L shorturl.at/dfIJQ | grep -i location
    aysadk · 2022-09-04 19:31:46 735

  • 5
    curl -s http://whatthecommit.com/index.txt | cowsay
    adaven · 2011-03-20 17:30:25 4
  • curl(1) is more portable than wget(1) across Unices, so here is an alternative doing the same thing with greater portability. This shell function uses curl(1) to show what site a shortened URL is pointing to, even if there are many nested shortened URLs. This is a great way to test whether the shortened URL is sending you to a malicious site, or somewhere nasty that you don't want to visit. The sample output is from: expandurl http://t.co/LDWqmtDM

    5
    expandurl() { curl -sIL $1 | grep ^Location; }
    atoponce · 2011-10-19 00:56:53 13
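
    A variant that prints only the final destination rather than every Location header, letting curl's %{url_effective} write-out variable do the bookkeeping (and quoting $1 for safety):

    expandurl(){ curl -sIL -o /dev/null -w '%{url_effective}\n' "$1"; }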

  • 5
    curl -s http://www.census.gov/popclock/data/population/world | awk -F'[:,]' '{print $7}'
    zlemini · 2013-07-28 00:31:30 9
  • Bash process substitution which curls the website 'hashbang.sh' and executes the shell script embedded in the page. This is obviously not the most secure way to run something like this, and we will scold you if you try. The smarter way would be:

    Download locally over SSL:
      curl https://hashbang.sh >> hashbang.sh
    Verify integrity with GPG (if available):
      gpg --recv-keys 0xD2C4C74D8FAA96F5
      gpg --verify hashbang.sh
    Inspect the source code:
      less hashbang.sh
    Run it:
      chmod +x hashbang.sh
      ./hashbang.sh

    5
    sh <(curl hashbang.sh)
    lrvick · 2015-03-15 21:02:01 16

  • 4
    curl http://domain.com/file.tar.gz | tar zx
    mkoga · 2009-03-24 04:41:09 6
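
    Piping straight into tar avoids a temporary file but also skips any integrity check. When the site publishes a checksum, downloading first is safer (a sketch; the .sha256 file is a hypothetical companion to the archive):

    curl -fsSLO http://domain.com/file.tar.gz
    curl -fsSLO http://domain.com/file.tar.gz.sha256
    sha256sum -c file.tar.gz.sha256 && tar zxf file.tar.gz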

  • 4
    curl --form username=from_twitter --form password=from_twitter --form media=@/path/to/image --form-string "message=tweet" http://twitpic.com/api/uploadAndPost
    baergaj · 2009-04-27 15:57:04 11
  • Identi.ca is an open-source social networking and micro-blogging service, based on Laconica, a micro-blogging software package built on the OpenMicroBlogging specification. http://identi.ca/

    4
    curl -u USER:PASS -d status="NEW STATUS" http://identi.ca/api/statuses/update.xml
    unixmonkey3754 · 2009-05-15 19:57:00 82

  • The curl command retrieves the HTML text containing the IP address. The grep command picks the IP address out of that HTML text.

    4
    curl -s checkip.dyndns.org | grep -Eo '[0-9\.]+'
    haivu · 2009-05-21 16:12:21 8

  • (Apparently it is too long, so I put it in the sample output; I hope that is OK.) Run the long command (or put it in your .bashrc) from the sample output, then run: fbemailscraper YourFBEmail Password. Voila! Your contacts' emails will appear. Facebook seems to have gotten rid of the picture encoding of emails and replaced it with a text-based version, making it easy to scrape! Needs curl to run, and it was made pretty quickly, so there might be bugs.

    4
    fbemailscraper YourFBEmail Password
    dabom · 2010-01-31 00:44:35 45

  • Get your external IP address, thanks to http://www.icanhazip.com

    4
    curl -s icanhazip.com
    thelan · 2010-04-03 11:11:34 4