Commands tagged curl (212)

  • Searches for web radio stations matching a submitted keyword and returns each station's name and listening link. It could be enhanced to read the user's selection and hand it to mplayer; a sketch of that enhancement follows the entry.


    4
    echo "Keyword?";read keyword;query="http://www.shoutcast.com/sbin/newxml.phtml?search="$keyword"";curl -s $query |awk -F '"' 'NR <= 4 {next}NR>15{exit}{sub(/SHOUTcast.com/,"http://yp.shoutcast.com/sbin/tunein-station.pls?id="$6)}{print i++" )"$2}'
    benyounes · 2010-05-03 00:44:10 8
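
    A sketch of that enhancement (assumes bash 4's mapfile and that mplayer accepts a .pls playlist URL; the function name is made up):

    radio() {
      read -p "Keyword? " keyword
      local xml i n
      local -a names ids
      xml=$(curl -s "http://www.shoutcast.com/sbin/newxml.phtml?search=$keyword")
      # As in the one-liner above: awk field 2 is the station name, field 6 its id
      mapfile -t names < <(printf '%s\n' "$xml" | awk -F '"' 'NR<=4{next} NR>15{exit} {print $2}')
      mapfile -t ids   < <(printf '%s\n' "$xml" | awk -F '"' 'NR<=4{next} NR>15{exit} {print $6}')
      for i in "${!names[@]}"; do printf '%2d) %s\n' "$i" "${names[$i]}"; done
      read -p "Station number? " n
      mplayer -playlist "http://yp.shoutcast.com/sbin/tunein-station.pls?id=${ids[$n]}"
    }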
  • Each file in the current folder is uploaded to imageshack.us. If the folder contains other filetypes, change "for files in *" to "for files in *.jpg" (to upload ONLY .jpg files); you can also try "for files in *.jpg *.png" (results may vary). The output URL is encased in BB image tags for use in a forum. A variant that takes the globs as arguments follows the entry.


    4
    imageshack() { for files in *; do curl -H Expect: -F fileupload="@$files" -F xml=yes -# "http://www.imageshack.us/index.php" | grep image_link | sed -e 's/<image_link>/[IMG]/g' -e 's/<\/image_link>/[\/IMG]/g'; done; }
    operatinghazard · 2010-10-01 06:50:04 6
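
    A variant of the same function (same endpoint and BB-tag rewriting) that takes the globs as arguments instead of requiring an edit:

    imageshack() {
      local f
      [ $# -eq 0 ] && set -- *   # default: every file in the current directory
      for f in "$@"; do
        curl -H Expect: -F fileupload="@$f" -F xml=yes -# "http://www.imageshack.us/index.php" |
          grep image_link |
          sed -e 's/<image_link>/[IMG]/g' -e 's/<\/image_link>/[\/IMG]/g'
      done
    }
    # Usage: imageshack             (everything)
    #        imageshack *.jpg *.png (only those types)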
  • Shorter, and made into a function. A usage sketch follows the entry.


    4
    googl () { curl -s -d "url=${1}" http://goo.gl/api/url | sed -n "s/.*:\"\([^\"]*\).*/\1\n/p" ;}
    dabom · 2010-10-03 02:52:44 3
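
    Usage sketch (the long URL is just an example; the shortened link is printed on stdout):

    googl "http://www.commandlinefu.com/commands/tagged/curl"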
  • Just add this function to your .zshrc / .bashrc, and by typing "shout *URL*" you get a randomly chosen English word that ShoutKey.com uses to shorten your URL. You may then go to shoutkey.com/*output_word* and get redirected; the URL stays valid for 5 minutes. (I've never used sed before, so I'll be quite glad if someone could straighten up the sed commands and combine them. If so, I'll update it right away ;) — one possible consolidation follows the entry.)


    4
    shout () { curl -s "http://shoutkey.com/new?url=$1" | sed -n 's/\<h1\>/\&/p' | sed 's/<[^>]*>//g;/</N;//b' ;}
    elfreak · 2010-10-04 23:50:54 3
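
    One possible consolidation of the two sed calls (assumes the keyword is the sole content of the page's <h1> element):

    shout () { curl -s "http://shoutkey.com/new?url=$1" | sed -n 's/.*<h1>\(.*\)<\/h1>.*/\1/p' ;}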
  • Like command 7300, but doesn't clutter your working directory with old qr.*.png files: it fetches the QR barcode and sends it straight into ImageMagick's 'display' tool. Usage is the same as 7300; just call this function followed by the URL: qrurl http://xkcd.com. A file-saving variant follows the entry.


    4
    qrurl() { curl -sS "http://chart.apis.google.com/chart?chs=200x200&cht=qr&chld=H|0&chl=$1" -o - | display -filter point -resize 600x600 png:-; }
    __ · 2010-12-16 04:42:05 3
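
    A variant that saves the barcode instead of displaying it (same Google Chart API call; the function name and default output filename are made up):

    qrpng() { curl -sS "http://chart.apis.google.com/chart?chs=200x200&cht=qr&chld=H|0&chl=$1" -o "${2:-qr.png}"; }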

  • 4
    curl -I g.cn
    cfy · 2011-03-27 14:27:23 6

  • 4
    date -s "$(curl -sD - www.example.com | grep '^Date:' | cut -d' ' -f3-6)Z"
    casueps · 2019-12-20 10:10:14 134
  • Defines a function you can put in your ~/.bashrc and then run from any terminal, with an IP address as the argument. A usage example follows the entry.


    3
    GeoipLookUp(){ curl -A "Mozilla/5.0" -s "http://www.geody.com/geoip.php?ip=$1" | grep "^IP.*$1" | html2text; }
    sputnick · 2009-11-06 00:32:27 4
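
    Usage (8.8.8.8 is just an example address):

    GeoipLookUp 8.8.8.8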
  • Use curl and sed to shorten a URL via goo.gl. A function-wrapped variant follows the entry.


    3
    curl -s 'http://ggl-shortener.appspot.com/?url='"$1" | sed -e 's/{"short_url":"//' -e 's/"}/\n/g'
    mvrilo · 2010-03-26 22:31:06 22
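
    The same call wrapped as a function, with the URL percent-encoded by curl's --get/--data-urlencode instead of pasted in raw (the function name is made up):

    short() { curl -s --get --data-urlencode "url=$1" 'http://ggl-shortener.appspot.com/' | sed -e 's/{"short_url":"//' -e 's/"}/\n/g'; }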
  • Runs an RSS feed through sed, replacing the closing tags with newlines and deleting the opening tags, which makes it readable. A concrete example follows the entry.


    3
    curl --silent "FEED ADDRESS" |sed -e 's/<\/[^>]*>/\n/g' -e 's/<[^>]*>//g
    ljmhk · 2011-04-11 14:08:50 5
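
    A concrete example, using the commandlinefu feed mentioned elsewhere on this page (any RSS URL works):

    curl --silent "http://feeds2.feedburner.com/Command-line-fu" | sed -e 's/<\/[^>]*>/\n/g' -e 's/<[^>]*>//g'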
  • Just an alternative with more advanced formatting for readability. It now uses colors (too much for me, but it's a kind of proof of concept) and adjusts columns.


    3
    curl -u username --silent "https://mail.google.com/mail/feed/atom" | awk 'BEGIN{FS="\n";RS="(</entry>\n)?<entry>"}NR!=1{print "\033[1;31m"$9"\033[0;32m ("$10")\033[0m:\t\033[1;33m"$2"\033[0m"}' | sed -e 's,<[^>]*>,,g' | column -t -s $'\t'
    frntn · 2011-10-15 23:15:52 3
  • Required packages: curl, xml2, html2text. The command shown is truncated; the full version was posted in the command's sample output.


    3
    open R,"curl -s http://feeds2.feedburner.com/Command-line-fu|xml2|"; while(<R>){ chomp; m(^/rss/channel/item/title=) and do{ s/^.*?=//; ($t,$d,$l)=($_,undef,undef) }; m(^/rss/channel/item/description=) and do{ s/^.*?=//; push @d,$_ }; m(^/rss/channel/item
    bandie91 · 2012-02-24 23:40:02 5
  • Watches the headers of a curl request, following any redirects and printing only the HTTP status and the Location of each redirect. A one-shot (non-watch) version follows the entry.


    3
    watch 'curl -s --location -I http://any.site.or.url | grep -e "\(HTTP\|Location\)"'
    theist · 2012-04-23 17:05:29 4
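
    The same check as a one-shot, without watch (grep -E anchored to the line start, otherwise equivalent to the escaped pattern above):

    curl -s --location -I http://any.site.or.url | grep -E '^(HTTP|Location)'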
  • With the "--resolve" switch, you can avoid doing DNS lookups or editing the /etc/hosts file by providing the IP address for a domain directly. Useful if you have many servers with different IP addresses behind a load balancer. Of course, you would loop it: for IP in 10.11.0.{1..10}; do curl --resolve subdomain.example.com:80:$IP subdomain.example.com -I -s; done


    3
    curl --resolve subdomain.example.com:80:10.100.0.1 subdomain.example.com -I -s
    atoponce · 2013-01-24 19:50:26 37

  • 3
    curl -s http://whatismyip.org/ | grep -oP '(\d{1,3}\.){3}\d+'
    ciekawy · 2016-07-11 18:07:37 15

  • 3
    GITUSER=$(whoami); curl "https://api.github.com/users/${GITUSER}/starred?per_page=1000" | grep -o 'git@[^"]*' | xargs -L1 git clone
    wuseman1 · 2022-06-25 20:39:12 413
  • Miss a class at UTOSC2010? Need a refresher? Use this to curl down all the presentations from the UTOSC website (http://2010.utosc.com). NOTE/WARNING: this dumps them in the current directory, there are around 37, and some are big. Tested on OS X 10.6.1. A variant that keeps them out of your working directory follows the entry.


    2
    b="http://2010.utosc.com"; for p in $( curl -s $b/presentation/schedule/ | grep /presentation/[0-9]*/ | cut -d"\"" -f2 ); do f=$(curl -s $b$p | grep "/static/slides/" | cut -d"\"" -f4); if [ -n "$f" ]; then echo $b$f; curl -O $b$f; fi done
    danlangford · 2009-10-11 17:28:46 3
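
    To keep the ~37 files out of your working directory, run the same loop in a subshell inside a scratch directory (the directory name is arbitrary):

    ( mkdir -p utosc-slides && cd utosc-slides && b="http://2010.utosc.com"; for p in $( curl -s $b/presentation/schedule/ | grep /presentation/[0-9]*/ | cut -d"\"" -f2 ); do f=$(curl -s $b$p | grep "/static/slides/" | cut -d"\"" -f4); if [ -n "$f" ]; then echo $b$f; curl -O $b$f; fi; done )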
  • Alternative to the ping check if your firewall blocks ping: uses curl to get the landing page silently, or fail with an error code. You can probably do this with wget as well; a wget sketch follows the entry.


    2
    curl -fs brandx.jp.sme > /dev/null 2>&1 || echo brandx.jp.sme ping failed | mail -ne -s 'Server unavailable' joker@jp.co.uk
    mccalni · 2009-10-23 14:29:06 4
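
    The wget version the author alludes to (a sketch; --spider checks availability without saving the page):

    wget -q --spider http://brandx.jp.sme || echo brandx.jp.sme ping failed | mail -ne -s 'Server unavailable' joker@jp.co.uk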
  • A function that takes a domain name as an argument


    2
    geo(){ curl -s "http://www.geody.com/geoip.php?ip=$(dig +short $1)"| sed '/^IP:/!d;s/<[^>][^>]*>//g'; }
    dennisw · 2009-11-12 17:14:09 25
  • Requires display (ImageMagick). Corrected version, thanks to users sputnick and eightmillion.


    2
    display http://dilbert.com$(curl -s dilbert.com|grep -Po '"\K/dyn/str_strip(/0+){4}/.*strip.[^\.]*\.gif')
    wizel · 2009-12-05 19:35:27 13
  • This version prints current votes and commands for a user; pass the user as an argument. While this technically "fits" as a one-liner, it really is easier to look at as a shell script with extra whitespace. :) That expanded layout follows the entry.


    2
    curl -s http://www.commandlinefu.com/commands/by/$1/xml | awk -F'</?div[^>]*>' '/class=\"command\"/{gsub(/&quot;/,"\"",$2); gsub(/&lt;/,"<",$2); gsub(/&gt;/,">",$2); gsub(/&amp;/,"\\&",$2); cmd=$2} /class=\"num-votes\"/{printf("%3i %s\n", $2, cmd)}'
    putnamhill · 2010-02-16 17:24:45 5
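
    The same command laid out as the author suggests:

    #!/bin/sh
    curl -s "http://www.commandlinefu.com/commands/by/$1/xml" |
    awk -F'</?div[^>]*>' '
      /class="command"/ {
        gsub(/&quot;/, "\"", $2)
        gsub(/&lt;/,   "<",  $2)
        gsub(/&gt;/,   ">",  $2)
        gsub(/&amp;/,  "\\&", $2)
        cmd = $2
      }
      /class="num-votes"/ { printf("%3i %s\n", $2, cmd) }
    '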
  • This will tell you which Twitter user you are chronologically. For example, a number of 500 means you were the 500th user to create a Twitter account.


    2
    curl -s http://twitter.com/username | grep 'id="user_' | grep -o '[0-9]*'
    spiffwalker · 2010-04-04 18:43:14 10
  • There's another version on here that uses GET, but some people don't have lwp-request, so here's an alternative. It's also a little shorter and should work with most YouTube URLs, since it truncates at the first &.


    2
    url="[Youtube URL]"; echo $(curl ${url%&*} 2>&1 | grep -iA2 '<title>' | grep '-') | sed 's/^- //'
    rkulla · 2010-04-29 02:03:36 4
  • In this example 'git' is the user name and the output format is YAML, but you can change this to XML or JSON, e.g.: curl http://github.com/api/v1/json/usernamehere


    2
    curl http://github.com/api/v1/yaml/git
    rkulla · 2010-05-30 00:18:00 6
  • In this example we search for 'vim'. Vim doesn't have a project on github right now, but that's OK: this command still searches every project that has 'vim' in its description (forks, plugins, etc). To get XML or JSON output, just replace 'yaml' in the URL with 'xml' or 'json'.


    2
    curl http://github.com/api/v1/yaml/search/vim
    rkulla · 2010-05-30 00:29:03 3
  •  < 1 2 3 4 5 >  Last ›
