Terminal - Commands tagged curl - 180 results
spellcheck(){ typeset y=$@;curl -sd "<spellrequest><text>$y</text></spellrequest>" https://www.google.com/tbproxy/spell|sed -n '/s="[0-9]"/{s/<[^>]*>/ /g;s/\t/ /g;s/ *\(.*\)/Suggestions: \1\n/g;p}'|tee >(grep -Eq '.*'||echo -e "OK");}
2010-02-17 08:20:48
User: eightmillion
Functions: echo grep sed tee
Votes: 5

I took matthewbauer's cool one-liner and rewrote it as a shell function that returns all the suggestions or outputs "OK" if it doesn't find anything wrong. It should work on ksh, zsh, and bash. Users that don't have tee can leave that part off like this:

spellcheck(){ typeset y=$@;curl -sd "<spellrequest><text>$y</text></spellrequest>" https://google.com/tbproxy/spell|sed -n '/s="[1-9]"/{s/<[^>]*>/ /g;s/\t/ /g;s/ *\(.*\)/Suggestions: \1\n/g;p}';}
curl -s http://www.commandlinefu.com/commands/by/$1/xml | awk -F'</?div[^>]*>' '/class=\"command\"/{gsub(/&quot;/,"\"",$2); gsub(/&lt;/,"<",$2); gsub(/&gt;/,">",$2); gsub(/&amp;/,"\\&",$2); cmd=$2} /class=\"num-votes\"/{printf("%3i %s\n", $2, cmd)}'
2010-02-16 17:24:45
User: putnamhill
Functions: awk
Votes: 2

This version prints the current votes and commands for a given user; pass the username as an argument. While this technically "fits" as a one-liner, it really is easier to read as a shell script with extra whitespace, as sketched below. :)
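
For reference, the same logic laid out as a script might look like this (an untested sketch of the one-liner above; the cfu-votes.sh name is just illustrative):

#!/bin/sh
# Print vote counts and commands for a commandlinefu user.
# Usage: sh cfu-votes.sh <username>
curl -s "http://www.commandlinefu.com/commands/by/$1/xml" |
    awk -F'</?div[^>]*>' '
        /class=\"command\"/ {
            gsub(/&quot;/, "\"", $2)   # decode HTML entities
            gsub(/&lt;/, "<", $2)
            gsub(/&gt;/, ">", $2)
            gsub(/&amp;/, "\\&", $2)
            cmd = $2
        }
        /class=\"num-votes\"/ { printf("%3i %s\n", $2, cmd) }
    '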

weather(){ curl -s "http://api.wunderground.com/auto/wui/geo/ForecastXML/index.xml?query=${@:-<YOURZIPORLOCATION>}"|perl -ne '/<title>([^<]+)/&&printf "%s: ",$1;/<fcttext>([^<]+)/&&print $1,"\n"';}
2010-02-10 01:23:39
User: eightmillion
Functions: perl
Votes: 7

This shell function grabs the weather forecast for the next 24 to 48 hours from Weather Underground (wunderground.com). Replace <YOURZIPORLOCATION> with your ZIP code, your "city, state", or your "city, country"; calling the function without any arguments then returns the weather for that default location. Calling the function with a ZIP code or place name as an argument returns the weather for that location instead. Example calls are sketched below.

To add a bit of color formatting to the output, use the following instead:

weather(){ curl -s "http://api.wunderground.com/auto/wui/geo/ForecastXML/index.xml?query=${@:-<YOURZIPORLOCATION>}"|perl -ne '/<title>([^<]+)/&&printf "\x1B[0;34m%s\x1B[0m: ",$1;/<fcttext>([^<]+)/&&print $1,"\n"';}

Requires: perl, curl
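
Example calls (the ZIP code and city name are only illustrative):

weather                 # the default location baked into the function
weather 90210           # a ZIP code
weather "London, UK"    # a "city, country" name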

fbemailscraper YourFBEmail Password
2010-01-31 00:44:35
User: dabom
Votes: 4

(Apparently it is too long, so I put it in the sample output; I hope that's OK.)

Run the long command from the sample output (or put it in your .bashrc), then run:

fbemailscraper YourFBEmail Password

Voila! Your contacts' emails will appear.

Facebook seems to have gotten rid of the picture encoding of emails and replaced it with a text-based version, making it easy to scrape!

This needs curl to run, and it was made pretty quickly, so there might be bugs.

echo alias xkcd="gwenview `w3m -dump http://xkcd.com/|grep png | awk '{print $5}'` 2> /dev/null" >> .bashrc
2010-01-30 20:38:16
User: GinoMan2440
Functions: alias echo
Votes: -5

Add an alias to your .bashrc that lets you issue the command xkcd to view (with gwenview) the newest xkcd comic. I know there are thousands of these out there, but this one at least comes with its own installer and uses a fairly concise syntax. Plus, gwenview shows the download progress as it fetches the comic and gives a more full-featured viewing experience.
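
Note that the backticks are expanded when the echo itself runs, so whatever comic URL is current at install time gets baked into .bashrc. Writing it as a function defers the fetch to each invocation; a hedged sketch (untested, reusing the same w3m/grep/awk extraction) that you could paste into .bashrc instead:

xkcd(){ gwenview "$(w3m -dump http://xkcd.com/ | grep png | awk '{print $5}')" 2> /dev/null; }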

define(){ local y="$@";curl -sA"Opera" "http://www.google.com/search?q=define:${y// /+}"|grep -Eo '<li>[^<]+'|sed 's/^<li>//g'|nl|/usr/bin/perl -MHTML::Entities -pe 'decode_entities($_)';}
2010-01-30 13:08:03
User: gthb
Functions: grep sed
Votes: 7

This version works on Mac (avoids grep -P, adding a sed step instead, and invokes /usr/bin/perl with full path in case you have another one installed).

Still requires that you install the Perl module HTML::Entities; here's how: http://www.perlmonks.org/?node_id=640489
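
In practice that usually boils down to one of these (assuming a standard CPAN setup):

cpan HTML::Entities

perl -MCPAN -e 'install HTML::Entities'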

define(){ local y="$@";curl -sA"Opera" "http://www.google.com/search?q=define:${y// /+}"|grep -Po '(?<=<li>)[^<]+'|nl|perl -MHTML::Entities -pe 'decode_entities($_)' 2>/dev/null;}
2010-01-29 05:01:11
User: eightmillion
Functions: grep perl
Votes: 18

This function takes a word or a phrase as arguments and then fetches definitions using Google's "define" syntax. The "nl" and perl portion isn't strictly necessary. It just makes the output a bit more readable, but this also works:

define(){ local y="$@";curl -sA"Opera" "http://www.google.com/search?q=define:${y// /+}"|grep -Po '(?<=<li>)[^<]+';}

If your version of grep doesn't have perl compatible regex support, then you can use this version:

define(){ local y="$@";curl -sA"Opera" "http://www.google.com/search?q=define:${y// /+}"|grep -Eo '<li>[^<]+'|sed 's/<li>//g'|nl|perl -MHTML::Entities -pe 'decode_entities($_)' 2>/dev/null;}
curl -u <username> http://app.boxee.tv/api/get_queue | xml2 | grep /boxeefeed/message/description | awk -F= '{print $2}'
2010-01-20 16:17:19
User: Strawp
Functions: awk grep
Tags: curl xml boxee
Votes: 0

Might be able to do it in fewer steps with xmlstarlet, although whether that would end up being shorter overall I don't know; xmlstarlet syntax confuses the heck out of me (see the sketch below).

Prompts for your password, or if you're a bit mental you can add your password into the command itself in the format "-u user:password".
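
For the curious, an xmlstarlet version might look something like this (an untested sketch; the XPath simply mirrors the /boxeefeed/message/description path grepped above):

curl -s -u <username> http://app.boxee.tv/api/get_queue | xmlstarlet sel -t -v '/boxeefeed/message/description' -n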

for file in $(seq -f '%03.f' 1 $TOTAL ); do echo "($file/$TOTAL)"; curl -f -O http://domain.com/Name_$file.ext; done
2010-01-12 15:23:44
User: nordri
Functions: echo file seq
Votes: -4

With a counter format of [001, 002, ..., 999]; nice for picture or wallpaper collections. Note that $TOTAL must be set first, as shown below.
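
A full invocation might look like this (the count of 70 is illustrative; the URL is a placeholder, as above):

TOTAL=70; for file in $(seq -f '%03.f' 1 $TOTAL ); do echo "($file/$TOTAL)"; curl -f -O http://domain.com/Name_$file.ext; done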

date -s "`curl -sI www.example.com | sed -n 's/^Date: //p'`"
2010-01-08 21:05:41
User: putnamhill
Functions: date
Tags: curl date
Votes: 0

If you don't have netcat, you can use curl.
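
If you do have netcat, a rough equivalent might look like this (an untested sketch; the tr -d '\r' strips the carriage returns that HTTP headers carry, a guard that may also be worth adding to the curl version if date rejects the string):

date -s "$(printf 'HEAD / HTTP/1.0\r\nHost: www.example.com\r\n\r\n' | nc www.example.com 80 | tr -d '\r' | sed -n 's/^Date: //p')"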

curl -sS -o $outfile -m $showlengthinseconds $streamurl
2009-12-22 22:37:42
User: sud0er
Tags: curl
Votes: 1

I use this with cron to timeshift radio programs from a station's live stream.

You will get an error message at the end like "curl: (28) Operation timed out after 10000 milliseconds with 185574 bytes received"; to suppress that but not other error messages, you can append 2>&1 | grep -v "(28)" to the end of the command, like so:
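
curl -sS -o $outfile -m $showlengthinseconds $streamurl 2>&1 | grep -v "(28)"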

curl -s search.twitter.com | awk -F'</?[^>]+>' '/\/intra\/trend\//{print $2}'
curl -sL 'dynamic.xkcd.com/comic/random/' | awk -F\" '/^<img/{printf("<?xml version=\"1.0\"?>\n<xkcd>\n<item>\n <title>%s</title>\n <comment>%s</comment>\n <image>%s</image>\n</item>\n</xkcd>\n", $6, $4, $2)}'
for /L %%x in (1,1,16) do mkdir %%x & curl -R -e http://www.kirtu.com -o %%x/#1.jpg http://www.kirtu.com/toon/content/sb%%x/english/sb%%x_en_[001-070].jpg
2009-12-08 15:01:16
User: MyTechieself
Functions: mkdir
Votes: -1

Bulk downloads the comic-strip JPG files for the adult cartoon Savitabhabhi, storing each set in its own folder. Requires manual removal of "non-image" files that may be created, because each series differs in length. The command can easily be adapted for UNIX flavours. You need to have cURL in your path.

curl -u twitter-username -d status="Hello World, Twitter!" -d source="cURL" http://twitter.com/statuses/update.xml
2009-12-08 14:54:33
User: MyTechieself
Tags: twitter curl api
Votes: 12

An improvement on the original (at http://www.commandlinefu.com/commands/view/2872/update-twitter-via-curl), in the sense that you see "from cURL" under your status message instead of just "from API" ;-) Twitter automatically links it to the cURL home page.

MyIps(){ echo -e "local:\n$(ifconfig $1 | grep -oP 'inet (add?r:)?\K(\d{1,3}\.){3}\d{1,3}')\n\npublic:\n$(curl -s sputnick-area.net/ip)"; }
2009-12-06 22:52:31
User: sputnick
Functions: echo
Votes: 1

Like the title says, you can pass an interface as an argument too:

MyIps eth0

will show only the IP of that interface plus your public IP. (Tested on Linux.)

You can add the function to your ~/.bashrc, then run

. ~/.bashrc

and the function will be available in all your terminals.

display http://dilbert.com$(curl -s dilbert.com|grep -Po '"\K/dyn/str_strip(/0+){4}/.*strip.[^\.]*\.gif')
2009-12-05 19:35:27
User: wizel
Functions: grep
Votes: 2

Requires ImageMagick's display command.

Corrected version, with thanks to users sputnick and eightmillion.

curl -o id.gif `date +http://d.yimg.com/a/p/uc/%Y%m%d/largeimagecrwiz%y%m%d.gif`
2009-12-04 20:16:32
User: putnamhill
Tags: curl date
Votes: 0

Requires the date command. This also works with some other comics. Here's a bash script that displays daily Garfield, Id, and Andy Capp:

http://putnamhill.net/cgi-bin/yahoo-comics.sh?ga,crwiz,crcap

wget `lynx --dump http://xkcd.com/|grep png`
curl -s 'xkcd.com' | awk -F\" '/^<img/{printf("<?xml version=\"1.0\"?>\n<xkcd>\n<item>\n <title>%s</title>\n <comment>%s</comment>\n <image>%s</image>\n</item>\n</xkcd>\n", $6, $4, $2)}'
2009-12-03 19:49:37
User: putnamhill
Functions: awk
Votes: 0

I wasn't sure how to display the image, so I thought I'd try XML for a different twist.

lynx --dump --source http://www.xkcd.com | grep `lynx --dump http://www.xkcd.com | egrep '(png|jpg)'` | grep title | cut -d = -f2,3 | cut -d '"' -f2,4 | sed -e 's/"/|/g' | awk -F"|" ' { system("display " $1);system("echo "$2); } '
2009-12-03 18:53:57
Functions: awk cut egrep grep
Votes: -1

Same thing, just a different way to get there. You will need lynx.

xkcd(){ wget -qO- http://xkcd.com/|tee >(feh $(grep -Po '(?<=")http://imgs[^/]+/comics/[^"]+\.\w{3}'))|grep -Po '(?<=(\w{3})" title=").*(?=" alt)';}
2009-11-27 09:11:47
User: eightmillion
Functions: grep tee wget
Votes: 24

This function displays the latest comic from xkcd.com. One of the best things about xkcd is the title text when you hover over the comic, so this function also displays that after you close the comic.

To get a random xkcd comic, I also use the following:

xkcdrandom(){ wget -qO- dynamic.xkcd.com/comic/random|tee >(feh $(grep -Po '(?<=")http://imgs[^/]+/comics/[^"]+\.\w{3}'))|grep -Po '(?<=(\w{3})" title=").*(?=" alt)';}
albumart(){ local y="$@";awk '/View larger image/{gsub(/^.*largeImagePopup\(.|., .*$/,"");print;exit}' <(curl -s 'http://www.albumart.org/index.php?srchkey='${y// /+}'&itempage=1&newsearch=1&searchindex=Music');}
2009-11-15 19:54:16
User: eightmillion
Votes: 7

This bash function uses albumart.org to find the cover for an album. It returns an amazon.com URL to the image.

Usage: albumart [artist] [album]

These arguments can be reversed, and if the album name is distinctive enough, it may be possible to omit the artist.

The command can be extended with wget to automatically download the matching image like this:

albumart(){ local x y="$@";x=$(awk '/View larger image/{gsub(/^.*largeImagePopup\(.|., .*$/,"");print;exit}' <(curl -s 'http://www.albumart.org/index.php?srchkey='${y// /+}'&itempage=1&newsearch=1&searchindex=Music'));[ -z "$x" ]&&echo "Not found."||wget "$x" -O "${y}.${x##*.}";}
geo(){ curl -s "http://www.geody.com/geoip.php?ip=$(dig +short $1)"| sed '/^IP:/!d;s/<[^>][^>]*>//g'; }
2009-11-12 17:14:09
User: dennisw
Functions: sed
Votes: 1

A function that takes a domain name as an argument and looks up the server's rough geographic location via geody.com.
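
For example:

geo example.com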

ompload() { curl -# -F file1=@"$1" http://ompldr.org/upload|awk '/Info:|File:|Thumbnail:|BBCode:/{gsub(/<[^<]*?\/?>/,"");$1=$1;print}';}
2009-11-07 20:56:52
User: eightmillion
Functions: awk
Votes: 8

This function uploads images to http://omploader.org and then prints out the links to the file.

Some coloring can also be added to the command with:

ompload() { curl -F file1=@"$1" http://omploader.org/upload|awk '/Info:|File:|Thumbnail:|BBCode:/{gsub(/<[^<]*?\/?>/,"");$1=$1;sub(/^/,"\033[0;34m");sub(/:/,"\033[0m:");print}';}