
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as for virtually every other subset (users, tags, functions, …).



News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try and put an end to the spamming, new commands require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.

Terminal - Commands tagged curl - 175 results
findlocation() { place=`echo $* | sed 's/ /%20/g'` ; curl -s "http://maps.google.com/maps/geo?output=json&oe=utf-8&q=$place" | grep -e "address" -e "coordinates" | sed -e 's/^ *//' -e 's/"//g' -e 's/address/Full Address/';}
findlocation() { place=`echo $1 | sed 's/ /%20/g'` ; curl -s "http://maps.google.com/maps/geo?output=json&oe=utf-8&q=$place" | grep -e "address" -e "coordinates" | sed -e 's/^ *//' -e 's/"//g' -e 's/address/Full Address/';}
2010-10-18 21:11:42
User: shadyabhi
Functions: grep sed
Tags: curl google
Votes: 2

Just add this to your .bashrc file.

Use quotes when the query contains multiple words.
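
For example, once the function is in your .bashrc and sourced (the place name here is just an illustration):

findlocation "Eiffel Tower, Paris"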

for url in `cat urls `; do title=`curl $url 2>&1 | grep -i '<title>.*</title>'` && curl $url > /tmp/u && mail -s "$title" your-private-instapaper-address@instapaper.com < /tmp/u ; done
2010-10-16 19:10:19
Functions: grep mail
Votes: -1

Note, you need to replace the email address with your private Instapaper email address.

There are a bunch of possible improvements, such as:

- Not writing a temp file

- Doesn't strip tags (though Instapaper does, thankfully)

- Shouldn't require 2 curls
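
A sketch addressing the first and last points, holding the page in a shell variable instead of a temp file so each URL is fetched only once (the Instapaper address is still a placeholder):

for url in `cat urls`; do page=$(curl -s "$url"); title=$(echo "$page" | grep -io '<title>.*</title>'); echo "$page" | mail -s "$title" your-private-instapaper-address@instapaper.com; done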

curl ifconfig.me
Check the Description below.
2010-10-07 04:22:32
User: hunterm
Votes: -1

The command was too long for the command box, so here it is:

echo $(( `wget -qO - http://i18n.counter.li.org/ | grep 'users registered' | sed 's/.*\<font size=7\>//g' | tr '\>' ' ' | sed 's/<br.*//g' | tr ' ' '\0'` + `curl --silent http://www.dudalibre.com/gnulinuxcounter?lang=en | grep users | head -2 | tail -1 | sed 's/.*<strong>//g' | sed 's/<\/strong>.*//g'` ))

This took me about an hour to do. It uses both wget and curl because dudalibre.com blocks wget, while wget worked nicely for the other site.

shout() { curl -s "http://shoutkey.com/new?url=${1}" | sed -n "/<h1>/s/.*href=\"\([^\"]*\)\".*/\1/p" ;}
shout () { curl -s "http://shoutkey.com/new?url=$1" | sed -n 's/\<h1\>/\&/p' | sed 's/<[^>]*>//g;/</N;//b' ;}
2010-10-04 23:50:54
User: elfreak
Functions: sed
Votes: 4

Just add this function to your .zshrc / .bashrc, and by typing "shout *URL*" you get a randomly chosen English word that ShoutKey.com uses to shorten your URL. You may then go to shoutkey.com/*output_word* and be redirected. The URL will be valid for 5 minutes.

(I've never used sed before, so I'll be quite glad if someone could straighten up the sed commands and combine them (perhaps also removing the whitespace). If so, I'll update it right away ;) )
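
Hypothetical usage, assuming the function has been sourced (the word in the output is random, so this is just an illustration):

shout http://www.commandlinefu.com

This would print something like http://shoutkey.com/heron, which then redirects to the original URL for 5 minutes.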

curl -n --ssl-reqd --mail-from "<user@gmail.com>" --mail-rcpt "<user@server.tld>" --url smtps://smtp.gmail.com:465 -T file.txt
2010-10-03 15:44:53
User: mitry
Votes: 12

Requires curl version >= 7.21; uses .netrc for authorization.
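
The matching ~/.netrc entry would look something like this (an assumption based on standard .netrc syntax; substitute your own credentials):

machine smtp.gmail.com login user@gmail.com password your-password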

googl () { curl -s -d "url=${1}" http://goo.gl/api/url | sed -n "s/.*:\"\([^\"]*\).*/\1\n/p" ;}
curl -s -d'&url=URL' http://goo.gl/api/url | sed -e 's/{"short_url":"//' -e 's/","added_to_history":false}/\n/'
2010-10-01 23:20:08
User: Soubsoub
Functions: sed
Votes: 5

Use curl and sed to shorten a URL using goo.gl without any other API.
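
Hypothetical usage of the function version, assuming it has been sourced:

googl http://www.commandlinefu.com

This prints the shortened http://goo.gl/... URL on its own line.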

imageshack() { for files in *; do curl -H Expect: -F fileupload="@$files" -F xml=yes -# "http://www.imageshack.us/index.php" | grep image_link | sed -e 's/<image_link>/[IMG]/g' -e 's/<\/image_link>/[\/IMG]/g'; done; }
2010-10-01 06:50:04
Functions: grep sed
Tags: curl for loop
Votes: 4

Each file in the current folder is uploaded to imageshack.us

If the folder contains other file types,

change:

for files in *

to:

for files in *.jpg

(to upload ONLY .jpg files)

Additionally you can try (results may vary):

for files in *.jpg *.png

The output URL is encased with BB image tags for use in a forum.

check(){ curl -sI $1 | sed -n 's/Location: *//p';}
curl -s http://urlxray.com/display.php?url=http://tinyurl.com/demo-xray | grep -o '<title>.*</title>' | sed 's/<title>.*--> \(.*\)<\/title>/\1/g'
2010-09-30 10:25:18
User: karpoke
Functions: grep sed
Tags: sed grep curl
Votes: -3

We can put this inside a function:

fxray() { curl -s http://urlxray.com/display.php?url="$1" | grep -o '<title>.*</title>' | sed 's/<title>.*--> \(.*\)<\/title>/\1/g'; }; fxray http://tinyurl.com/demo-xray
eog `curl -s http://xkcd.com/ | sed -n 's/<h3>Image URL.*: \(.*\)<\/h3>/\1/p'`
xdg-open http://xkcd.com/
2010-08-25 19:14:11
Votes: -5

KISS

To get a random xkcd comic:

xdg-open http://dynamic.xkcd.com/random/comic/
xkcd() { wget -qO- http://xkcd.com/ | sed -n 's#^<img src="\(http://imgs.[^"]\+\)"\s\+title="\(.\+\?\)"\salt.\+$#eog "\1"\necho '"'\2'#p" | bash ; }
2010-08-25 15:44:31
User: John_W
Functions: bash sed wget
Votes: 0

This function displays the latest comic from xkcd.com. One of the best things about xkcd is the title text when you hover over the comic, so this function also displays that after you close the comic.

To get a random xkcd comic use the following:

xkcdrandom() { wget -qO- http://dynamic.xkcd.com/comic/random | sed -n 's#^<img src="\(http://imgs.[^"]\+\)"\s\+title="\(.\+\?\)"\salt.\+$#eog "\1"\necho '"'\2'#p" | bash; }

These are just a bit shorter than the ones eigthmillion wrote; however, his versions didn't work as expected on my laptop for some reason (I got the title tag first), so these build a command which is then executed by bash.

eog `curl 'http://xkcd.com/' | awk -F "ng): |</h" '/embedding/{print $2}'`
gwenview `wget -O - http://xkcd.com/ | grep 'png' | grep '<img src="http://imgs.xkcd.com/comics/' | sed s/title=\".*//g | sed 's/.png\"/.png/g' | sed 's/<img src=\"//g'`
2010-08-24 22:21:51
User: hunterm
Functions: grep sed
Votes: -1

Fetch the HTML from xkcd's index page, filter it down to the comic's image URL, and then view the image in gwenview.

curl -sL 'www.commandlinefu.com/commands/random' | awk -F'</?[^>]+>' '/"command"/{print $2}'
2010-08-13 11:42:42
User: putnamhill
Functions: awk
Tags: awk curl random
Votes: 0

Splitting on tags in awk is a handy way to parse HTML.
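
A minimal illustration of the technique on made-up input: with the field separator set to a regex matching any opening or closing tag, the text between the tags becomes field 2.

echo '<div class="command">ls -la</div>' | awk -F'</?[^>]+>' '{print $2}'

This prints ls -la.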

curl -L -s `curl -s [http://podcast.com/show.rss] | xmlstarlet sel -t -m "//enclosure[1]" -v "@url" -n | head -n 1` | ssh -t [user]@[host] "mpg123 -"
2010-07-31 00:17:47
User: denzuko
Functions: head ssh
Votes: 0

Gets the latest show from your favorite podcast. Uses curl and xmlstarlet.

Make sure you change out the items between brackets.

curl -L -s `curl -s http://www.2600.com/oth-broadband.xml | xmlstarlet sel -t -m "//enclosure[1]" -v "@url" -n | head -n 1` | ssh -t [user]@[host] "mpg123 -"
Votes: 0

Ever wanted to stream your favorite podcast across the network? Well, now you can.

This command will parse the iTunes-enabled podcast feed and stream the latest episode across the network over an encrypted ssh connection.

FOR /f %%g in ('echo %1 ^| iconv -f gbk -t utf-8') DO curl -x proxy:port -u user:pass -d status=%%g -d source="cURL" http://twitter.com/statuses/update.xml
2010-07-21 04:53:54
User: MeaCulpa
Functions: iconv
Votes: -5

Aside from curl, one will need the iconv Windows binary, since Windows lacks a native UTF-8 CLI interface. In my case I need a proxy in China and iconv to convert the GBK status string into UTF-8. GnuWin32 is a good choice, with loads of coreutils natively ported to Windows.

"FOR /f" is the solution to pass iconv output to curl.

curl -s "http://www.socrata.com/api/views/vedg-c5sb/rows.json?search=Axelrod" | grep "data\" :" | awk '{ print $17 }'
2010-07-01 23:54:54
User: mheadd
Functions: awk grep
Tags: awk grep curl
Votes: 2

Query the Socrata Open Data API being used by the White House to find any employee's salary using curl, grep and awk.

Change the value of the search parameter (example uses Axelrod) to the name of any White House staffer to see their annual salary.
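
A hypothetical wrapper function in the spirit of the other snippets on this page (the name is my own invention):

whsalary() { curl -s "http://www.socrata.com/api/views/vedg-c5sb/rows.json?search=$1" | grep "data\" :" | awk '{ print $17 }'; }; whsalary Axelrod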

wget -qO- icanhazip.com
2010-06-24 03:49:14
Functions: wget
Votes: 11

Curl is not installed by default on many common distros anymore. wget always is :)

wget -qO- ifconfig.me/ip
lynx --dump icanhazip.com