What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes respectively - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as feeds for virtually every other subset (users, tags, functions, …).

May 19, 2015 - A Look At The New Commandlinefu
I've put together a short writeup on what kind of newness you can expect from the next iteration of clfu. Check it out here.
March 2, 2015 - New Management
I'm Jon, I'll be maintaining and improving clfu. Thanks to David for building such a great resource!

Psst. Open beta.

Wow, didn't really expect you to read this far down. The latest iteration of the site is in open beta. It's a gentle open beta; it's not in prime time just yet. It's being hosted over at UpGuard (link) and you are more than welcome to give it a shot. A couple of things:

  • The open beta is running a copy of the database that will not carry over to the final version. Don't post anything you don't mind losing.
  • If you wish to use your user account, you will probably need to reset your password.
Your feedback is appreciated via the form on the beta page. Thanks! -Jon & CLFU Team

Terminal - Commands tagged curl - 190 results
curl -u twitter-username -d status="Hello World, Twitter!" -d source="cURL" http://twitter.com/statuses/update.xml
2009-12-08 14:54:33
User: MyTechieself
Tags: twitter curl api

An improvement of the original (at: http://www.commandlinefu.com/commands/view/2872/update-twitter-via-curl) in the sense that you see a "from cURL" under your status message instead of just a "from API" ;-) Twitter automatically links it to the cURL home page.

MyIps(){ echo -e "local:\n$(ifconfig $1 | grep -oP 'inet (add?r:)?\K(\d{1,3}\.){3}\d{1,3}')\n\npublic:\n$(curl -s sputnick-area.net/ip)"; }
2009-12-06 22:52:31
User: sputnick
Functions: echo

As the title says, you can also pass an argument (the interface):

MyIps eth0

will show only the IP of that interface plus the public IP

(tested on Linux)

You can add the function to your ~/.bashrc, then

. ~/.bashrc

Now you are ready to call this function in all your terminals...

display http://dilbert.com$(curl -s dilbert.com|grep -Po '"\K/dyn/str_strip(/0+){4}/.*strip.[^\.]*\.gif')
2009-12-05 19:35:27
User: wizel
Functions: grep

Requires display.

Corrected version; thanks to users sputnick and eightmillion.

curl -o id.gif `date +http://d.yimg.com/a/p/uc/%Y%m%d/largeimagecrwiz%y%m%d.gif`
2009-12-04 20:16:32
User: putnamhill
Tags: curl date

Requires the date command. This also works with some other comics. Here's a bash script that displays daily Garfield, Id, and Andy Capp:
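
A minimal sketch of such a script, built on the URL pattern above (only the crwiz code is confirmed by the command; crgar for Garfield and crcap for Andy Capp are assumptions that may need adjusting, and display comes from ImageMagick):

#!/bin/bash
# Build today's URL for each comic, download it, and show it.
# Only "crwiz" is confirmed by the command above; the other codes are guesses.
for comic in crgar crwiz crcap; do
  url=$(date +"http://d.yimg.com/a/p/uc/%Y%m%d/largeimage${comic}%y%m%d.gif")
  curl -s -o "${comic}.gif" "$url" && display "${comic}.gif"
done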


wget `lynx --dump http://xkcd.com/|grep png`
curl -s 'xkcd.com' | awk -F\" '/^<img/{printf("<?xml version=\"1.0\"?>\n<xkcd>\n<item>\n <title>%s</title>\n <comment>%s</comment>\n <image>%s</image>\n</item>\n</xkcd>\n", $6, $4, $2)}'
2009-12-03 19:49:37
User: putnamhill
Functions: awk

I wasn't sure how to display the image, so I thought I'd try XML for a different twist.

lynx --dump --source http://www.xkcd.com | grep `lynx --dump http://www.xkcd.com | egrep '(png|jpg)'` | grep title | cut -d = -f2,3 | cut -d '"' -f2,4 | sed -e 's/"/|/g' | awk -F"|" ' { system("display " $1);system("echo "$2); } '
2009-12-03 18:53:57
Functions: awk cut egrep grep

Same thing, just a different way to get there. You will need lynx.

xkcd(){ wget -qO- http://xkcd.com/|tee >(feh $(grep -Po '(?<=")http://imgs[^/]+/comics/[^"]+\.\w{3}'))|grep -Po '(?<=(\w{3})" title=").*(?=" alt)';}
2009-11-27 09:11:47
User: eightmillion
Functions: grep tee wget

This function displays the latest comic from xkcd.com. One of the best things about xkcd is the title text when you hover over the comic, so this function also displays that after you close the comic.

To get a random xkcd comic, I also use the following:

xkcdrandom(){ wget -qO- dynamic.xkcd.com/comic/random|tee >(feh $(grep -Po '(?<=")http://imgs[^/]+/comics/[^"]+\.\w{3}'))|grep -Po '(?<=(\w{3})" title=").*(?=" alt)';}
albumart(){ local y="$@";awk '/View larger image/{gsub(/^.*largeImagePopup\(.|., .*$/,"");print;exit}' <(curl -s 'http://www.albumart.org/index.php?srchkey='${y// /+}'&itempage=1&newsearch=1&searchindex=Music');}
2009-11-15 19:54:16
User: eightmillion

This bash function uses albumart.org to find the cover for an album. It returns an amazon.com URL to the image.

Usage: albumart [artist] [album]

The arguments can be reversed, and if the album name is distinct enough, it may be possible to omit the artist.

The command can be extended with wget to automatically download the matching image like this:

albumart(){ local x y="$@";x=$(awk '/View larger image/{gsub(/^.*largeImagePopup\(.|., .*$/,"");print;exit}' <(curl -s 'http://www.albumart.org/index.php?srchkey='${y// /+}'&itempage=1&newsearch=1&searchindex=Music'));[ -z "$x" ]&&echo "Not found."||wget "$x" -O "${y}.${x##*.}";}
geo(){ curl -s "http://www.geody.com/geoip.php?ip=$(dig +short $1)"| sed '/^IP:/!d;s/<[^>][^>]*>//g'; }
2009-11-12 17:14:09
User: dennisw
Functions: sed

A function that takes a domain name as an argument and prints the GeoIP information for that host.
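
For example (the exact output depends on geody.com's response format):

geo www.commandlinefu.com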

ompload() { curl -# -F file1=@"$1" http://ompldr.org/upload|awk '/Info:|File:|Thumbnail:|BBCode:/{gsub(/<[^<]*?\/?>/,"");$1=$1;print}';}
2009-11-07 20:56:52
User: eightmillion
Functions: awk

This function uploads images to http://omploader.org and then prints out the links to the file.

Some coloring can also be added to the command with:

ompload() { curl -F file1=@"$1" http://omploader.org/upload|awk '/Info:|File:|Thumbnail:|BBCode:/{gsub(/<[^<]*?\/?>/,"");$1=$1;sub(/^/,"\033[0;34m");sub(/:/,"\033[0m:");print}';}
GeoipLookUp(){ curl -A "Mozilla/5.0" -s "http://www.geody.com/geoip.php?ip=$1" | grep "^IP.*$1" | html2text; }
2009-11-06 00:32:27
User: sputnick
Functions: grep
Tags: sed curl

This defines a function you can put in your ~/.bashrc and then run from any terminal, with an IP address as the argument.
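
For example, with an arbitrary public IP (requires html2text):

GeoipLookUp 8.8.8.8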

curl -s "http://www.geody.com/geoip.php?ip=$(curl -s icanhazip.com)" | sed '/^IP:/!d;s/<[^>][^>]*>//g'
2009-11-04 07:15:02
User: getkaizer
Functions: sed
Tags: sed curl

Not my script. Belongs to mathewbauer. Used without his permission.

This script gives a single line as shown in the sample output.

NOTE: I have blanked out the IP address for obvious security reasons, but you will get whatever your IP is if you run the script.

Tested working in bash.

curl ip.appspot.com
2009-10-31 21:11:10
User: ktoso

Yeah, I know it's been up here a million times, but this service is a really clean and nice one: nothing but your IP address on it. Actually I was about to write something like this myself, then noticed this on appspot... ;)

curl -fs brandx.jp.sme > /dev/null 2>&1 || echo brandx.jp.sme ping failed | mail -ne -s 'Server unavailable' [email protected]
2009-10-23 14:29:06
User: mccalni
Functions: echo mail ping
Tags: bash ping curl mail

Alternative to the ping check if your firewall blocks ping. Uses curl to fetch the landing page silently, or fail with an error code. You can probably do this with wget as well; see the sketch below.
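
A rough wget equivalent, assuming wget's --spider mode (which checks the URL without downloading the body) is acceptable for this kind of check:

wget -q --spider http://brandx.jp.sme || echo brandx.jp.sme ping failed | mail -ne -s 'Server unavailable' [email protected]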

b="http://2010.utosc.com"; for p in $( curl -s $b/presentation/schedule/ | grep /presentation/[0-9]*/ | cut -d"\"" -f2 ); do f=$(curl -s $b$p | grep "/static/slides/" | cut -d"\"" -f4); if [ -n "$f" ]; then echo $b$f; curl -O $b$f; fi done
2009-10-11 17:28:46
User: danlangford
Functions: cut echo grep
Tags: curl cut for UTOSC

Miss a class at UTOSC 2010? Need a refresher? Use this to curl down all the presentations from the UTOSC website (http://2010.utosc.com). NOTE/WARNING: this will dump them in the current directory; there are around 37 and some are big. Tested on OS X 10.6.1.
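
For readability, here is the same loop unrolled into a short script (a sketch; it should behave like the one-liner above):

#!/bin/bash
b="http://2010.utosc.com"
# Walk the schedule page for links to individual presentations.
for p in $(curl -s "$b/presentation/schedule/" | grep "/presentation/[0-9]*/" | cut -d'"' -f2); do
  # Pull the slide link, if any, from each presentation page.
  f=$(curl -s "$b$p" | grep "/static/slides/" | cut -d'"' -f4)
  if [ -n "$f" ]; then
    echo "$b$f"
    curl -O "$b$f"
  fi
done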

curl "http://www.house.gov/house/MemberWWW.shtml" 2>/dev/null | sed -e :a -e 's/<[^>]*>//g;/</N;//ba' | perl -nle 's/^\t\t(.*$)/ $1/ and print;'
2009-09-24 23:37:36
User: drewk
Functions: perl sed
Tags: perl sed curl

Uses curl to download the membership page of the US House of Representatives, sed to strip the HTML, and perl to print lines starting with two tabs (the lines with a representative).

echo $(date +%s) > start-time; URL=http://www.google.com; while true; do echo $(curl -L -w '%{speed_download}' -o /dev/null -s $URL) >> bps; sleep 10; done &
2009-09-19 21:26:06
User: matthewbauer
Functions: date echo sleep

This will log your internet download speed.

You can run

gnuplot -persist <(echo "plot 'bps' with lines")

to get a graph of it.
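
If you would rather plot speed against time, a variant along these lines (an assumption, not part of the original) logs an epoch timestamp with each sample:

URL=http://www.google.com
while true; do
  # Record "epoch-seconds download-speed" pairs every 10 seconds.
  echo "$(date +%s) $(curl -L -w '%{speed_download}' -o /dev/null -s $URL)" >> bps
  sleep 10
done &

Then plot the second column against the first:

gnuplot -persist <(echo "plot 'bps' using 1:2 with lines")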

curl -u username:password --silent "https://mail.google.com/mail/feed/atom" | tr -d '\n' | awk -F '<entry>' '{for (i=2; i<=NF; i++) {print $i}}' | sed -n "s/<title>\(.*\)<\/title.*name>\(.*\)<\/name>.*/\2 - \1/p"
2009-09-07 21:56:40
User: postrational
Functions: awk sed tr

Checks the Gmail ATOM feed for your account, parses it and outputs a list of unread messages.

For some reason sed gets stuck on OS X, so here's a Perl version for the Mac:

curl -u username:password --silent "https://mail.google.com/mail/feed/atom" | tr -d '\n' | awk -F '<entry>' '{for (i=2; i<=NF; i++) {print $i}}' | perl -pe 's/^<title>(.*)<\/title>.*<name>(.*)<\/name>.*$/$2 - $1/'

If you want to see the name of the last person who added a message to the conversation, change the greediness of the operators like this:

curl -u username:password --silent "https://mail.google.com/mail/feed/atom" | tr -d '\n' | awk -F '<entry>' '{for (i=2; i<=NF; i++) {print $i}}' | perl -pe 's/^<title>(.*)<\/title>.*?<name>(.*?)<\/name>.*$/$2 - $1/'
curl -Is slashdot.org | sed -ne '/^X-[FBL]/s/^X-//p'
curl -s http://tinyurl.com/create.php?url=http://<website.url>/ | sed -n 's/.*\(http:\/\/tinyurl.com\/[a-z0-9][a-z0-9]*\).*/\1/p' | uniq
curl -s "http://services.digg.com/stories?link=$NEWSURL&appkey=http://www.whatever.com&type=json" | python -m simplejson.tool | grep diggs
tweet(){ curl -u "$1" -d status="$2" "http://twitter.com/statuses/update.xml"; }
2009-08-23 16:56:24
User: Code_Bleu

Type the command in the terminal and press enter to create the tweet() function. Then run as follows:

tweet MyTwitterAccount "My message goes here"

It will prompt you for your password. Make sure you escape special characters in the message with "\" when showing variables or markup; see the example below.
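For example, to include a literal dollar sign without the shell expanding it:

tweet MyTwitterAccount "Lunch was only \$5 today"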

curl http://www.phrack.org/archives/tgz/phrack[1-67].tar.gz -o phrack#1.tar.gz
urls=('www.ubuntu.com' 'google.com'); for i in ${urls[@]}; do http_code=$(curl -I -s $i -w %{http_code}); echo $i status: ${http_code:9:3}; done
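
A slightly more direct variant (a sketch; it asks curl itself to print the status code instead of slicing the header text):

urls=('www.ubuntu.com' 'google.com'); for i in "${urls[@]}"; do echo "$i status: $(curl -sI -o /dev/null -w '%{http_code}' "$i")"; done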