What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom, and you from theirs. All commands can be commented on, discussed and voted up or down.

If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are separate Twitter accounts for commands that receive at least 3 or 10 votes, so that only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).

May 19, 2015 - A Look At The New Commandlinefu
I've put together a short writeup on what kind of newness you can expect from the next iteration of clfu. Check it out here.
March 2, 2015 - New Management
I'm Jon, I'll be maintaining and improving clfu. Thanks to David for building such a great resource!

Commands tagged curl - 184 results
say() { curl -sA Mozilla -d q=`python3 -c 'from urllib.parse import quote_plus; from sys import stdin; print(quote_plus(stdin.read()[:100]))' <<<"$@"` 'http://translate.google.com/translate_tts' | mpg123 -q -; }
curl icanhazip.com
for i in {1..40};do echo -n $i. $(date +%H:%M:%S):\ ; (time curl 'http://ya.ru/' &> /dev/null) 2>&1|grep real;sleep 1;done
2011-11-11 10:40:38
User: AntonyC
Functions: date echo grep sleep time
Tags: curl

This uses curl and time(1) to measure the response time of a web service, sampling once per second for 40 iterations.

curl -O http://www.site.com/img/image[001-175].jpg
curl -b cookie.txt http://www.site.com/download/file.txt
curl -c cookie.txt -d username=hello -d password=w0r1d http://www.site.com/login
curl sputnick-area.net/ua
2011-10-24 09:57:50
User: sputnick
Tags: curl headers

That's useful when you're doing some web scraping (http://en.wikipedia.org/wiki/Web_scraping) and you want to verify that your possibly fake user-agent is actually being sent.

expandurl() { curl -sIL "$1" 2>&1 | awk '/^Location/ {print $2}' | tail -n1; }
2011-10-19 01:35:33
Functions: awk tail
Tags: curl

This shell function uses curl(1), which is more portable than wget(1) across Unices, to show what site a shortened URL points to, even when there are many nested shortened URLs. It is a refinement of www.commandlinefu.com/commands/view/9515/expand-shortened-urls that is better suited for use in scripts: only the final result is displayed.

expandurl http://t.co/LDWqmtDM
expandurl() { curl -sIL "$1" | grep ^Location; }
2011-10-19 00:56:53
User: atoponce
Functions: grep
Tags: curl

curl(1) is more portable than wget(1) across Unices, so here is an alternative that does the same thing with curl. This shell function shows what site a shortened URL is pointing to, even if there are many nested shortened URLs. It is a great way to test whether the shortened URL is sending you to a malicious site, or somewhere nasty that you don't want to visit. The sample output is from:

expandurl http://t.co/LDWqmtDM
curl -u username --silent "https://mail.google.com/mail/feed/atom" | awk 'BEGIN{FS="\n";RS="(</entry>\n)?<entry>"}NR!=1{print "\033[1;31m"$9"\033[0;32m ("$10")\033[0m:\t\033[1;33m"$2"\033[0m"}' | sed -e 's,<[^>]*>,,g' | column -t -s $'\t'
2011-10-15 23:15:52
User: frntn
Functions: awk column sed

Just an alternative with more advanced formatting for readability purposes. It now uses colors (too many for my taste, but it's a kind of proof of concept) and aligns the columns.

while sleep 30; do tput sc;tput cup 0 $(($(tput cols)-15));echo -n " New Emails: $(curl -u username:password --silent https://mail.google.com/mail/feed/atom | grep 'fullcount' | grep -o '[0-9]\+')";tput rc; done &
curl pagerank.bz/yourdomain.com
curl http://my-ip.cc/host.json
2011-09-01 00:31:49
User: samleb

JSON version.

Additionally it may give your geolocation if it's known by hostip.info

curl http://my-ip.cc/host.xml
2011-09-01 00:30:03
User: samleb

XML version.

Additionally it may give your geolocation if it's known by hostip.info

curl http://my-ip.cc/host.txt
2011-09-01 00:28:49
User: samleb

Additionally it may give your geolocation if it's known by hostip.info

function expand_url() { curl -sI "$1" | grep Location: | cut -d " " -f 2 | tr -d "\n" | pbcopy; }
2011-08-21 05:30:09
User: gt
Functions: cut grep tr

Expand a URL, i.e. do a HEAD request and read the Location header, then copy the resulting URL to the clipboard. Note that pbcopy is macOS-specific.

curl -s http://www.perl.org/get.html | grep -m1 '\.tar\.gz' | sed 's/.*perl-//; s/\.tar\.gz.*//'
if curl -s -I -H "Accept-Encoding: gzip,deflate" http://example.com/ | grep 'Content-Encoding: gzip' >/dev/null 2>&1 ; then echo Yes; else echo No;fi
curl -I -H "Accept-Encoding: gzip,deflate" http://example.org
isgd () { curl 'http://is.gd/create.php?format=simple&url='"$1" ; printf "\n"; }
2011-08-14 23:31:39
User: dbbolton
Functions: printf
Tags: curl shorturl url

Check the API: you shouldn't need sed. The printf at the end prints a newline to prevent zsh from inserting a % (its missing-newline marker) after the end of the output.

Also works with http://v.gd

curl -s --compressed http://funnyjunk.com | awk -F'"' '/ '"'"'mainpagetop24h'"'"'/ { print "http://funnyjunk.com"$4 }' | xargs curl -s | grep -o 'ht.*m/pictures/.*\.jpg\|ht.*m/gifs/.*\.gif' | grep "_......_" | uniq | xargs wget
2011-07-21 15:57:21
User: laniner
Functions: awk uniq xargs

If your version of curl does not support the --compressed option, use

curl -s http://funnyjunk.com | gunzip

instead of

curl -s --compressed http://funnyjunk.com
curl -s "$URL" |wc -c
2011-07-18 15:47:57
User: Mozai
Functions: wc
Tags: size curl http

Downloads the entire file just to count its bytes. This is the fallback approach because HTTP servers don't always provide the optional Content-Length header, and ftp/gopher/dict/etc. servers don't provide a file-size header at all.

curl --silent "FEED ADDRESS" | sed -e 's/<\/[^>]*>/\n/g' -e 's/<[^>]*>//g'
2011-04-11 14:08:50
User: ljmhk
Functions: sed
Tags: curl rss

Runs an RSS feed through sed, replacing the closing tags with newlines and stripping the opening tags, which makes the feed readable as plain text.

curl -I g.cn
curl -s http://whatthecommit.com/index.txt | cowsay