What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.

Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):

May 19, 2015 - A Look At The New Commandlinefu
I've put together a short writeup on what kind of newness you can expect from the next iteration of clfu. Check it out here.
March 2, 2015 - New Management
I'm Jon, I'll be maintaining and improving clfu. Thanks to David for building such a great resource!

Commands tagged http - 19 results
curl -sSi <URL> | sed '/^\r$/q'
2014-08-27 15:47:05
User: cbuckley
Functions: sed
Tags: curl http

Similar to the following:

curl -I <URL>

but curl -I performs a HEAD request, which can yield different results.
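
The sed trick can be demonstrated on a canned response, no network needed; headers end at the first blank CRLF-terminated line, where sed quits (the regex `\r` escape assumes GNU sed):

```shell
# Headers end at the first bare CR line; sed prints up to it and quits,
# so the body never reaches the terminal.
printf 'HTTP/1.1 200 OK\r\nContent-Type: text/html\r\n\r\n<html>body</html>\n' \
  | sed '/^\r$/q'
```

Because the server still receives a GET, the headers are exactly those a real GET would produce.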

aria2c -x 4 http://my/url
2014-07-26 03:06:33
User: lx45803

jrk's aria2 example is incorrect. -s specifies the global connection limit; the per-host connection limit is specified with -x.

(echo -e "HTTP/1.1 200 Ok\n\r"; tail -f /var/log/syslog) | nc -l 1234
2013-02-09 06:15:42
User: adimania
Functions: echo tail

This one is tried and tested on Ubuntu 12.04. Works great for tailing any file over HTTP.
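
The same trick serves any command's output. As a sketch (assuming a traditional netcat that exits when the client disconnects; the port and file are arbitrary), a static file can be served once:

```shell
# Emit a minimal HTTP response line, a blank line, then the file body;
# nc -l hands it to the first client that connects on port 1234.
{ printf 'HTTP/1.1 200 OK\r\n\r\n'; cat /etc/os-release; } | nc -l 1234
```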

tcpdump -s 1024 -l -A -n host
2013-02-02 22:29:52
User: wejn
Functions: host tcpdump
Tags: http tcpdump

This is a better way to do the "src X or dst X" filter; plus you might not want to bother with DNS lookups (-n).

tcpdump -s 1024 -l -A src or dst
2013-02-01 15:03:12
User: bleiva
Functions: tcpdump
Tags: http tcpdump

Where src or dst is the host whose HTTP headers you want to view.

curl -L -d "uid=<username>&pwd=<password>" http://www.example.com -c cookies.txt
2012-11-10 19:08:45
User: drwlrsn
Tags: curl http python

Generate a Netscape cookies file to use with Python's mechanize.

apache2 -l
httpd -M
2012-05-27 12:04:17
User: rockon

The easiest way to check which modules Apache has: apache2 -l lists the modules compiled in, while httpd -M lists all loaded modules, including shared ones.

curl -s "$URL" |wc -c
2011-07-18 15:47:57
User: Mozai
Functions: wc
Tags: size curl http

This downloads the entire file in order to count its bytes; HTTP servers don't always provide the optional 'Content-Length:' header, and ftp/gopher/dict/etc. servers don't provide a file-size header at all.
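
As a sketch of why this works: wc -c simply counts bytes on stdin, so the whole body has to be transferred. When the server does send Content-Length, a HEAD request avoids the download entirely (the grep line below is illustrative and needs a live URL):

```shell
# wc -c counts bytes on stdin, so the full body must cross the wire:
printf 'hello' | wc -c    # 5 bytes
# When the server supplies a length, a HEAD request is enough:
# curl -sI "$URL" | grep -i '^content-length'
```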

curl -Is google.com | grep Date
2011-06-24 11:19:47
User: d3Xt3r
Functions: grep
Tags: http date time

This command shows the current GMT time using HTTP. It is useful when you just want a human-readable, accurate-enough time without changing the system time, using a simple command that works regardless of the availability of NTP.

Note: To get a quicker and more accurate response, replace google.com with your local NTP server.
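
As a sketch, the Date header value can be extracted on its own and handed to date(1) for reformatting; example.com here stands in for any HTTP server, and the tr step strips the carriage return that HTTP headers carry:

```shell
# Pull out just the Date header value, minus the trailing CR.
curl -sI http://example.com \
  | sed -n 's/^[Dd]ate: *//p' | tr -d '\r'
# The result can be fed to `date -u -d "..."` (GNU date) for reformatting.
```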

It can also be used as an alternative to the "htpdate" program.


tshark -i en1 -z proto,colinfo,http.request.uri,http.request.uri -R http.request.uri
curl -I http://localhost
for file in `cat urls.txt`; do echo -n "$file " >> log.txt; curl --head $file >> log.txt ; done
2010-10-19 02:54:13
User: Glutnix
Functions: echo file

urls.txt should have a fully qualified URL on each line

prefix with

rm log.txt;

to clear the log

change curl command to

curl --head $file | head -1 >> log.txt

to just get the http status
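
A hedged variant of the loop above: while-read copes with very long lists and odd characters better than backtick expansion, and curl's -w '%{http_code}' writes only the status code, so no head is needed (file names are the same assumptions as above):

```shell
# Read urls.txt line by line; log "<url> <status>" per line.
while IFS= read -r url; do
  printf '%s %s\n' "$url" "$(curl -s -o /dev/null -w '%{http_code}' "$url")" >> log.txt
done < urls.txt
```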

curl -L -s "$(curl -s http://www.2600.com/oth-broadband.xml | xmlstarlet sel -t -m "//enclosure[1]" -v "@url" -n | head -n 1)" | ssh -t [user]@[host] "mpg123 -"

Ever wanted to stream your favorite podcast across the network? Now you can.

This command parses the iTunes-enabled podcast feed and streams the latest episode over an encrypted SSH connection.

tcpdump -i eth0 port 80 -w -
curl -s -L --head -w "%{http_code}\n" URL | tail -n1
curl -s "http://feeds.delicious.com/v2/json?count=5" | python -m json.tool | less -R
2010-03-24 09:15:12
User: keimlink
Functions: less python

Validates and pretty-prints the content fetched from the URL.
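
The json.tool module works on any stdin, so the trick can be tried without a network fetch; a local string behaves the same as the URL's response (python3 is assumed here, since plain `python` may be absent on modern systems):

```shell
# json.tool validates the input and re-emits it indented;
# invalid JSON makes it exit non-zero instead.
echo '{"count": 5, "items": []}' | python3 -m json.tool
```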

aria2c -s 4 http://my/url
2009-08-11 22:34:00
User: jrk

`aria2c` (from the aria2 project) supports segmented downloading. Change -s 4 to an arbitrary number of segments to control the number of concurrent connections. It is also possible to provide multiple URLs to the same content (potentially over multiple protocols) to download the file concurrently from multiple hosts.

curl -i -X HEAD http://localhost/