Dump HTTP header using lynx or w3m

lynx -dump -head http://www.example.com/
Without the -dump option the headers are displayed interactively in lynx itself rather than written to standard output. You can also use w3m; the equivalent command is w3m -dump_head http://www.example.com/
Sample Output
HTTP/1.1 200 OK
Date: Tue, 31 Mar 2009 18:14:46 GMT
Server: Apache/2.2.3 (CentOS)
Last-Modified: Tue, 15 Nov 2005 13:24:10 GMT
ETag: "b80f4-1b6-80bfd280"
Accept-Ranges: bytes
Content-Length: 438
Connection: close
Content-Type: text/html; charset=UTF-8

-1
By: penpen
2009-03-31 18:41:36
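
If you only need one field from the response, the dumped headers can be filtered with standard tools. A minimal sketch (Content-Type is just an example; any header name works):

# print only the Content-Type line from the response headers
lynx -dump -head http://www.example.com/ | grep -i '^Content-Type:'
# the w3m equivalent
w3m -dump_head http://www.example.com/ | grep -i '^Content-Type:'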

These Might Interest You

  • Same thing, just a different way to get there; you will need lynx.

    -1
    lynx --dump --source http://www.xkcd.com | grep `lynx --dump http://www.xkcd.com | egrep '(png|jpg)'` | grep title | cut -d = -f2,3 | cut -d '"' -f2,4 | sed -e 's/"/|/g' | awk -F"|" ' { system("display " $1);system("echo "$2); } '
    solarislackware · 2009-12-03 18:53:57 0
  • I save this to bin/iptrace and run "iptrace ipaddress" to get the country, city and state of an IP address, using the http://www.ip-adress.com service. To also get a tinyurl of the location map, I add the following to the script (a sketch of the full script follows this list): URL=`lynx -dump http://www.ip-adress.com/ip_tracer/?QRY=$1|grep details|awk '{print $2}'` lynx -dump http://tinyurl.com/create.php?url=$URL|grep tinyurl|grep "19. http"|awk '{print $2}'


    24
    lynx -dump http://www.ip-adress.com/ip_tracer/?QRY=$1|grep address|egrep 'city|state|country'|awk '{print $3,$4,$5,$6,$7,$8}'|sed 's\ip address flag \\'|sed 's\My\\'
    leftyfb · 2009-02-25 17:16:56 5
  • Get all URLs from a website via a regular expression. You must have lynx installed on your computer to execute the command: lynx --dump "<site>" | egrep -o "<pattern>", where <site> is the website you want to extract URLs from and <pattern> is the regular expression you want to filter with.


    0
    lynx --dump "http://www.google.com.br" | egrep -o "http:.*"
    felipelageduarte · 2011-09-05 01:12:15 0
  • lynx -dump ip.nu prints your public IP address. Sample output: Your IP address is 0.0.0.0


    0
    lynx -dump ip.nu
    ariver · 2012-05-02 14:54:12 0
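
The bin/iptrace script mentioned in the list above is only described, not shown. Here is a minimal sketch of what it might look like, built from the two lynx pipelines quoted in that item; the script name and lookup URLs come from the original submission, while the surrounding structure is an assumption:

#!/bin/sh
# iptrace: print country, city and state for an IP address via ip-adress.com,
# then print a tinyurl of the location map.
# Usage: iptrace <ip-address>

# location lookup (the submitted one-liner, verbatim)
lynx -dump http://www.ip-adress.com/ip_tracer/?QRY=$1|grep address|egrep 'city|state|country'|awk '{print $3,$4,$5,$6,$7,$8}'|sed 's\ip address flag \\'|sed 's\My\\'

# map URL and tinyurl (the extra lines the author mentions adding)
URL=`lynx -dump http://www.ip-adress.com/ip_tracer/?QRY=$1|grep details|awk '{print $2}'`
lynx -dump http://tinyurl.com/create.php?url=$URL|grep tinyurl|grep "19. http"|awk '{print $2}'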

What Others Think

There seems to be a problem with the Wiki software used on this site; the a tag of course is not part of the command!
penpen · 481 weeks and 2 days ago
