Validates and pretty-prints the content fetched from the URL. Show Sample Output
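A minimal sketch of the idea, assuming JSON content and a hypothetical URL (for XML, xmllint --format - is the analogous pretty-printer):
curl -s http://example.com/data.json | python -m json.tool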
`aria2c` (from the aria2 project) allows segmented, multi-source downloads. Change -s 4 to an arbitrary number of segments to control the number of concurrent connections. It is also possible to provide multiple URLs to the same content (potentially over multiple protocols) to download the file concurrently from multiple hosts.
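A hedged example of such a multi-source download (both mirror URLs are hypothetical):
aria2c -s 4 http://mirror1.example.com/file.iso ftp://mirror2.example.com/file.iso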
Specifically, it matches connections whose source port is the HTTP or HTTPS port; you can check for destination ports as well. Show Sample Output
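A sketch of the destination-port variant using ss on Linux (the original command may differ):
ss -tn '( dport = :80 or dport = :443 )'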
jrk's aria2 example is incorrect. -s specifies the global connection limit; the per-host connection limit is specified with -x.
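The corrected invocation, allowing four connections to a single (hypothetical) host:
aria2c -x 4 -s 4 http://example.com/file.iso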
This one is tried and tested on Ubuntu 12.04. It works great for tailing any file over HTTP.
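A rough sketch of the technique, assuming a server that honors Range requests (the URL is hypothetical):
url=http://example.com/log.txt; offset=0
while sleep 2; do
  : > /tmp/chunk
  curl -sf -r "$offset-" "$url" -o /tmp/chunk
  [ -s /tmp/chunk ] && cat /tmp/chunk && offset=$((offset + $(wc -c < /tmp/chunk)))
done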
Raise your hand if you haven't used this at least once to share a directory quickly.
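Presumably the command in question (Python 2; on Python 3 the equivalent is python3 -m http.server):
python -m SimpleHTTPServer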
This command will show the current GMT time using HTTP. This might be useful if you just want to know the current, human-readable, accurate-enough time without changing the system time, using a simple command that works regardless of the availability of NTP. Note: for a quicker and more accurate response, replace google.com with your local NTP server. It can also be used as an alternative to the "htpdate" program: http://www.commandlinefu.com/commands/view/668/set-your-computers-clock-using-http-and-htp-http-time-protocol-when-ntpsntp-is-not-available Show Sample Output
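One hedged way to pull the time from the HTTP Date header with curl (the exact original command may differ):
curl -sI http://google.com | grep -i '^date:'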
Downloads the entire file; HTTP servers don't always provide the optional 'Content-Length:' header, and ftp/gopher/dict/etc. servers don't provide a filesize header at all.
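To check for the header without downloading anything (the URL is hypothetical):
curl -sI http://example.com/file.iso | grep -i '^content-length:'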
Traces HTTP requests on the specified interface. Uses the amazing tshark tool (http://www.wireshark.org/docs/man-pages/tshark.html). Show Sample Output
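A sketch of such a trace, assuming interface eth0 and a tshark recent enough to use -Y for display filters (older releases use -R):
tshark -i eth0 -f 'tcp port 80' -Y http.request -T fields -e http.host -e http.request.uri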
Easiest way to check which modules are loaded in apache. Show Sample Output
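Presumably something along these lines (apache2ctl on Debian-derived systems):
apachectl -M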
Ever wanted to stream your favorite podcast across the network? Well, now you can. This command parses the iTunes-enabled podcast feed and streams the latest episode across the network through ssh encryption. Show Sample Output
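A rough sketch of the approach (feed URL, host, and player are all hypothetical; the original one-liner may differ):
curl -s http://example.com/podcast.xml | grep -o 'url="[^"]*\.mp3"' | head -1 | cut -d'"' -f2 | xargs curl -s | ssh user@host 'mpg123 -'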
Generate a Netscape cookies file to use with Python's mechanize.
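curl writes its cookie jar in Netscape format, e.g. (the URL is hypothetical):
curl -c cookies.txt http://example.com/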
Where src or dst is the host whose HTTP headers you want to view.
Similar to the following:
curl -I <URL>
but curl -I performs a HEAD request, which can yield different results.
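A hedged sketch of such a capture (the host is hypothetical):
sudo tcpdump -A -s0 'tcp port 80 and (src host example.com or dst host example.com)'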
HTTP GET without LWP::Simple. Show Sample Output
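A minimal sketch using only core Perl modules (host and path are hypothetical):
perl -MIO::Socket::INET -e '$s = IO::Socket::INET->new("example.com:80") or die; print $s "GET / HTTP/1.0\r\nHost: example.com\r\n\r\n"; print while <$s>;'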
Activate the first alert, and the next ones are activated automatically. Show Sample Output
urls.txt should have a fully qualified URL on each line. Prefix with
rm log.txt;
to clear the log. Change the curl command to
curl --head $file | head -1 >> log.txt
to get just the HTTP status. A sketch of the whole loop follows. Show Sample Output
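A minimal sketch under those assumptions (the variable name $file comes from the snippet above):
while read -r file; do curl -s --head "$file" | head -1 >> log.txt; done < urls.txt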
This is a better way to do the "src X or dst X" filter; plus you might not want to bother with DNS lookups (-n).
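Presumably the improved filter, since the BPF 'host' primitive matches either direction:
tcpdump -n host X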