cut -d'/' -f3 file | sort | uniq -c

Count accesses per domain

Count the number of times each domain appears in a file whose lines are URLs of the form http://domain/resource.
Sample Output
6 www.domain.one.com
11 www.domain.two.com
3 www.otherdomain.com
2 www.domainfour.es

2
2010-05-23 16:02:51
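The same count can be done in a single pass with awk instead of sort | uniq -c (a sketch; "urls.txt" is a placeholder for your file of http://domain/resource lines):

```shell
# One-pass count per domain: field 3 of a '/'-split URL is the domain,
# tallied in an associative array; the trailing sort -n orders by count
awk -F'/' '{count[$3]++} END {for (d in count) print count[d], d}' urls.txt | sort -n
```

For huge files this avoids sorting every line first; only the distinct domains are held in memory.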

These Might Interest You

  • This alias finds identical lines in a file (or pipe) and prints a sorted count of them (the name "sucs" comes from the first letters of the commands). The first example shows the number of logins per user; the user who logged in most often comes last. The second example extracts web client IP addresses from a log file, then pipes the result through the "sucs" alias to find out which clients are making the most accesses. Or pipe the first column of ps(1) output through "sucs" to see how many processes each of your users is running.


    0
    alias sucs="sort | uniq -c | sort -n"
    inof · 2009-07-21 10:55:06 0
  • Change the $domain variable to whichever domain you wish to query. This works with the majority of whois output; for registrars it doesn't, you may have to compromise: domain=google.com; for a in $(whois $domain | grep "Domain servers in listed order:" --after 3 | grep -v "Domain servers in listed order:"); do echo ">>> Nameservers for $domain from $a Note that this doesn't work as well as the first one; if a domain has more than 3 nameservers, it won't hit them all. As the summary states, this can be useful for making sure the whois nameservers for a domain match the nameserver records (NS records) served by the nameservers themselves.


    2
    domain=google.com; for ns in $(whois $domain | awk -F: '/Name Server/{print $2}'); do echo ">>> Nameservers for $domain from $ns <<<"; dig @$ns $domain ns +short; echo; done;
    laebshade · 2011-05-08 04:46:34 0
  • Returns nothing if the domain exists and 'No match for domain.com' otherwise.


    6
    whois domainnametocheck.com | grep match
    Timothee · 2009-08-11 13:33:25 1
  • Sets the @ A record for your domain hosted by Namecheap to your current internet-facing IP address, logs success or failure with syslog, and logs the data returned to /root/dnsupdate. Change the XXX's as appropriate. More info at: http://www.namecheap.com/support/knowledgebase/article.aspx/29/


    1
    logger -tdnsupdate $(curl -s 'https://dynamicdns.park-your-domain.com/update?host=@&domain=xxx&password=xxx'|tee -a /root/dnsupdate|perl -pe'/Count>(\d+)<\/Err/;$_=$1eq"0"?"Update Successful":"Update failed"'&&date>>/root/dnsupdate)
    MagisterQuis · 2013-08-11 16:27:39 0
  • You need pwgen installed first; on Ubuntu you can install it with: sudo apt-get install pwgen


    3
    for domain in $(pwgen -1A0B 6 10); do echo -ne "$domain.com "; if [ -z "$(whois -H $domain.com | grep -o 'No match for')" ]; then echo -ne "Not "; fi; echo "Available for register"; done
    mariocesar · 2011-01-26 01:10:52 2
  • This one performs better, as it is a one-pass count with awk. For this script it might not matter, but for others it is a good optimization.


    0
    svn ls -R | egrep -v -e "\/$" | xargs svn blame | awk '{count[$2]++}END{for(j in count) print count[j] "\t" j}' | sort -rn
    kurzum · 2013-05-03 01:45:12 0
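The "sucs" alias above is just sort | uniq -c | sort -n glued together; a quick demonstration of the pattern on ad-hoc input:

```shell
# Count identical lines and print them in ascending order of frequency,
# the same pipeline the "sucs" alias wraps
printf 'alice\nbob\nalice\nalice\nbob\ncarol\n' | sort | uniq -c | sort -n
```

The least frequent line (carol, once) comes first and the most frequent (alice, three times) comes last, which is handy when the interesting entries are the heavy hitters at the bottom of a long list.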

What Others Think

I think you want to sort the lines before 'uniq -c'.
recursiverse · 417 weeks and 3 days ago
cut -d'/' -f3 <FILE> | sort | uniq -c
sputnick · 417 weeks and 3 days ago
I am not sure what your Apache logs look like, but this does not work at all on apache 2.2.9. Anyways, you have to sort first or you will get duplicate lines, and the count is worthless.
maedox · 417 weeks and 1 day ago
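The commenters' point is that uniq -c only collapses adjacent duplicate lines, so unsorted input yields fragmented counts; a minimal illustration:

```shell
# Without sort, the two "a" lines are counted separately because they
# are not adjacent (three output lines); with sort they collapse into
# a single count of 2 (two output lines)
printf 'a\nb\na\n' | uniq -c
printf 'a\nb\na\n' | sort | uniq -c
```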
