
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are separate Twitter accounts for commands that reach a minimum of 3 and of 10 votes respectively - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions, …).



News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try to put an end to the spamming, new commands now require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.

Checks apache's access_log file, strips the search queries and shoves them up your e-mail

awk '/q=/{print $11}' /var/log/httpd/access_log.4 | awk -F 'q=' '{print $2}' | sed 's/+/ /g;s/%22/"/g;s/q=//' | cut -d "&" -f 1
2009-11-22 17:13:48
User: isma
Functions: awk sed
1

As unixmonkey7109 pointed out, the first awk parse replaces three separate steps.
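
For anyone reading along, here is the same pipeline with each stage commented (a sketch only; field 11 assumes the combined log format, and the log path is the rotated file from the original command):

# field 11 of a combined-format log line is the quoted referer
awk '/q=/{print $11}' /var/log/httpd/access_log.4 |
# keep everything after "q=" in the referer
awk -F 'q=' '{print $2}' |
# turn "+" into spaces, decode %22 into quotes, drop any stray "q="
sed 's/+/ /g;s/%22/"/g;s/q=//' |
# keep only the query value, cutting at the next "&"
cut -d "&" -f 1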

Alternatives

There is 1 alternative - vote for the best!

cat /var/log/httpd/access_log | grep q= | awk '{print $11}' | awk -F 'q=' '{print $2}' | sed 's/+/ /g;s/%22/"/g;s/q=//' | cut -d "&" -f 1 | mail youremail@isp.com -s "[your-site] search strings for `date`"
2009-11-22 03:03:06
User: isma
Functions: awk cat grep sed strings
-2

It's not a big one-liner, and it *may not* work for everybody; it depends on how access_log is configured in your httpd.conf. I use it as a prerotate command for logrotate in the httpd section, so it executes before access_log rotation, every day at midnight.
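
For context, a prerotate hook along those lines might look roughly like this (a sketch only, assuming /etc/logrotate.d/httpd; the log path and e-mail address are placeholders, and the one-liner is the alternative shown above with mail's subject flag placed before the recipient):

# /etc/logrotate.d/httpd - mail the search strings before the log is rotated away
/var/log/httpd/access_log {
    daily
    rotate 7
    sharedscripts
    prerotate
        awk '/q=/{print $11}' /var/log/httpd/access_log | awk -F 'q=' '{print $2}' | sed 's/+/ /g;s/%22/"/g;s/q=//' | cut -d "&" -f 1 | mail -s "[your-site] search strings for `date`" youremail@isp.com
    endscript
}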


What others think

It is possible to use one instance of awk to pull the search strings and a little bit of Python to unescape them, so that it catches more than just %22.

awk '/q=/{FS="q=";split($2,a,"&");b=a[1];gsub(/".*$|\+/," ",b);if(b!="-")print b}' /var/log/httpd/access_log|python -c 'import sys,urllib2;print urllib2.unquote(sys.stdin.read().strip())'
Comment by eightmillion 243 weeks and 1 day ago

in that case, why not do everything in Python?

Comment by unixmonkey7109 243 weeks and 1 day ago

Because it's longer and uglier and more error prone:

python -c 'import urllib,re;print "\n".join(map(lambda x:urllib.unquote(re.findall(r"q=.*?[&\"]",x.split()[10])[0][2:-1].replace("+"," ")),filter(lambda x:"q=" in x,open("/var/log/httpd/access_log").readlines())))'
Comment by eightmillion 243 weeks and 1 day ago

No, it's not. Do it with a Python script, not as a one-liner like that (see the sketch below). Otherwise, use (g)awk, but there's no need to create extra pipes to cut+grep+sed.

Comment by unixmonkey7109 243 weeks and 1 day ago
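
For what it's worth, the script version of that idea stays short and readable (a sketch only, in Python 2 to match the one-liners above; the log path is a placeholder and the combined log format is assumed):

#!/usr/bin/env python
# Print decoded search strings from the referer field of a combined-format access_log.
import re, urllib

for line in open("/var/log/httpd/access_log"):
    fields = line.split()
    if len(fields) < 11 or "q=" not in fields[10]:
        continue
    # the q= value, up to the next "&" or the closing quote of the referer
    match = re.search(r'q=([^&"]*)', fields[10])
    if match:
        print urllib.unquote(match.group(1).replace("+", " "))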

Here's a sample matching line from access_log:

190.132.248.102 - - [23/Nov/2009:10:00:22 -0200] "GET /2009/10/14/ceibal-reflexiones/ HTTP/1.1" 200 30418 "

The output is:

ceibal reflexiones

Comment by isma 243 weeks and 1 day ago

oh.. url filter .. lol

pastebin.com/fa34b7a1

Comment by isma 243 weeks and 1 day ago

@isma, OK, but I don't see any "q=" in that sample. Please show an actual sample where there is "q=" in the line.

Comment by unixmonkey7109 243 weeks and 1 day ago

@unixmonkey7109 Uh... sorry, I think my comments were unclear. The first one was filtered by commandlinefu, so it didn't show up completely. The sample line (which contains a q=, as you will see) is at pastebin.com/fa34b7a1. I'm sorry this is getting so confusing. It's actually a good idea for monitoring your SEO tasks, and thanks to you, the command was nicely reduced (and, believe me, I learned a lot in that process). So thanks again, and maybe you can use it on your own access_log.

PLEASE NOTE (as I just realized): your httpd.conf should have the following line:

CustomLog "/var/log/httpd/access_log" combined

Like that, with the word "combined" instead of "common" as the suffix, so the log includes the referer (the stock definitions are shown below).

Comment by isma 243 weeks ago
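
For reference, the relevant httpd.conf lines look roughly like this (stock Apache definitions; the exact strings can vary between distributions) - "combined" appends the referer and user-agent fields that "common" lacks:

# "common" stops at the response size; "combined" adds referer and user-agent
LogFormat "%h %l %u %t \"%r\" %>s %b" common
LogFormat "%h %l %u %t \"%r\" %>s %b \"%{Referer}i\" \"%{User-Agent}i\"" combined
CustomLog "/var/log/httpd/access_log" combined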

