Commands by jaymzcd (4)


  • Mount every ISO image in the current directory: for each .iso file, this creates a directory named after the part of the filename before the first dot and loop-mounts the image into it. (2 votes)

    for file in *.iso; do mkdir `basename $file | awk -F. '{print $1}'`; sudo mount -t iso9660 -o loop $file `basename $file | awk -F. '{print $1}'`; done

    jaymzcd · 2009-10-17 20:07:31
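
    A quoting-safe sketch of the same loop, assuming bash: parameter expansion strips the .iso suffix (rather than everything after the first dot, as the one-liner does), and the quotes keep filenames with spaces intact.

        # Sketch: same idea, but safe for names containing spaces.
        for file in *.iso; do
            dir="${file%.iso}"    # strip the .iso suffix
            mkdir -p "$dir"
            sudo mount -t iso9660 -o loop "$file" "$dir"
        done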
  • If you use newsgroups then you'll have come across split files before. Joining a whole batch of them together can be a pain, so this does the entire folder in one go. (2 votes)

    for file in *.001; do NAME=`echo $file | cut -d. -f1,2`; cat "$NAME."[0-9][0-9][0-9] > "$NAME"; done

    jaymzcd · 2009-07-29 10:04:26
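
    A sketch of the same join for base names that contain extra dots, assuming bash: the trailing .001 is stripped with parameter expansion instead of cut, and the glob expands in sorted order, so the parts concatenate in sequence.

        for file in *.001; do
            name="${file%.001}"    # drop only the trailing .001
            cat "$name".[0-9][0-9][0-9] > "$name"
        done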
  • This prints a summary of the referers from your logs, keeping only those that occurred more than a certain number of times (in this case 500). The grep command excludes certain terms; I add this in to remove results I'm not interested in. (1 vote)

    awk -F\" '{print $4}' *.log | grep -v "eviljaymz\|\-" | sort | uniq -c | awk -F\ '{ if($1>500) print $1,$2;}' | sort -n

    jaymzcd · 2009-05-05 22:21:04
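
    The sort | uniq -c | awk round trip can be collapsed into one awk pass that counts referers in an associative array (a sketch, assuming combined-format logs where the fourth double-quote-delimited field is the referer; 500 is the same arbitrary threshold).

        awk -F'"' '{ hits[$4]++ }
            END { for (r in hits) if (hits[r] > 500) print hits[r], r }' *.log | sort -n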
  • I use this (well, normally I just drop the F=*.log bit and put the glob straight into the awk command) to count how many times I get referred from another site. I know it's rough; it's just to give me an idea of where any posts I make are ending up. The reason I do the Q="query" bit is that I often want to check another domain quickly, and it's fast to hit CTRL+A to jump to the start of the line and then CTRL+F three times to move forward to the query and change it. (I find this easier than moving backwards, because if you group a lot of domains with the pipe your command line can get quite messy, so it's easier to keep the query at the front, where you just edit it and hit enter.)

    For people new to the shell, it works as follows: the Q= and F= assignments just define names we can refer to; awk -F\" '{print $4}' $F reads the files matched by $F, splits each line on double quotes, and prints the fourth column, which in the log is the referer domain; egrep then matches our query against that list; finally, wc -l gives the total number of lines, i.e. matches. (0 votes)

    Q="reddit|digg"; F=*.log; awk -F\" '{print $4}' $F | egrep $Q | wc -l

    jaymzcd · 2009-05-05 21:51:16
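
    The same count wrapped in a small shell function, so the query becomes the argument you type last instead of the part you scroll back to edit (a sketch: refs is a made-up name, the *.log glob is hard-coded the way F=*.log was, and egrep -c replaces egrep | wc -l).

        # Usage: refs 'reddit|digg'
        refs() {
            awk -F'"' '{print $4}' *.log | egrep -c "$1"
        }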


