Commands by jaymzcd (4)


  • 2
    for file in *.iso; do mkdir `basename $file | awk -F. '{print $1}'`; sudo mount -t iso9660 -o loop $file `basename $file | awk -F. '{print $1}'`; done
    jaymzcd · 2009-10-17 20:07:31 1
  • If you use newsgroups then you'll have come across split files before. Joining a whole batch of them together can be a pain, so this does the whole folder in one go.


    2
    for file in *.001; do NAME=`echo $file | cut -d. -f1,2`; cat "$NAME."[0-9][0-9][0-9] > "$NAME"; done
    jaymzcd · 2009-07-29 10:04:26 1
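    A rough, commented sketch of the same loop (the example part names below are hypothetical):
    for file in *.001; do
        NAME=`echo $file | cut -d. -f1,2`         # video.avi.001 -> video.avi
        cat "$NAME."[0-9][0-9][0-9] > "$NAME"     # join video.avi.001, video.avi.002, ... into video.avi
    done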
  • This prints a summary of your referrers from your logs, as long as they occurred a certain number of times (in this case 500). The grep command excludes some terms; I add this in to remove results I'm not interested in.


    1
    awk -F\" '{print $4}' *.log | grep -v "eviljaymz\|\-" | sort | uniq -c | awk -F\ '{ if($1>500) print $1,$2;}' | sort -n
    jaymzcd · 2009-05-05 22:21:04 0
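    The same pipeline broken across lines with comments (a sketch; the threshold and the excluded terms are just this example's values):
    awk -F\" '{print $4}' *.log |              # 4th quote-delimited field of the log line is the referrer
      grep -v "eviljaymz\|\-" |                # drop referrers you are not interested in (and "-", the empty referrer)
      sort | uniq -c |                         # count how many times each referrer appears
      awk '{ if ($1 > 500) print $1, $2 }' |   # keep only referrers seen more than 500 times
      sort -n                                  # print them in ascending order of hits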
  • I use this (well, I normally just drop the F=*.log bit and put that straight into the awk command) to count how many times I get referred from another site. I know it's rough; it's just to give me an idea of where any posts I make are ending up. The reason I do the Q="query" bit is that I often want to check another domain quickly, and it's quick to use CTRL+A to jump to the start and then CTRL+F to move forward the 3 steps to change the grep query. (I find this easier than moving backwards, because if you group a lot of domains with the pipe your command line can get quite messy, so it's normally easier to have it all at the front so you just have to edit it and hit enter.) For people new to the shell, it does the following: the Q= and F= bits just make names we can refer to. The awk -F\" '{print $4}' $F part reads the file specified by $F and splits it up using double quotes, printing out the fourth column for egrep to work on; the fourth column in the log is the referrer domain. egrep then matches our query against this list from awk. Finally, wc -l gives us the total number of lines (i.e. matches).


    0
    Q="reddit|digg"; F=*.log; awk -F\" '{print $4}' $F | egrep $Q | wc -l
    jaymzcd · 2009-05-05 21:51:16 0
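    The same command spelled out with comments (a sketch; the domains in Q are only examples):
    Q="reddit|digg"               # referrer domains to look for, separated by | for egrep
    F=*.log                       # the log file(s) to read
    awk -F\" '{print $4}' $F |    # pull the referrer field out of each log line
      egrep "$Q" |                # keep only lines whose referrer matches the query
      wc -l                       # count the matching lines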

Check These Out

Encode a string using ROT47
This command will encode a string using the ROT47 cipher.
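The command itself isn't shown on this page. A common way to do ROT47 with tr, which may or may not be what this entry uses, is:
  echo 'Any string!' | tr '!-~' 'P-~!-O'    # rotate the printable ASCII range (33-126) by 47
Running the output through the same tr again gives the original string back, since ROT47 is its own inverse.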

Convert a string to "Title Case"
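No description is given for this one. A hypothetical way to do it with GNU sed (not necessarily the entry's command):
  echo 'the quick brown fox' | sed 's/\b./\u&/g'    # \u upper-cases the character after each word boundary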

Jump up to any directory above the current one
Usage: upto directory
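The function itself isn't reproduced here; one possible implementation, as a sketch rather than the original, is:
  upto() {
      # cd to the nearest ancestor directory whose name matches $1
      local d="$PWD"
      while [ "$d" != "/" ]; do
          d=$(dirname "$d")
          [ "$(basename "$d")" = "$1" ] && { cd "$d"; return; }
      done
      echo "upto: no ancestor directory named '$1'" >&2
      return 1
  }
So from /home/you/projects/app/src, `upto projects` would land you in /home/you/projects.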

limit the cdrom driver to a specified speed
This command limits the drive to 8x speed until the next eject of the disc. It can be useful when you don't want to listen to the sound of your CD-ROM drive.
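The command isn't reproduced here; on Linux this kind of limit is typically set with eject's -x option (an assumption about what the entry uses):
  eject -x 8 /dev/cdrom    # limit the drive to 8x until the disc is next ejected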

Copies currently played song in Audacious to selected directory
It might work for any music player if you replace "audacious2" with the string you see in `ps aux` for your player. Needs testing on different systems.

Split huge file into DVD+R size chunks for burning
Real DVD+R size is 4700372992 bytes, but I round down a little to be safe. To reconstitute use cat. "cat file.img.gz.aa file.img.gz.ab ..... > file.img.gz"
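The command isn't shown here; the usual tool is split with a chunk size a bit under the DVD+R capacity (the exact byte count below is an assumption):
  split -b 4690000000 file.img.gz file.img.gz.   # writes file.img.gz.aa, file.img.gz.ab, ...
  cat file.img.gz.* > file.img.gz                # reconstitute later, as described above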

Adding Prefix to File name
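No description or command is shown for this one; a typical loop for the job (a hypothetical example) is:
  for f in *; do mv -- "$f" "prefix_$f"; done    # rename every entry in the current directory to prefix_<name>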

Blue Matrix
Same as original, but works in bash

scroll file one line at a time (w/only UNIX base utilities)
Usage examples:
  ls largedir | rd
  lynx -dump largewebsite.com | rd
  rd < largelogfile
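The rd function itself isn't shown on this page; one possible implementation in plain bash (a sketch, not the original) is:
  rd() {
      # print one line of stdin per keypress, reading the keyboard from /dev/tty
      while IFS= read -r line; do
          printf '%s\n' "$line"
          read -r -s -n 1 _ < /dev/tty || break
      done
  }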

Get AWS temporary credentials ready to export based on a MFA virtual appliance
You might want to secure your AWS operations by requiring an MFA token. But then, to use the API or other tools, you need to pass credentials generated with that MFA token. This command asks you for the MFA code and retrieves the temporary credentials using the AWS CLI. To print the exports, you can use: `awk '{ print "export AWS_ACCESS_KEY_ID=\"" $1 "\"\n" "export AWS_SECRET_ACCESS_KEY=\"" $2 "\"\n" "export AWS_SESSION_TOKEN=\"" $3 "\"" }'` You must adapt the command line to include:
  * $MFA_ID: the ARN of the virtual MFA device, or the serial number of a physical one
  * the TTL for the credentials
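Only the export-formatting awk is quoted above. A sketch of the full retrieval it describes, built on aws sts get-session-token (the variable names and the one-hour TTL are assumptions), might look like:
  # $MFA_ID holds the ARN of the virtual MFA device (or the serial number of a physical one)
  read -p "MFA code: " MFA_CODE
  aws sts get-session-token --serial-number "$MFA_ID" --token-code "$MFA_CODE" --duration-seconds 3600 \
      --query 'Credentials.[AccessKeyId,SecretAccessKey,SessionToken]' --output text |
    awk '{ print "export AWS_ACCESS_KEY_ID=\"" $1 "\"\n" "export AWS_SECRET_ACCESS_KEY=\"" $2 "\"\n" "export AWS_SESSION_TOKEN=\"" $3 "\"" }'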

