I find this terribly useful for grepping through a file when I'm looking for just one block of text. There's "grep -A # pattern file.txt" to see a specific number of lines following your pattern, but what if you want to see the whole block? Say, the output of "dmidecode" (as root):
dmidecode | awk '/Battery/,/^$/'
This shows everything from the line matching Battery up to the next blank line, i.e. the end of that block. Again, I find this extremely useful when I want to see whole blocks of text based on a pattern and don't care about the rest of the output. It could be used against a stanza-formatted file such as /etc/security/user on AIX to find the block for a specific user, or against Apache configs to pull out specific VirtualHost or Directory definitions. The scenarios go on for any text formatted in a block fashion. Very handy.
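For instance, to pull a single VirtualHost block out of an Apache config (a sketch; the config path is just an example):
awk '/<VirtualHost/,/<\/VirtualHost>/' /etc/apache2/sites-available/example.conf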
Using awk, find duplicates in a file without sorting it (sorting reorders the contents). awk keeps the lines in their original order while still finding and removing duplicates, and you can redirect the result into another file.
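The usual idiom is a one-liner along these lines ("seen" is just an array name):
awk '!seen[$0]++' input.txt > deduped.txt
The array counts how often each line has appeared; a line is printed only the first time, while its count is still zero.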
Written for Linux; the real point is how to produce ASCII text graphs based on a numeric value (anything where uniq -c is useful is a good candidate).
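A minimal sketch of the technique, assuming words.txt holds one item per line:
sort words.txt | uniq -c | sort -rn | awk '{printf "%-15s ", $2; for (i = 0; i < $1; i++) printf "*"; print ""}'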
Checks the Gmail Atom feed for your account, parses it, and outputs a list of unread messages.
For some reason sed gets stuck on OS X, so here's a Perl version for the Mac:
curl -u username:password --silent "https://mail.google.com/mail/feed/atom" | tr -d '\n' | awk -F '<entry>' '{for (i=2; i<=NF; i++) {print $i}}' | perl -pe 's/^<title>(.*)<\/title>.*<name>(.*)<\/name>.*$/$2 - $1/'
If you want to see the name of the last person who added a message to the conversation, change the greediness of the operators like this:
curl -u username:password --silent "https://mail.google.com/mail/feed/atom" | tr -d '\n' | awk -F '<entry>' '{for (i=2; i<=NF; i++) {print $i}}' | perl -pe 's/^<title>(.*)<\/title>.*?<name>(.*?)<\/name>.*$/$2 - $1/'
Blacklisted is a compiled list of all known dirty hosts (botnets, spammers, bruteforcers, etc.), updated on an hourly basis. This command gets the list and creates the rules for you; if you want them blocked automatically, append |sh to the end of the command line. It's more practical to block everything and allow in specifics; however, many can't do that, which is where this script comes in handy. For those using ipfw, a quick fix would be {print "add deny ip from "$1" to any"}. Be advised that the blacklisted file itself filters out RFC 1918 addresses (10.x.x.x, 172.16-31.x.x, 192.168.x.x); even so, it is advisable to check/parse the list before you implement the rules.
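A sketch of the idea, with BLACKLIST_URL standing in for wherever the list is published (the real URL lives in the command itself):
curl -s "$BLACKLIST_URL" | awk '!/^#/ && NF {print "iptables -A INPUT -s "$1" -j DROP"}'
Append | sh to actually load the rules.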
Print all columns except the 1st and 3rd.
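One way to do it, as a sketch with the column numbers hard-coded:
awk '{out = ""; for (i = 2; i <= NF; i++) if (i != 3) out = out (out ? OFS : "") $i; print out}' file.txt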
You have an external USB drive or key. Apply this command (using the file path of anything on your device) and it will simulate unplugging that device. If you just want the port, type:
echo $(sudo lshw -businfo | grep -B 1 -m 1 $(df "/path/to/file" | tail -1 | awk '{print $1}' | cut -c 6-8) | head -n 1 | awk '{print $1}' | cut -c 5- | tr ":" "-")
What happens here is we tell tar to create ("-c") an archive of all files in the current dir ("." , recursively) and output the data to stdout ("-f -"). Next we pass the total size of all files in the current dir to pv with "-s": du -sb . | awk '{print $1}' returns the number of bytes in the current dir, and that gets fed as the "-s" parameter to pv. Finally we gzip the whole stream and write the result to out.tgz. This way pv knows how much data is still left to be processed and can show us, for example, that it will take yet another 4 mins 49 secs to finish. Credit: Peteris Krumins http://www.catonmat.net/blog/unix-utilities-pipe-viewer/
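Put together, the pipeline being described looks like this (reconstructed from the description above):
tar -cf - . | pv -s $(du -sb . | awk '{print $1}') | gzip > out.tgz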
Use this command to get a list of committers sorted by the frequency of their commits.
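The command itself isn't reproduced here; for a git repository, an equivalent sketch would be:
git log --format='%an' | sort | uniq -c | sort -rn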
Show the number of failed login attempts per account. If the user does not exist, it is marked with *.
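A sketch of one way to get this from an sshd auth log (the log path and field positions vary by distribution; the * is appended for invalid users):
grep 'Failed password' /var/log/auth.log | awk '{if ($9 == "invalid") print $11 "*"; else print $9}' | sort | uniq -c | sort -rn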
I save this to bin/iptrace and run "iptrace ipaddress" to get the Country, City and State of an IP address using the http://www.ip-adress.com service. I add the following to my script to get a tinyurl of the map as well:
URL=`lynx -dump http://www.ip-adress.com/ip_tracer/?QRY=$1 | grep details | awk '{print $2}'`
lynx -dump http://tinyurl.com/create.php?url=$URL | grep tinyurl | grep "19. http" | awk '{print $2}'
This makes an alias for a command named 'busy'. The 'busy' command opens a random file in /usr/include at a random line with vim. Drop this in your .bash_aliases and make sure that file is sourced from your .bashrc.
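A sketch of such an alias, assuming GNU shuf is available:
alias busy='f=$(find /usr/include -type f -name "*.h" | shuf -n 1); vim +$((RANDOM % $(wc -l < "$f") + 1)) "$f"'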
This uses awk to grab the IP address from each request, then sorts, counts and lists the top 10.
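Assuming the common Apache log format, where the client IP is the first field (the log path is just an example):
awk '{print $1}' /var/log/apache2/access.log | sort | uniq -c | sort -rn | head -10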
Parses /etc/group into "dot" format and pipes it to "display" (ImageMagick) to show a useful diagram of users and groups (empty groups are not shown).
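A sketch of the idea (requires graphviz and ImageMagick; the graph styling is up to you):
awk -F: '$4 {n = split($4, u, ","); for (i = 1; i <= n; i++) print "\"" u[i] "\" -> \"" $1 "\";"}' /etc/group | (echo 'digraph groups {'; cat; echo '}') | dot -Tpng | display -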
When you fill in a form with Firefox, you see things you entered in previous forms with the same field names. This command lists everything Firefox has recorded. Using a "delete from", you can remove annoying Google queries, for example ;-)
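A sketch against the form-history database (the profile path varies):
sqlite3 ~/.mozilla/firefox/*.default/formhistory.sqlite 'SELECT fieldname, value FROM moz_formhistory;'
To drop stored Google queries, whose field name is "q":
sqlite3 ~/.mozilla/firefox/*.default/formhistory.sqlite "DELETE FROM moz_formhistory WHERE fieldname = 'q';"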
Takes an input file (count.txt) that looks like:
1
2
3
4
5
It will add/sum the first column of numbers.
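The command in question is presumably something like:
awk '{sum += $1} END {print sum}' count.txt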
It's not my code, but I found it useful for seeing how many open connections I have on a machine, to debug connections without opening another HTTP connection to do so. You can also decide to sort things differently than the way shown here.
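A common variant of the idea, as a sketch (netstat flags differ between platforms):
netstat -ant | awk '/^tcp/ {print $6}' | sort | uniq -c | sort -rn
which counts connections grouped by TCP state (ESTABLISHED, TIME_WAIT, and so on).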
This loops through all tables and changes their collations to UTF-8. You should back up beforehand, though, in case some data is lost in the process.
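A sketch for MySQL, assuming the database is called mydb and credentials come from ~/.my.cnf:
mysql -B -N -e 'SHOW TABLES' mydb | while read t; do mysql -e "ALTER TABLE \`$t\` CONVERT TO CHARACTER SET utf8 COLLATE utf8_general_ci" mydb; done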
Busiest seconds:
cat /var/log/secure.log | awk '{print substr($0,1,15)}' | uniq -c | sort -nr | awk '{printf("\n%s ",$0); for (i = 0; i < $1; i++) {printf("*")};}'
Find all computers connected to my host through a TCP connection.
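A sketch (IPv4 addresses and Linux netstat output assumed):
netstat -tn | awk '$6 == "ESTABLISHED" {split($5, a, ":"); print a[1]}' | sort -u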