Commands using awk (1,418)

What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


Check These Out

Check the status of a network interface
No need to install mii-tools; everything here uses generic tools that work under Arch Linux.
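The command itself isn't reproduced here; a minimal sketch using only iproute2 and awk (the field positions of `ip -o link` output are an assumption) would be:

$ ip -o link | awk '{print $2, $9}'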

Convert ascii string to hex
If you're going to use od, here's how to suppress the offset labels at the beginning. It also doesn't output the \x prefixes, hence the sed command at the end; remove it for space-separated hex values instead.
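The original one-liner isn't shown; a minimal sketch of the idea (the string "hello" is just an example, and the exact od flags may differ) is:

$ printf '%s' "hello" | od -An -tx1 | sed 's/ /\\x/g'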

Startup a VPN connection through command line
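No description or command was given for this entry; one common approach, assuming the VPN profile is already configured in NetworkManager ("MyVPN" is a placeholder name), is:

$ nmcli connection up id "MyVPN"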

tar the current directory without the absolute path
Tars the current directory (and its children) into an archive of the same name (plus ".tar" :)) in the parent directory, without the absolute path, so that when the archive is extracted, only the current directory name is created for the path. Assumes bash/zsh.
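The command itself isn't shown; a minimal sketch, assuming bash/zsh for the ${PWD##*/} expansion, is:

$ tar cf "../${PWD##*/}.tar" -C .. "${PWD##*/}"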

Display IPs accessing your Apache webserver.
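The command isn't reproduced here; a typical awk approach, assuming the Debian/Ubuntu log path (adjust /var/log/apache2/access.log for your distribution), is:

$ awk '{print $1}' /var/log/apache2/access.log | sort | uniq -c | sort -rn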

Convert CSV to JSON
Replace 'csv_file.csv' with your filename.
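The one-liner itself isn't shown; a minimal sketch using Python's standard csv and json modules (the original may well use a different tool) is:

$ python3 -c 'import csv,json,sys; print(json.dumps(list(csv.DictReader(open(sys.argv[1]))), indent=2))' csv_file.csv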

Get AWS temporary credentials ready to export based on a MFA virtual appliance
You might want to secure your AWS operations by requiring an MFA token. But then, to use the API or other tools, you need to pass credentials generated with that MFA token. This command asks you for the MFA code and retrieves the temporary credentials using the AWS CLI. To print the exports, you can use: `awk '{ print "export AWS_ACCESS_KEY_ID=\"" $1 "\"\n" "export AWS_SECRET_ACCESS_KEY=\"" $2 "\"\n" "export AWS_SESSION_TOKEN=\"" $3 "\"" }'`

You must adapt the command line to include:
* $MFA_ID: the ARN of the virtual MFA device, or the serial number of the physical one
* the TTL for the credentials
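The original command isn't reproduced here; a sketch of the retrieval step, assuming the AWS CLI's sts get-session-token subcommand and a placeholder 3600-second TTL (adapt $MFA_ID and the duration as noted above), piped into the awk snippet:

$ read -p "Enter MFA code: " MFA_CODE; aws sts get-session-token --serial-number "$MFA_ID" --token-code "$MFA_CODE" --duration-seconds 3600 --query 'Credentials.[AccessKeyId,SecretAccessKey,SessionToken]' --output text | awk '{ print "export AWS_ACCESS_KEY_ID=\"" $1 "\"\n" "export AWS_SECRET_ACCESS_KEY=\"" $2 "\"\n" "export AWS_SESSION_TOKEN=\"" $3 "\"" }'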

aplay some whitenoise
Plays white noise from /dev/urandom.
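Presumably something along these lines (the original may add sample-format options):

$ aplay /dev/urandom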

Function that outputs dots every second until command completes
Very useful in shell scripts because you can run a task nicely in the background using job control and output progress until it completes.

Here's an example of how I use it in backup scripts: run gpg in the background to encrypt an archive file (which I create in this same way). $! is the process ID of the last command run, which is saved here as the variable PI; sleeper is then called with the process ID of the gpg task (PI) and told to output ":" instead of the default "." every 3 seconds instead of the default 1. So a shorter version would be sleeper $!. The wait is also used here, though it may not be needed on your system.

$ echo ">>> ENCRYPTING SQL BACKUP"
$ gpg --output archive.tgz.asc --encrypt archive.tgz 1>/dev/null &
$ PI=$!; sleeper $PI ":" 3; wait $PI && rm archive.tgz &>/dev/null

Previously, to get around $! not always being available, I would instead check for the existence of the process ID by checking whether the directory /proc/$PID existed, but not everyone uses proc anymore. That version is currently the one at http://www.askapache.com/linux-unix/bash_profile-functions-advanced-shell.html, but I plan on upgrading to this new version soon.
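The function body isn't shown above; a minimal sketch matching the argument order in the example (PID, character, interval), using kill -0 rather than checking /proc, might look like:

sleeper() { while kill -0 "$1" 2>/dev/null; do printf '%s' "${2:-.}"; sleep "${3:-1}"; done; }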

Monitoring file handles used by a particular process
-r : repeat mode
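The exact command isn't shown; the "-r : repeat mode" note suggests lsof, so a sketch (the PID 1234 and the 5-second interval are placeholders) could be:

$ lsof -r5 -p 1234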


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):
