Commands by unixmonkey26318 (1)

What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

Check These Out

Quickly create an alias for changing into the current directory
Put the function in your .bashrc and use "map [alias]" to create the alias you want. Just be careful not to override an existing alias.
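The function itself isn't reproduced on this page; a minimal sketch of what such a .bashrc helper could look like (the name "map" comes from the description above, the body is an assumption) is:
map() { alias "$1"="cd '$PWD'"; }   # e.g. "map proj" lets you later type "proj" to jump back to this directory
The alias only lasts for the current shell session unless you also append it to your .bashrc.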

Kill all processes belonging to a user
This is a 'killall' equivalent for systems where that command is not available. Before running it, set the environment variable USERNAME to the user whose processes you want to kill, or replace $USERNAME in the command with the username itself. Side effect: any processes from other users that happen to have $USERNAME as a command-line argument will be killed as well (assuming you run this as root). The [-9] in square brackets at the end of the command is optional and should be your last resort; I prefer not to use it, since a process killed with -9 can leave a lot of mess behind.
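The one-liner itself isn't shown on this page; a grep-based sketch consistent with the description and its side effect (every part of this is an assumption about the original) would be:
ps -ef | grep "$USERNAME" | grep -v grep | awk '{ print $2 }' | xargs kill   # append -9 to kill only as a last resort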

Get AWS temporary credentials ready to export based on a MFA virtual appliance
You might want to secure AWS operations that require an MFA token. But then, to use the API or other tools, you need to pass credentials generated with that MFA token. This command asks you for the MFA code and retrieves the temporary credentials using the AWS CLI. To print the exports, you can use: `awk '{ print "export AWS_ACCESS_KEY_ID=\"" $1 "\"\n" "export AWS_SECRET_ACCESS_KEY=\"" $2 "\"\n" "export AWS_SESSION_TOKEN=\"" $3 "\"" }'` You must adapt the command line to include:
* $MFA_ID: the ARN of the virtual MFA device, or the serial number of a physical one
* the TTL for the credentials
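The command itself isn't reproduced here; a sketch of how this is commonly done with the AWS CLI (the sts get-session-token call, the prompt and the 3600-second TTL are assumptions) is:
read -p "MFA code: " MFA_CODE
aws sts get-session-token --serial-number "$MFA_ID" --token-code "$MFA_CODE" \
    --duration-seconds 3600 --output text \
    --query 'Credentials.[AccessKeyId,SecretAccessKey,SessionToken]' \
  | awk '{ print "export AWS_ACCESS_KEY_ID=\"" $1 "\"\n" "export AWS_SECRET_ACCESS_KEY=\"" $2 "\"\n" "export AWS_SESSION_TOKEN=\"" $3 "\"" }'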

Rotate a set of photos matching their EXIF data.
You need the jhead package.
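The command isn't shown on this page; assuming it relies on jhead's -autorot option (lossless rotation according to the EXIF Orientation tag, which needs jpegtran installed), a sketch would be:
jhead -autorot *.jpg   # or, recursively: find . -iname '*.jpg' -exec jhead -autorot {} +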

find and delete empty dirs, start in current working dir
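The one-liner isn't reproduced here; with GNU find this is commonly written as (a sketch, not necessarily the original):
find . -type d -empty -delete   # -delete implies depth-first traversal, so directories that become empty along the way are removed too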

diff current vi buffer edits against original file
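The command isn't shown here; the usual vim trick is to write the unsaved buffer to diff's stdin and compare it with the file on disk (a sketch, assuming vim):
:w !diff % -   # % is the file on disk, - is the buffer contents piped in by :w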

find .txt files inside a directory and replace every occurrence of a word inside them via sed
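The original command isn't reproduced here; with GNU sed's in-place editing it typically looks like this (the directory, the words and the -i flag are assumptions):
find /path/to/dir -name '*.txt' -exec sed -i 's/oldword/newword/g' {} +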

unbuffered tcpdump
Sometimes the question comes up: how do you get unbuffered tcpdump output into the next program in the pipe, i.e. when your OS forces you to wait for the buffer to fill before the next program sees any of the output? If you use -Uw- then you can't use -A (or -X or -XX) at the same time. Whenever the question comes up, I've never seen anyone suggest this simple solution: chaining two tcpdump instances.
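The command isn't reproduced on this page; a sketch of the chained-tcpdump idea (the interface name is an assumption) is:
tcpdump -i eth0 -U -w - | tcpdump -A -r -   # the first instance writes packet-buffered pcap to stdout, the second reads it from stdin and prints the ASCII payload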

Convert seconds to [DD:][HH:]MM:SS
Converts any number of seconds into days, hours, minutes and seconds.
sec2dhms() {
  declare -i SS="$1"
  D=$(( SS / 86400 ))
  H=$(( SS % 86400 / 3600 ))
  M=$(( SS % 3600 / 60 ))
  S=$(( SS % 60 ))
  [ "$D" -gt 0 ] && echo -n "${D}:"
  [ "$H" -gt 0 ] && printf "%02g:" "$H"
  printf "%02g:%02g\n" "$M" "$S"
}
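For example, sec2dhms 93784 prints 1:02:03:04 (1 day, 2 hours, 3 minutes, 4 seconds), and sec2dhms 125 prints 02:05.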

Auto-log commands
A quick alias I use right before logging into a server so that I have a log of the transactions as well as the ability to re-connect from another computer. Useful for when your boss says "what commands did you run again on that server?" and you had already closed the terminal ;) I wrapped it in a script now, with more features, but this is the heart of it. Never leave home without it.
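The alias itself isn't shown on this page; one possible sketch that gives both a transcript and the ability to re-attach from another machine is GNU screen with session logging (the alias and session names are assumptions, not the author's original):
alias autolog='screen -L -S worklog'   # -L logs everything to screenlog.0; re-attach later with: screen -r worklog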


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).
