Commands by greatodensraven (1)

What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

Check These Out

Everything that exists in dir B and not in dir A will be copied from dir B to a new or existing dir C
This assumes dirs A, B and C are subdirs of the current dir. The exact syntax of the command is: rsync -v -r --size-only --compare-dest=/path_to_A/A/ /path_to_B/B/ /path_to_C/C/ (do not omit the trailing slashes, since that would copy only the names and not the contents of subdirs of dir B to dir C). You can replace --size-only with --checksum for more thorough file-difference validation. A useful switch is -n (--dry-run), which performs a trial run with no changes made.
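A hedged example of the workflow described above, assuming A, B and C all sit under a hypothetical /data directory; the dry run is shown first so the copy can be inspected before anything is written:

# preview what would be copied from B to C (nothing is changed)
rsync -v -r -n --size-only --compare-dest=/data/A/ /data/B/ /data/C/
# perform the actual copy once the preview looks right
rsync -v -r --size-only --compare-dest=/data/A/ /data/B/ /data/C/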

Block all IP addresses and domains that have attempted a brute-force SSH login to the computer
I use iptables to rate-limit connections instead. It is very easy and there are no ban lists to manage.
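A minimal sketch of the rate-limiting approach the comment describes, using iptables' recent module; port 22 and the 4-attempts-per-60-seconds threshold are assumptions, not part of the original command:

# track each new SSH connection attempt by source address
iptables -A INPUT -p tcp --dport 22 -m conntrack --ctstate NEW -m recent --set --name SSH
# drop a source that has opened 4 or more new connections within 60 seconds
iptables -A INPUT -p tcp --dport 22 -m conntrack --ctstate NEW -m recent --update --seconds 60 --hitcount 4 --name SSH -j DROP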

Quickly analyze apache logs for top 25 most common IP addresses.
This command is much quicker than the alternative of "sort | uniq -c | sort -n".
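The command itself is not reproduced on this page; a sketch in the same spirit, doing the counting inside awk rather than with sort | uniq -c | sort -n — the log path and the combined log format (client IP in the first field) are assumptions:

# count occurrences of each client IP, then show the 25 most frequent
awk '{ count[$1]++ } END { for (ip in count) print count[ip], ip }' /var/log/apache2/access.log | sort -rn | head -n 25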

Get AWS temporary credentials ready to export, based on a virtual MFA device
You might want to secure your AWS operations by requiring an MFA token. But then, to use the API or other tools, you need to pass credentials generated with that MFA token. This command asks you for the MFA code and retrieves the credentials using the AWS CLI. To print the exports, you can use: `awk '{ print "export AWS_ACCESS_KEY_ID=\"" $1 "\"\n" "export AWS_SECRET_ACCESS_KEY=\"" $2 "\"\n" "export AWS_SESSION_TOKEN=\"" $3 "\"" }'` You must adapt the command line to include:
* $MFA_ID, the ARN of the virtual MFA device or the serial number of the physical one
* the TTL for the credentials
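The retrieval step itself is not reproduced on this page; a hedged sketch of how the three whitespace-separated fields the awk above expects could be produced with the AWS CLI — the $MFA_CODE prompt and the 3600-second TTL are assumptions:

# prompt for the current code shown by the MFA device
read -p "MFA code: " MFA_CODE
# request temporary credentials and print AccessKeyId, SecretAccessKey and SessionToken on one line
aws sts get-session-token --serial-number "$MFA_ID" --token-code "$MFA_CODE" --duration-seconds 3600 --query 'Credentials.[AccessKeyId,SecretAccessKey,SessionToken]' --output text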

Send an HTTP HEAD request with curl
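The command body is not listed on this page; the idiomatic form uses curl's -I/--head option, which asks the server for headers only — the URL below is a placeholder:

# fetch only the response headers
curl -I https://example.com/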

Generate a list of files modified since the last commit and export them to a tar file
################################################################################
# get all files modified since the last commit and tar them up to upload to the live server
################################################################################
# delete the previous tar output file
rm -f mytarfile.tar
#rm -rf c:/tarOutput/*.*
# get the last commit id and store it in a variable
declare RESULT=$(git log --format="%H" | head -n1)
# generate the file list and append it to the tar file
git diff-tree -r --no-commit-id --name-only --diff-filter=ACMRT $RESULT | xargs tar -rf mytarfile.tar
# extract the tar contents to the specified location
tar -xf mytarfile.tar -C c:/tarOutput

Which processes are listening on a specific port (e.g. port 80)
Swap out "80" for your port of interest. You can use a port number or a named port, e.g. "http".
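The underlying command is not reproduced on this page; a common way to answer the question is lsof's -i filter, shown here as a sketch — the port (or service name) is the only thing to change:

# by numeric port
sudo lsof -i :80
# by named port
sudo lsof -i :http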

Update Twitter via curl
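The command body is not listed on this page. The classic one-liner for this relied on Twitter's basic-auth status API, which has long since been retired, so the sketch below is historical only — the username, password and message are placeholders:

# post a status update via the old basic-auth API (no longer functional)
curl -u username:password -d status="Hello from the shell" http://twitter.com/statuses/update.xml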

Get NFL/MLB Scores/Time
Change the "nfl" in the URL to "mlb" or "nba" to get those scores/times as well.

Install pip with Proxy
Installs pip packages through a defined proxy.
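The command itself is not reproduced on this page; pip's --proxy option is the usual way to do this — the proxy address, credentials and the package name below are placeholders:

# install a package while routing traffic through an HTTP proxy
pip install --proxy http://user:password@proxyserver:8080 requests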

