Commands by leucos (1)

  • While editing a project under git, it is sometimes nice to sync changes to a test machine immediately. This command takes care of that, provided inotifywait is installed on the development machine. Note the -R (--relative) option to rsync: with rsync -R foo/bar/baz user@host:dest/dir/ it puts 'baz' in dest/dir/foo/bar/, which is what we want. The command can be turned into a function for additional flexibility (see the usage note after this entry):

        function gitwatch() {
            if [ -z "$1" ]; then
                echo "You must provide an rsync destination"
                return 1
            fi
            while true; do
                rsync -vR $(git ls-files | inotifywait -q -e modify -e attrib -e close_write --fromfile - --format '%w') "$1"
            done
        }


    while true; do rsync -vR $(git ls-files | inotifywait -q -e modify -e attrib -e close_write --fromfile - --format '%w') user@host:dest/dir/; done
    leucos · 2014-01-21 10:31:41
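
A quick usage note for the gitwatch function above (a sketch, assuming the function has been added to your shell and that user@host:dest/dir/ stands in for your own test machine and path):

    # watch tracked files and push each change to the test machine
    gitwatch user@host:dest/dir/

Each time inotifywait reports a change to one of the tracked files, its path (%w) is handed to rsync -vR, which copies it to the destination with its relative path preserved, and the loop goes back to waiting.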

What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

Check These Out

Getting the last argument from the previous command
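The command itself is not reproduced in this listing; in bash the usual trick (an assumption about what this entry covers) is the !$ history expansion or the $_ variable:

    mkdir -p /tmp/some/long/path
    cd !$        # !$ expands to the last argument of the previous command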

Get AWS temporary credentials ready to export based on a MFA virtual appliance
You might want to secure your AWS operations by requiring an MFA token. But then, to use the API or other tools, you need to pass credentials generated with that MFA token. This command asks you for the MFA code and retrieves temporary credentials using the AWS CLI. To print the exports, you can use: `awk '{ print "export AWS_ACCESS_KEY_ID=\"" $1 "\"\n" "export AWS_SECRET_ACCESS_KEY=\"" $2 "\"\n" "export AWS_SESSION_TOKEN=\"" $3 "\"" }'` You must adapt the command line to include: * $MFA_ID is the ARN of the virtual MFA device, or the serial number of a physical one * the TTL for the credentials
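The underlying one-liner is not shown on this page, but a sketch of how it might look with the AWS CLI (assuming $MFA_ID holds the device ARN or serial and a one-hour TTL) is:

    read -p "MFA code: " MFA_CODE
    aws sts get-session-token --serial-number "$MFA_ID" --token-code "$MFA_CODE" \
        --duration-seconds 3600 --output text \
        --query 'Credentials.[AccessKeyId,SecretAccessKey,SessionToken]' |
      awk '{ print "export AWS_ACCESS_KEY_ID=\"" $1 "\"\n" "export AWS_SECRET_ACCESS_KEY=\"" $2 "\"\n" "export AWS_SESSION_TOKEN=\"" $3 "\"" }'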

Test a serial connection
If the connection works you should see a "hello" on host A. If not: check your cabling etc. :-)
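The original one-liner is not reproduced here, but the classic version of this test looks roughly like the following (assuming the serial port is /dev/ttyS0 on both machines):

    # on host A (the receiving end)
    cat /dev/ttyS0

    # on host B (the sending end); both ends may need matching settings, e.g. stty -F /dev/ttyS0 9600
    echo "hello" > /dev/ttyS0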

find duplicate processes
This command searches for duplicate processes and sorts them by how many instances are running. Note that if the same process is run by different users, you will see only one user in the result line, so you will need to run $ ps aux | grep with the process name to see all the users running that command.
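The exact one-liner is not shown on this page; a typical way to count duplicate process names (an assumption, not necessarily the original command) is:

    ps -e -o comm= | sort | uniq -c | sort -rn | head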

Which processes are listening on a specific port (e.g. port 80)
Swap out "80" for your port of interest. You can use a port number or a named port, e.g. "http".
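The original command is not reproduced here; on Linux a common way to do this (an assumption) is with lsof or ss:

    lsof -i :80                  # or, by service name: lsof -i :http
    ss -ltnp '( sport = :80 )'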

dump database from postgresql to a file
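No description accompanies this entry; the standard tool for this is pg_dump, roughly (assuming a database named mydb):

    pg_dump mydb > mydb.sql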

delete all leading and trailing whitespace from each line in file
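The one-liner itself is not shown here; a common sed approach (an assumption, not necessarily the original) is:

    sed 's/^[[:space:]]*//;s/[[:space:]]*$//' file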

Find directory depth
Returns the directory depth.
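The command itself is not shown in this listing; one way to measure the maximum directory depth under the current directory (an assumption) is:

    find . -type d | awk -F/ '{ if (NF > max) max = NF } END { print max - 1 }'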

Delicious search with human readable output
You can install filterous with $ sudo apt-get install libxslt1-dev; sudo easy_install -U filterous

Function to change prompt
Bash function to change your default prompt to something simpler and restore it to normal afterwards.
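The function itself is not shown in this listing; a minimal sketch of the idea (the names sp and OLD_PS1 are made up here, not taken from the original) could be:

    function sp() {
        if [ -z "$OLD_PS1" ]; then
            OLD_PS1="$PS1"    # remember the current prompt
            PS1='\$ '         # switch to a minimal prompt
        else
            PS1="$OLD_PS1"    # restore the original prompt
            unset OLD_PS1
        fi
    }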


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):
