Commands by pdwalker (1)

  • Recompresses all gz files to bz2 files from this point and below in the directory tree.
  • Output shows the size of the original file and the size of the new file. Useful.
  • Conceptually easier to understand than playing tricks with awk and sed.
  • Don't like the output? Use the following line instead:

    for gz in `find . -type f -name '*.gz' -print`; do f=`basename $gz .gz` && d=`dirname $gz` && gunzip -c $gz | bzip2 - -c > $d/$f.bz2 && rm -f $gz ; done


    for gz in `find . -type f -name '*.gz' -print`; do f=`basename $gz .gz` && d=`dirname $gz` && echo -n `ls -s $gz` "... " && gunzip -c $gz | bzip2 - -c > $d/$f.bz2 && rm -f $gz && echo `ls -s $d/$f.bz2`; done
    pdwalker · 2014-03-13 08:36:24
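
Note that the backtick loop splits filenames on whitespace, so paths containing spaces will break it. A minimal sketch of a safer variant using a null-delimited find | while-read loop (same behaviour otherwise; an adaptation, not the author's line):

    find . -type f -name '*.gz' -print0 | while IFS= read -r -d '' gz; do
        f=$(basename "$gz" .gz)     # filename without the .gz suffix
        d=$(dirname "$gz")          # directory holding the original file
        gunzip -c "$gz" | bzip2 -c > "$d/$f.bz2" && rm -f "$gz"
    done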



Check These Out

Which processes are listening on a specific port (e.g. port 80)
Swap out "80" for your port of interest. You can use a port number or a named port, e.g. "http".
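
The command itself isn't reproduced on this page; a minimal sketch, assuming lsof is installed (ss from iproute2 shown as an alternative):

    # TCP processes listening on port 80; swap in your port, or a named port like "http"
    lsof -iTCP:80 -sTCP:LISTEN
    # alternative using ss (numeric ports only, because of -n):
    ss -ltnp 'sport = :80'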

Convert epoch date to human readable date format in a log file.
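
The original command isn't shown here; a sketch using GNU awk (strftime is a gawk extension), assuming the epoch stamp is the first field of each line and app.log is a placeholder name:

    gawk '{ $1 = strftime("%Y-%m-%d %H:%M:%S", $1) } 1' app.log   # the trailing "1" prints every rewritten line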

Recursively delete only the empty folders under the present dir
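
The command isn't shown on this page; with GNU find this is a one-liner (-delete implies depth-first traversal, so directories that only contained empty directories get removed too):

    find . -type d -empty -delete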

copy working directory and compress it on-the-fly while showing progress
What happens here is we tell tar to create ("-c") an archive of all files in the current dir "." (recursively) and output the data to stdout ("-f -"). Next we pass pv the size ("-s") of all files in the current dir: "du -sb . | awk '{print $1}'" returns the number of bytes in the current dir, and that gets fed as the "-s" parameter to pv. Finally we gzip the whole content and write the result to the out.tgz file. This way pv knows how much data is still left to be processed and shows us, for example, that it will take yet another 4 mins 49 secs to finish. Credit: Peteris Krumins http://www.catonmat.net/blog/unix-utilities-pipe-viewer/
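
Reassembled from the description above (the exact original line isn't shown on this page):

    tar -cf - . | pv -s $(du -sb . | awk '{print $1}') | gzip > out.tgz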

Count accesses per domain
Count the times a domain appears in a file whose lines are URLs of the form http://domain/resource.
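
A sketch of one way to do it, not necessarily the original command: splitting on "/" makes the domain the third field (urls.txt is a placeholder):

    awk -F/ '{print $3}' urls.txt | sort | uniq -c | sort -rn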

Resize an image to at least a specific resolution
This command will resize an image (keeping the aspect ratio) to a specific resolution, meaning the resulting image will never be smaller than this resolution. For example, if we have a 2048x1000 image, the output would be 1229x600, not 1024x600 or 1024x500. Same thing for the height, if the image is 2000x1200, the output would be 1024x614.
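
This matches ImageMagick's "fill area" geometry flag (^), which scales to the smallest size covering both dimensions while keeping the aspect ratio; a sketch with placeholder filenames:

    convert input.jpg -resize 1024x600^ output.jpg   # result is at least 1024 wide and 600 tall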

Clean way of re-running bash startup scripts.
This replaces the current bash session with a new bash session, run as an interactive non-login shell. Useful if you have changed /etc/bash.bashrc or ~/.bashrc. If you have changed a startup script for login shells, use $ exec bash -l instead; that is suitable for re-running /etc/profile, ~/.bash_login and ~/.profile. Edit: chinmaya points out that $ env - HOME=$HOME TERM=$TERM bash -s "exec bash -l" will also clear any shell variables which have been set. Since this verges on unwieldy, you might want to use $ alias bash_restart='env - HOME=$HOME TERM=$TERM bash -s "exec bash -l"'
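
The two variants described above, side by side (a sketch; prompts omitted):

    exec bash      # interactive non-login shell: re-runs /etc/bash.bashrc and ~/.bashrc
    exec bash -l   # login shell: re-runs /etc/profile, ~/.bash_login and ~/.profile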

Get AWS temporary credentials ready to export based on a MFA virtual appliance
You might want to secure your AWS operations by requiring an MFA token. But then, to use the API or CLI tools, you need to pass credentials generated with that MFA token. This command asks you for the MFA code and retrieves these credentials using the AWS CLI. To print the exports, you can use: `awk '{ print "export AWS_ACCESS_KEY_ID=\"" $1 "\"\n" "export AWS_SECRET_ACCESS_KEY=\"" $2 "\"\n" "export AWS_SESSION_TOKEN=\"" $3 "\"" }'` You must adapt the command line to include:
* $MFA_ID: the ARN of the virtual MFA, or the serial number of the physical one
* the TTL for the credentials
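
The retrieval step isn't shown in full on this page; a sketch using aws sts get-session-token, with $MFA_ID and the 900-second TTL as placeholders to adapt:

    read -p "MFA code: " MFA_CODE
    aws sts get-session-token --serial-number "$MFA_ID" --token-code "$MFA_CODE" \
        --duration-seconds 900 --output text \
        --query 'Credentials.[AccessKeyId,SecretAccessKey,SessionToken]' |
    awk '{ print "export AWS_ACCESS_KEY_ID=\"" $1 "\"\n" "export AWS_SECRET_ACCESS_KEY=\"" $2 "\"\n" "export AWS_SESSION_TOKEN=\"" $3 "\"" }'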

useful tail on /var/log to avoid old logs or/and gzipped files
With discard wildcards in bash you can "tail" the newer log files to see what happens: any errors, info, warnings...
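
The command itself isn't shown; one reading of "discard wildcards" is bash's extglob negation, sketched here with a placeholder log directory:

    shopt -s extglob                         # enable !(...) patterns
    tail -f /var/log/nginx/!(*.gz|*.[0-9])   # follow current logs, skipping rotated/gzipped ones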

