commandlinefu.com is the place to record those command-line gems that you return to again and again.
Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.
If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/
You can sign in using OpenID credentials, or register a traditional username and password.
First-time OpenID users will be automatically assigned a username which can be changed after signing in.
Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.
Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):
- recompresses all gz files to bz2 files from this point and below in the directory tree
- output shows the size of the original file, and the size of the new file. Useful.
- conceptually easier to understand than playing tricks with awk and sed.
- don't like the output? Use the following line instead:
for gz in $(find . -type f -name '*.gz'); do f=$(basename "$gz" .gz) && d=$(dirname "$gz") && gunzip -c "$gz" | bzip2 -c > "$d/$f.bz2" && rm -f "$gz"; done
Create compressed, encrypted backup from $source to $targetfile with password $key and exclude-file $excludefile
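The command itself didn't survive into this copy; a minimal sketch matching the description, using tar and openssl (the $variables are the ones named above; the aes-256-cbc cipher is an assumption):
tar -czf - --exclude-from="$excludefile" "$source" | openssl enc -aes-256-cbc -pass pass:"$key" -out "$targetfile"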
Find all .gz files and recompress them to bz2 on the fly. No temp files.
edit: forgot the double quotes! jeez!
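The command this entry refers to is missing here; a sketch of a temp-file-free variant using find -exec, with everything double-quoted as the edit note suggests:
find . -type f -name '*.gz' -exec sh -c 'gunzip -c "$1" | bzip2 -c > "${1%.gz}.bz2" && rm "$1"' _ {} \;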
This solution is similar to the previous one, except that it has no dependency on GNU Parallel. It also tries to minimize the impact on the running system (using ionice and nice).
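The actual command is also missing; a sketch of what such a low-impact loop could look like, assuming the same gz-to-bz2 task (ionice class 3 is idle, nice 19 is the lowest CPU priority):
find . -type f -name '*.gz' -exec ionice -c 3 nice -n 19 sh -c 'gunzip -c "$1" | bzip2 -c > "${1%.gz}.bz2" && rm "$1"' _ {} \;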
It preserves creation time, modification time, permissions, the directory structure, etc.
Part of the "atool" package.
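The commands themselves are not shown; the atool helpers these comments most likely describe are apack and aunpack (the archive name is a placeholder):
apack archive.tar.bz2 somedir/
aunpack archive.tar.bz2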
Useful if you want to reduce PDF file size from the command line using Ghostscript.
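The command is missing from this copy; the usual Ghostscript recipe looks like this (other quality presets are /screen and /printer; filenames are placeholders):
gs -sDEVICE=pdfwrite -dCompatibilityLevel=1.4 -dPDFSETTINGS=/ebook -dNOPAUSE -dBATCH -dQUIET -sOutputFile=output.pdf input.pdf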
If you have servers on a Wide Area Network (WAN), you may experience very slow transfer rates due to limited bandwidth and latency.
To speed up your transfers, you need to compress the data so you have less to transfer.
The solution is to use a compression tool like gzip, bzip2, or compress before and after the data transfer.
Using ssh's "-C" option is not compatible with every ssh version (ssh2, for instance).
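A sketch of the idea, compressing explicitly with gzip on both ends instead of relying on ssh -C (host and paths are placeholders):
tar -cf - /some/dir | gzip | ssh user@remotehost 'gzip -d | tar -xf - -C /destination'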
optipng and advancecomp (for the advpng and advdef tools) are the best FOSS tools for losslessly compressing PNGs. With the above tool chain, you can cut as much as 20% off a PNG's file size.
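The tool chain itself isn't shown here; one possible combination at maximum settings (image.png is a placeholder):
optipng -o7 image.png && advpng -z -4 image.png && advdef -z -4 image.png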
This command will copy a folder tree (keeping the parent folders) through ssh. It will:
- compress the data
- stream the compressed data through ssh
- decompress the data on the local folder
This command will take no additional space on the host machine (no need to create compressed tar files, transfer it and then delete it on the host).
There are some situations (like mirroring a remote machine) where you simply can't wait for a hugely time-consuming scp run, or can't compress the data to a tarball on the host because of file-system space limitations, so this command does the job quite well.
This command performs very well mainly when a lot of data is involved. If you're copying a small amount of data, use scp instead (it's easier to type).
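The command didn't survive into this copy; a sketch matching the description, pulling a remote tree down over ssh (host and paths are placeholders):
ssh user@remotehost 'tar -czf - -C /remote/path parent_folder' | tar -xzf - -C /local/destination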
Ever compress a file for the web by replacing all newline characters with nothing so it makes one nice big blob?
It's a great idea, but what about when you want to edit that file? Serious pain in the butt.
I ran into this today in that my only copy of a CSS file was "compressed" with no newlines.
I whipped this up and it converted back into nice human readable CSS :-)
It could be nicer, but it does the job.
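The one-liner itself is missing here; a rough equivalent using GNU sed (BSD sed does not interpret \n in the replacement; filenames are placeholders):
sed 's/{/{\n/g; s/}/}\n/g; s/;/;\n/g' compressed.css > readable.css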
a - add files to the archive
m5 - compression level, 0 = lowest compression ... 5 = maximum compression
-v5M - split the output into 5-megabyte volumes; change to 700M for a CD or 4200M for a DVD
r - recurse into subdirectories; do not use it for single files
It's better to have the output of a compression already split than to use the 'split' command after compressing, which would consume double the disk space. Found at http://www.ubuntu-unleashed.com/2008/05/howto-create-split-rar-files-in-ubuntu.html
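Putting the switches above together, the command presumably looked something like this (archive name and path are placeholders):
rar a -m5 -v5M -r myarchive.rar /path/to/directory/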