Part of the "atool" package.
Find all .gz files and recompress them to bz2 on the fly, with no temp files. (Edit: forgot the double quotes!)
The -a flag makes tar automatically pick the right compressor to filter the archive through, based on the file extension. For example, "tar -xaf archive.tar.xz" is equivalent to "tar -xJf archive.tar.xz", and "tar -xaf archive.tar.gz" is equivalent to "tar -xzf archive.tar.gz". No need to remember that -z is gzip, -j is bzip2, -Z is .Z, -J is xz, and so on :)
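The same flag also works when creating archives with GNU tar (it is the short form of --auto-compress). A quick illustration; the archive and directory names here are just placeholders:
tar -caf backup.tar.xz project/   # compressor chosen from the .xz extension
tar -caf backup.tar.gz project/   # same command line, gzip this time
tar -xaf backup.tar.xz            # extract without looking up -J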
Part of the "atool" package
- recompresses all gz files to bz2 files from the current directory down through the directory tree
- output shows the size of the original file and the size of the new file. Useful.
- conceptually easier to understand than playing tricks with awk and sed.
- don't like the output? Use the following line instead:
find . -type f -name '*.gz' -print0 | while IFS= read -r -d '' gz; do d=$(dirname "$gz"); f=$(basename "$gz" .gz); gunzip -c "$gz" | bzip2 -c > "$d/$f.bz2" && rm -f "$gz"; done
Works even if a file name contains \n (a newline). Spawns one job per CPU core.
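A sketch of that kind of pipeline using GNU parallel, which runs one job per core by default; -0 handles the null-delimited names from find, and {.} is the input with its extension stripped. An illustration of the approach, not necessarily the exact command from this entry:
find . -type f -name '*.gz' -print0 | parallel -0 'gunzip -c {} | bzip2 -c > {.}.bz2 && rm -f {}'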
The first line of output is the speed at which the uncompressed data is read; the second line is the speed of the compressed data sent over ssh. Change sdb to the target drive/partition to be backed up, change pbzip2 -c1 to suit your compression, and point ssh at your target file. Don't forget to run zerofree/fstrim first!
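A minimal sketch of such a pipeline, assuming pv provides the two speed readouts; the device, user, host, and output path are placeholders:
dd if=/dev/sdb bs=4M | pv -N read | pbzip2 -c1 | pv -N ssh | ssh user@backuphost 'cat > /backup/sdb.img.bz2'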
This solution is similar to [1], except that it has no dependency on GNU Parallel. It also tries to minimize the impact on the running system (using ionice and nice). [1] http://www.commandlinefu.com/commands/view/7009/recompress-all-.gz-files-in-current-directory-using-bzip2-running-1-job-per-cpu-core-in-parallel
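One way to get the same per-core parallelism without GNU Parallel is xargs -P. A sketch, assuming GNU findutils and coreutils (nproc); this illustrates the technique rather than reproducing the exact command behind this entry:
find . -type f -name '*.gz' -print0 | xargs -0 -P"$(nproc)" -I{} nice -n19 ionice -c3 sh -c 'gunzip -c "$1" | bzip2 -c > "${1%.gz}.bz2" && rm -f "$1"' _ {}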