Using 7z to create archives is fine, but when you use tar you preserve all file-specific information such as ownership, permissions, etc. If that matters to you, this is a better way to do it.
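A sketch of the general idea (the paths and archive name are placeholders, not the original command): wrap the files in tar first so ownership and permissions survive, then compress the tar stream with 7z:

tar cf - /path/to/dir | 7z a backup.tar.7z -si -m0=lzma2 -mx=9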
Create an image of "device" and send it to another machine over the network ("target" and "port" set the IP address and port the stream will be sent to), showing a progress bar.
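The sending command itself isn't shown here; a sketch of what it might look like (device, target and port are placeholders, and pv provides the progress bar):

dd if=/dev/<device> bs=1M | pv | nc <target> <port>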
On the machine that will receive, compress and store the file, use:
nc -l -p <port> | 7z a <filename> -si -m0=lzma2 -mx=9 -ms=on
Optionally, add the -v4g switch at the end of the line to split the archive every 4 gigabytes (other sizes can be set; accepted suffixes are k, m and g).
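For example, the receiving line with 4-gigabyte volumes would look like this:

nc -l -p <port> | 7z a <filename> -si -m0=lzma2 -mx=9 -ms=on -v4g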
The file will be compressed in the 7z format with the LZMA2 algorithm, maximum compression level and solid mode enabled.
The compression stage runs on the machine that stores the image. It was planned this way because that machine's processor was faster, and on a gigabit network transferring the uncompressed image wasn't much of a problem.
Creates a solid archive with the highest possible compression level (Ultra). An advantage of 7z is that it uses all processor cores to create the archive (at least version 9.04 does).
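A sketch of such a command (archive and directory names are placeholders, and the exact flags of the original entry aren't shown here):

7z a -t7z -mx=9 -ms=on archive.7z /path/to/dir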
Compress files or a directory to the xz format. xz compresses better and faster than bzip2 in most cases. A tar archive compressed with xz is also preferable to the 7z format because it preserves file permissions and other metadata.
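A minimal example, assuming GNU tar built with xz support (names are placeholders):

tar -cJf archive.tar.xz directory/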
Compress a directory into a tar archive with xz compression when tar doesn't have the -J option (the tar shipped with OS X doesn't have -J).
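A sketch of that approach (names are placeholders):

tar -cf - directory/ | xz -9 > directory.tar.xz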
This magic line extracts almost every archive in the current folder into its own folder. Don't forget to change the USER name in the sudo command. sed is used to build the folder names from the archive names without the extension. You can test the sed expression used in this command:
arg='war.lan.net' ; x=$(echo $arg|sed 's/\(.*\)\..*/\1/') ; echo $x
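The test prints war.lan (the last extension is stripped). For context, a rough sketch of what such an extraction loop might look like; this is an assumption, not the original one-liner, and the archive patterns are only examples:

for f in *.zip *.rar *.7z *.tar.gz; do
  [ -e "$f" ] || continue                    # skip patterns that match nothing
  d=$(echo "$f" | sed 's/\(.*\)\..*/\1/')    # folder name = archive name without extension
  mkdir -p "$d" && 7z x "$f" -o"$d"          # extract into its own folder
done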
If some archives can't be extracted, install these packages:
apt-get install p7zip-full p7zip-rar
Hope this saves you a lot of time. Enjoy.