find . -maxdepth 1 -type f -iname "*.dmp" -exec tar czvf {}.tar.gz --remove-files {} \;

Create an individual tar.gz archive for each file (matching an extension filter) in a given location

This command will: archive each *.dmp file in the current directory into its own tar.gz archive (one file per archive), and delete each original file after it has been compressed.
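The same per-file archiving can also be sketched as a null-safe find loop, which keeps working when filenames contain spaces (the loop below is an illustrative alternative, not the original command; --remove-files is a GNU tar option):

```shell
# Archive each *.dmp file in the current directory into its own tar.gz,
# then remove the original. -print0 and read -d '' preserve filenames
# containing spaces or other unusual characters.
find . -maxdepth 1 -type f -iname '*.dmp' -print0 |
while IFS= read -r -d '' f; do
    tar czvf "$f.tar.gz" --remove-files "$f"
done
```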

2011-09-07 08:36:29

These Might Interest You

  • The -a flag causes tar to automatically pick the right compressor to filter the archive through, based on the file extension. E.g. "tar -xaf archive.tar.xz" is equivalent to "tar -xJf archive.tar.xz", and "tar -xaf archive.tar.gz" is equivalent to "tar -xzf archive.tar.gz". No need to remember that -z is gzip, -j is bzip2, -Z is .Z, -J is xz, and so on :)

    tar -caf some_dir.tar.xz some_dir
    thetrivialstuff · 2011-06-09 19:00:06 0
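A quick way to see the equivalence for yourself (assumes GNU tar and gzip are installed; demo_dir is just an example name):

```shell
mkdir -p demo_dir && echo hello > demo_dir/file.txt
tar -caf demo_dir.tar.gz demo_dir   # -a picks gzip from the .gz suffix
tar -tzf demo_dir.tar.gz            # an explicit -z reads it back fine
```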
  • Avoids creating useless directory entries in archive, and sorts files by (roughly) extension, which is likely to group similar files together for better compression. 1%-5% improvement.

    find . \! -type d | rev | sort | rev | tar c --files-from=- --format=ustar | bzip2 --best > a.tar.bz2
    pornel · 2009-12-20 14:04:39 0
  • This magic line will extract almost every archive in the current folder into its own folder. Don't forget to change the USER name in the sudo command. sed is used to derive folder names from the archive names without the extension. You can test the sed expression used in this command: arg='' ; x=$(echo $arg|sed 's/\(.*\)\..*/\1/') ; echo $x. If some archives can't be extracted, install these packages: apt-get install p7zip-full p7zip-rar. Hope this saves a lot of your time. Enjoy.

    for ARG in * ; do sudo -u USER 7z x -o"$(echo $ARG|sed 's/\(.*\)\..*/\1/')" "$ARG" ; done
    n158 · 2012-12-31 19:47:24 0
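The sed expression strips only the last extension, because the greedy .* in the group swallows everything up to the final dot. A quick check (the filenames here are just examples):

```shell
# Greedy .* means only the final .ext is removed.
echo "album.tar.gz" | sed 's/\(.*\)\..*/\1/'   # -> album.tar
echo "photos.zip"   | sed 's/\(.*\)\..*/\1/'   # -> photos
```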
  • Compresses each file individually, creating a $filename.tar.gz for each one and removing the uncompressed version. Useful if you have lots of files and don't want one huge archive containing them all. You could replace ls with, for example, ls *.pdf to perform the action only on PDFs.

    ls | while read filename; do tar -czvf "$filename".tar.gz "$filename"; rm "$filename"; done
    Thingymebob · 2010-03-29 08:10:38 2
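Parsing the output of ls breaks on some filenames; a glob-based sketch of the same idea avoids that (this variant is an assumption-laden alternative, and --remove-files is GNU tar):

```shell
# Compress each regular file in the current directory individually,
# removing the original afterwards. The glob expands once, before the
# loop runs, so freshly created archives are not re-processed.
for f in *; do
    [ -f "$f" ] || continue        # skip directories and other non-files
    tar -czvf "$f.tar.gz" --remove-files "$f"
done
```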
  • The only zipped version of an album available for download is the lossy mp3 version. To download lossless files, because of their size, you must download them individually. This command scrapes the page for all the FLAC (or also SHN) files.

    wget -rc -A.flac --tries=5
    meunierd · 2010-01-20 07:36:25 0
  • From the cwd, recursively find all rar files, extracting each rar into the directory where it was found, rather than the cwd. A nice time saver if you've used wget or similar to mirror something where each subdirectory contains a rar archive. It's likely this can be tuned to work with multi-part archives where all parts use ambiguous .rar extensions, but I didn't test this. Perhaps unrar would handle this gracefully anyway?

    find . -name '*.rar' -execdir unrar e {} \;
    kyle0r · 2012-09-27 02:27:03 0
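The key is -execdir, which runs the command from each match's own directory instead of the cwd; that is what keeps the extracted files next to their archive. A harmless illustration, substituting pwd for unrar (the directory names are just examples):

```shell
mkdir -p a/b && touch a/b/test.rar
# pwd is executed inside a/b, not in the directory find started from,
# so it prints the absolute path of a/b.
find . -name '*.rar' -execdir pwd \;
```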
