Commands tagged tar (77)

  • Part of the "atool" package


    0
    aunpack foo.tar.bz2
    tebeka · 2012-07-11 22:53:18 0
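
    atool's sibling wrappers cover the other common operations too; a few illustrative invocations, assuming the package is installed:

    als foo.tar.bz2            # list contents, whatever the format
    apack foo.tar.bz2 dir/     # create an archive from dir/
    acat foo.tar.bz2 README    # print one member to stdout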

  • 0
    GZIP="--rsyncable" tar -czf something.tgz /something
    dash · 2012-04-24 19:06:56 1
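
    The --rsyncable flag makes gzip restart its compression state at regular intervals, so a small change in the input only perturbs a small region of the output, which rsync can then transfer cheaply. A sketch of the round trip (host and paths are placeholders; newer GNU tar can pass the flag via -I 'gzip --rsyncable' instead of the GZIP environment variable, which recent gzip releases deprecate):

    GZIP="--rsyncable" tar -czf something.tgz /something
    rsync -av something.tgz user@backuphost:/backups/   # later runs send only the changed blocks
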
  • If you open a compressed file with vim, it lists the archive's contents; you can then pick any entry to edit and save, and the archive is updated without any extra step. It supports many file types, such as tar.gz, tgz, zip, etc.


    5
    vim some-archive.tar.gz
    ktonga · 2012-04-20 02:37:28 1

  • 0
    tar czf git_mods_circa_dec23.tgz --files-from <(git ls-files -m)
    jemptymethod · 2011-12-23 15:31:21 0
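
    If any modified path contains spaces or other odd characters, a null-terminated variant is safer; a sketch assuming GNU tar:

    git ls-files -m -z | tar -czf git_mods_circa_dec23.tgz --null -T -
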
  • This improves on #9892 by compressing the directory on the remote machine so that the amount of data transferred over the network is much smaller. The command uses ssh(1) to get to a remote host, uses tar(1) to archive and compress a remote directory, prints the result to STDOUT, which is written to a local file. In other words, we are archiving and compressing a remote directory to our local box.


    20
    ssh user@host "tar -zcf - /path/to/dir" > dir.tar.gz
    __ · 2011-12-16 05:48:38 2
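
    The same pipe can feed a local extraction directly, mirroring the remote tree into the current directory without writing an intermediate file:

    ssh user@host "tar -zcf - /path/to/dir" | tar -zxf -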

  • 0
    ssh user@host "tar -czf - /path/to/dir" > dir.tar.gz
    mossholderm · 2011-12-15 04:12:54 0
  • The command uses ssh(1) to get to a remote host, uses tar(1) to archive a remote directory, prints the result to STDOUT, which is piped to gzip(1) to compress to a local file. In other words, we are archiving and compressing a remote directory to our local box.


    7
    ssh user@host "tar -cf - /path/to/dir" | gzip > dir.tar.gz
    atoponce · 2011-12-14 15:54:57 7
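
    If pv(1) happens to be installed locally, dropping it into the pipe adds a live throughput readout while the transfer runs:

    ssh user@host "tar -cf - /path/to/dir" | pv | gzip > dir.tar.gz
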
  • If the archive has a leading directory level with the same name as the archive and you want to strip it, this command is for you.


    12
    tar -xaf archive.tar.gz --strip-components=1
    sirex · 2011-11-29 07:38:19 0
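
    To confirm that the archive really does have a single leading directory before stripping it, peek at the listing first; for example:

    tar -tzf archive.tar.gz | head -n 3   # e.g. archive/, archive/README, ...
    tar -xaf archive.tar.gz --strip-components=1
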
  • The -a flag causes tar to automatically pick the right compressor to filter the archive through, based on the file extension. For example, "tar -xaf archive.tar.xz" is equivalent to "tar -xJf archive.tar.xz", and "tar -xaf archive.tar.gz" is equivalent to "tar -xzf archive.tar.gz". No need to remember that -z is gzip, -j is bzip2, -Z is .Z, -J is xz, and so on :)


    0
    tar -caf some_dir.tar.xz some_dir
    thetrivialstuff · 2011-06-09 19:00:06 0
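
    The extension-to-compressor mapping applies to any suffix tar knows; a few illustrative pairs (the .zst mapping needs a newer GNU tar built with zstd support):

    tar -caf some_dir.tar.xz some_dir    # same as tar -cJf
    tar -caf some_dir.tar.gz some_dir    # same as tar -czf
    tar -caf some_dir.tar.zst some_dir   # zstd
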
  • Similar, but uses a tarball instead of a zip file.


    -1
    git archive HEAD | gzip > ~/Dropbox/archive.tar.gz
    tamouse · 2011-06-08 09:44:07 0
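
    git archive can also produce the gzipped tarball itself and prepend a directory to every path so the result extracts cleanly; a sketch (the project/ prefix is an arbitrary example):

    git archive --format=tar.gz --prefix=project/ -o ~/Dropbox/archive.tar.gz HEAD
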
  • Simple compressed backup of /etc (Linux-compatible).


    -1
tar jcpf /home/[user]/etc-$(hostname)-backup-$(date +%Y%m%d-%H%M%S).tar.bz2 /etc
    mack · 2011-04-29 22:53:11 1
  • Sometimes you need two copies of data that lives in a tar archive. You could unpack it and then copy, but if I/O is slow you can reduce it by writing the extracted data to two destinations (or more) in a single pass.


    -1
    mkdir copy{1,2}; gzip -dc file.tar.gz | tee >( tar x -C copy1/ ) | tar x -C copy2/
    depesz · 2011-04-14 17:02:05 0
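
    The trick extends to more copies by adding further process substitutions; only the last tar reads from the pipe itself:

    mkdir copy{1,2,3}; gzip -dc file.tar.gz | tee >( tar x -C copy1/ ) >( tar x -C copy2/ ) | tar x -C copy3/
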
  • The result of this command is a tar archive containing all files that were modified or added between revision 1792 and HEAD. Super useful for incremental releases.


    0
    svn diff -r 1792:HEAD --summarize | awk '{if ($1 != "D") print $2}'| xargs -I {} tar rf incremental_release.tar {}
    windfold · 2011-04-05 15:00:49 0

  • -1
    cp -av source dest
    Vilemirth · 2011-02-19 18:43:49 0
  • This may already be listed, but this command is useful for untarring a specific directory onto a different server.


    0
    cat tarfile.tar.gz | ssh server.com " cd /tmp; tar xvzf - directory/i/want"
    alf · 2011-02-11 17:10:01 1
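
    The cat is avoidable: ssh passes its standard input through to the remote command, so the archive can be redirected straight in:

    ssh server.com "cd /tmp && tar xvzf - directory/i/want" < tarfile.tar.gz
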
  • `tar xfzO` extracts to STDOUT, which is piped directly into mysql. Really helpful when your hard drive can't fit two copies of an uncompressed database :)


    -1
    tar xfzO <backup_name>.tar.gz | mysql -u root <database_name>
    alecnmk · 2011-02-10 22:18:42 1
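
    With pv(1) installed, wrapping the read gives a progress readout against the file size (same placeholders as above; -f - makes the read from stdin explicit):

    pv <backup_name>.tar.gz | tar -xzOf - | mysql -u root <database_name>
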
  • A quick find command to identify all TAR files in a given path, extract a list of the files contained within each, then search for a given string in the file list. Output is returned as a list of TAR files found (enclosed in []), followed by any matching files that exist in that archive. TAR can easily be swapped for JAR if required.


    1
    find . -type f -name "*.tar" -printf [%f]\\n -exec tar -tf {} \; | grep -iE "[\[]|<filename>"
    andrewtayloruk · 2011-01-06 13:01:38 0
  • If you want to extract the files from an archive into the current directory while stripping all directory paths, use the --transform option to strip the path information. The --strip-components option, by contrast, only works when the target files all sit at the same, constant folder depth. The idea was taken from http://www.unix.com/solaris/145941-how-extract-files-tar-file-without-creating-directories.html


    1
    tar --transform 's#.*/\([^/]*\)$#\1#' -xzvf test-archive.tar.gz
    alperyilmaz · 2010-11-29 23:16:57 1
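
    GNU tar can preview exactly what a transform will do before anything is written, via --show-transformed-names:

    tar -tzf test-archive.tar.gz --transform 's#.*/\([^/]*\)$#\1#' --show-transformed-names
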
  • This is freaking sweet!!! Here is the full alias (I didn't want to cause display problems on commandlinefu.com's homepage):

    alias tarred='( ( D=`builtin pwd`; F=$(date +$HOME/`sed "s,[/ ],#,g" <<< ${D/${HOME}/}`#-%F.tgz); S=$SECONDS; tar --ignore-failed-read --transform "s,^${D%/*},`date +${D%/*}.%F`,S" -czPf "$F" "$D" && logger -s "Tarred $D to $F in $(($SECONDS-$S)) seconds" ) & )'

    It creates a .tgz archive of whatever directory it is run from, in the background, detached from the current shell, so it will still complete even if you log out. You can run it as many times as you want: if the .tgz archive already exists, it is simply moved to a numbered backup ('--backup=numbered').

    The coolest part is the transformation performed by tar and sed, which builds the archive file name automatically and makes extraction completely safe. If you archive, say, /home/tombdigger/new-stuff-to-backup/, it creates the archive /home/#home#tombdigger#new-stuff-to-backup#-2010-11-18.tgz. When you extract it (tar -xvzf #home#tombdigger#new-stuff-to-backup#-2010-11-18.tgz), instead of overwriting an existing /home/tombdigger/new-stuff-to-backup/ directory it extracts to /home/tombdigger/new-stuff-to-backup.2010-11-18/. Basically, the archive file name is the PWD with every '/' replaced by '#', and the date is appended so that multiple archives are easily managed. This example saves all archives to $HOME/archive-name.tgz, but I keep a $BKDIR variable with each shell user's backup location, so I just replaced HOME with BKDIR in the alias. When I ran this in /opt/askapache/SOURCE/lockfile-progs-0.1.11/, the archive was created at /askapache-bk/#opt#askapache#SOURCE#lockfile-progs-0.1.11#-2010-11-18.tgz.

    Upon completion it uses the universal logger tool to report to syslog and stderr (printed to your terminal); just remove that part if you don't want it, or drop the '-s' option from logger to keep the log only in syslog. Here's how my syslog server recorded this:

    2010-11-18T00:44:13-05:00 gravedigger.askapache.com (127.0.0.5) [user] [notice] (logger:) Tarred /opt/askapache/SOURCE/lockfile-progs-0.1.11 to /askapache-bk/tarred/#opt#SOURCE#lockfile-progs-0.1.11#-2010-11-18.tgz in 4 seconds

    Caveats: this is really very robust and foolproof; the only issue I ever have with it (I've been using it for years on my web servers) is that if a file changes in the directory while the archive is being written, you get a warning message and the archive might have a problem for the changed file. That happens when running it in a logs directory, a temp dir, etc. That's the only issue I've ever had, really nothing more than a heads up.

    Advanced: this is a simple alias, and it works on basically every Linux box with a semi-current tar, GNU coreutils, bash and sed. But if you want to customize it or pass parameters (like a directory to back up instead of the PWD), check out the function I created the alias from; it replaces my aa_status function with logger and reports $SECONDS runtime instead of using tar's --totals:

    function tarred () { local GZIP='--fast' PWD=${1:-`pwd`} F=$(date +${BKDIR}/%m-%d-%g-%H%M-`sed -u 's/[\/\ ]/#/g' [[ ! -r "$PWD" ]] && echo "Bad permissions for $PWD" 1>&2 && return 2; ( ( tar --totals --ignore-failed-read --transform "s@^${PWD%/*}@`date +${PWD%/*}.%m-%d-%g`@S" -czPf $F $PWD && aa_status "Completed Tarp of $PWD to $F" ) & ) } # From my .bash_profile http://www.askapache.com/linux-unix/bash_profile-functions-advanced-shell.html


    8
    alias tarred='( ( D=`builtin pwd`; F=$(date +$HOME/`sed "s,[/ ],#,g" <<< ${D/${HOME}/}`#-%F.tgz); tar --ignore-failed-read --transform "s,^${D%/*},`date +${D%/*}.%F`,S" -czPf "$F" "$D" &>/dev/null ) & )'
    AskApache · 2010-11-18 06:24:34 0
  • This Anti-TarBomb function makes it easy to unpack a .tar.gz without worrying about the possibility that it will "explode" in your current directory. I had always created a temporary folder and extracted the tarball into it first, but I got tired of having to reorganize the files afterwards. Just add this function to your .zshrc / .bashrc and use it like this:

    atb arch1.tar.gz

    It will create a folder for the extracted files if they aren't already contained in a single folder. This only works for .tar.gz, but it's very easy to edit the function to suit your needs if you want to extract .tgz, .tar.bz2 or just .tar. More info about tarbombs at http://www.linfo.org/tarbomb.html. Tested in zsh and bash.

    UPDATE: This function works for .tar.gz, .tar.bz2, .tgz, .tbz and .tar in zsh (not working in bash):

    atb() { l=$(tar tf $1); if [ $(echo "$l" | wc -l) -eq $(echo "$l" | grep $(echo "$l" | head -n1) | wc -l) ]; then tar xf $1; else mkdir ${1%.t(ar.gz||ar.bz2||gz||bz||ar)} && tar xf $1 -C ${1%.t(ar.gz||ar.bz2||gz||bz||ar)}; fi ;}

    UPDATE2: From the comments, bepaald came up with a variant that works for .tar.gz, .tar.bz2, .tgz, .tbz and .tar in bash:

    atb() { shopt -s extglob; l=$(tar tf $1); if [ $(echo "$l" | wc -l) -eq $(echo "$l" | grep $(echo "$l" | head -n1) | wc -l) ]; then tar xf $1; else mkdir ${1%.t@(ar.gz|ar.bz2|gz|bz|ar)} && tar xf $1 -C ${1%.t@(ar.gz|ar.bz2|gz|bz|ar)}; fi; shopt -u extglob; }


    10
    atb() { l=$(tar tf $1); if [ $(echo "$l" | wc -l) -eq $(echo "$l" | grep $(echo "$l" | head -n1) | wc -l) ]; then tar xf $1; else mkdir ${1%.tar.gz} && tar xf $1 -C ${1%.tar.gz}; fi ;}
    elfreak · 2010-10-16 05:50:32 5
  • The following command finds all files not modified in the last 5 days under the /protocollo/paflow directory and creates an archive under /var/dump-protocollo named ddmmyyyy_archive.tar.


    0
    find /protocollo/paflow -type f -mtime +5 | xargs tar -cvf /var/dump-protocollo/`date '+%d%m%Y'_archive.tar`
    0disse0 · 2010-06-29 12:43:30 0
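
    One caveat: if the file list overflows a single xargs batch, tar -c runs more than once and each run clobbers the previous archive. Letting tar read the names itself avoids that (GNU tar syntax):

    find /protocollo/paflow -type f -mtime +5 | tar -cvf /var/dump-protocollo/$(date '+%d%m%Y')_archive.tar -T -
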
  • This is a shortcut to tar up all files matching a wildcard. tar doesn't have an --include option (apparently).


    1
    tar -czf ../header.tar.gz $(find . -name *.h)
    unixmonkey10524 · 2010-06-27 23:44:48 1
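
    As written, the unquoted *.h may be expanded by the shell before find ever sees it, and the $( ) form splits on whitespace in paths. A null-terminated sketch, assuming GNU tar, sidesteps both:

    find . -name '*.h' -print0 | tar -czf ../header.tar.gz --null -T -
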
  • Here is how to recover the remote backup over ssh.


    8
    ssh user@host "cat /path/to/backup/backupfile.tar.bz2" |tar jpxf -
    mack · 2010-03-24 01:35:28 2
  • Execute this from the source host, where the files you wish to back up reside. With the minus '-', tar delivers the compressed output to standard output, which travels over the ssh session to the remote host. The backup host receives the stream on standard input and writes it to /path/to/backup/backupfile.tar.bz2.


    13
    tar jcpf - [sourceDirs] |ssh user@host "cat > /path/to/backup/backupfile.tar.bz2"
    mack · 2010-03-24 01:29:25 0
  • xargs deals badly with special characters (such as space, ' and "); consider the case where you have a file called '12" record'. Parallel (https://savannah.nongnu.org/projects/parallel/) does not have this problem. Both solutions misbehave if the list of files exceeds the maximum command-line length allowed by the shell.


    -2
    svn st | cut -c 9- | parallel -X tar -czvf ../backup.tgz
    unixmonkey8046 · 2010-01-28 11:43:16 0