Commands using tar (226)

  • Find directories (matching a regular expression), compress each with xz into its own archive, then remove the source directories


    0
    find . -type d |awk '$1 ~ /[0-9]/ {print $0}' |xargs -P 4 -I NAME tar --remove-files -vcJf NAME.tar.xz NAME
    Glafir · 2017-08-28 08:05:29 19
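
    A null-delimited variant (a sketch: -print0/-0 assume GNU find and xargs, and -name '*[0-9]*' only approximates the awk filter by matching digits in the directory's own name rather than the whole path) survives spaces in directory names:

    find . -type d -name '*[0-9]*' -print0 | xargs -0 -P 4 -I NAME tar --remove-files -vcJf NAME.tar.xz NAME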
  • Copy a remote directory over ssh and untar it in place, without creating a temporary file


    0
    ssh user@host "tar -zcf - /path/to/dir" | tar -xvz
    sandeep048 · 2017-10-07 11:37:51 18
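
    The same idea works in the other direction; a minimal sketch for pushing a local directory to a remote host (the destination path is hypothetical):

    tar -zcf - /path/to/dir | ssh user@host "tar -xvzf - -C /destination"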
  • I use ScreenFlow to create and edit videos. The default storage for a single video is a folder. If I want to move that someplace, it's easier to zip up the folder and send it. If I'm making a series of short videos, I might have 10 folders. This will go through and make a single .tar.bz2 file for EACH folder.


    0
    for f in *screenflow ; do tar cjvf "$f.tar.bz2" "$f"; done
    topher1kenobe · 2020-08-02 21:10:27 161
  • A little bash loop that takes every file matching *.tar.gz or *.tgz in the current directory and extracts it with tar -zxvf.


    -1
    for i in *.tar.gz *.tgz; do tar -zxvf "$i"; done
    bohemicus · 2009-02-18 10:58:12 8
  • Copy the current directory tree into /new/dir via a tar pipe. Add z to the flags on both ends to enable compression (see the variant below).


    -1
    tar cf - . | (cd /new/dir; tar xvf -)
    jauderho · 2009-03-09 20:30:34 12
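
    The same pipe with z added to both ends, as the note above suggests:

    tar czf - . | (cd /new/dir; tar xzvf -)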
  • gpg's built-in compression is as good as gzip's, and the backup is now encrypted as well. The matching decrypt-and-extract step is written out below.


    -1
    tar -cf - folder/ | gpg -c > folder.tpg
    copremesis · 2009-05-08 19:20:08 5
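
    The extraction step from the description, written out explicitly (gpg detects the encrypted input and prompts for the passphrase):

    gpg -d < folder.tpg | tar -xf -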

  • Copy the contents of <source> into <destination> via a tar pipe, preserving file attributes


    -1
    tar -C <source> -cf - . | tar -C <destination> -xf -
    Tekhne · 2009-07-10 21:16:23 4
  • Tar: create a compressed archive while excluding certain folders


    -1
    tar -cvf /path/dir.tar --exclude "/path/dir/name" --exclude "/path/dir/opt" /path/dir*
    sandeepverma · 2009-12-15 09:48:41 3
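
    A sketch of the same archive with the excludes given as patterns relative to the archive root, so the tarball stores relative paths (assumes GNU tar; "name" and "opt" are the sample subdirectories from the command above):

    tar -cvf /path/dir.tar --exclude="dir/name" --exclude="dir/opt" -C /path dir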
  • Combines a few repetitive tasks when compiling source code. Especially useful when a hyphen in a file name breaks tab completion: 1) wget source.tar.gz 2) tar xzvf source.tar.gz 3) cd source 4) ls. From there you can run ./configure, make and so on.


    -1
    wtzc () { wget "$1"; foo=${1##*/}; tar xzvf "$foo"; bar=${foo%.*}; bar=${bar%.*}; cd "$bar"; ls; }
    oshazard · 2010-01-17 11:25:47 3
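
    Usage is then a single step (the URL is hypothetical):

    wtzc http://example.com/package-1.0.tar.gz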
  • You don't need to create an intermediate file: pipe the output directly to tar and use stdin as the archive (put a dash after the f flag). The i flag (--ignore-zeros) tells tar to skip the end-of-archive blocks between the concatenated archives.


    -1
    cat 1.tar.gz 2.tar.gz | tar zxvif -
    psychopenguin · 2010-05-09 03:50:00 5
  • Setting the COPYFILE_DISABLE=true environment variable prevents tar from adding AppleDouble ._* files to your tar file on Mac OS X.


    -1
    COPYFILE_DISABLE=true tar cvf newTarFile.tar Directory/
    alainkaa · 2010-07-01 09:36:48 3

  • Extract a bzip2-compressed tarball, using pbzip2 for parallel, multi-core decompression


    -1
    pbzip2 -dck <bz2file> | tar xvf -
    maarten · 2010-08-16 22:16:50 3
  • The J (xz) option was a relatively recent addition to GNU tar when this was posted; the xz compression utility is required as well.


    -1
    tar cJf tarfile.tar.xz pathnames
    jasonjgw · 2010-11-18 05:34:17 2
  • `tar xfzO` extracts to stdout, which is piped directly into mysql. Really helpful when your hard drive can't fit two copies of an uncompressed database :)


    -1
    tar xfzO <backup_name>.tar.gz | mysql -u root <database_name>
    alecnmk · 2011-02-10 22:18:42 5
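
    A sketch of producing that kind of backup in the first place (the dump file name is arbitrary):

    mysqldump -u root <database_name> > dump.sql && tar czf <backup_name>.tar.gz dump.sql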
  • Sometimes you need two copies of data that lives in a tarball. You could unpack it and then copy it, but if I/O is slow you can reduce it by writing the extracted data to two (or more) destinations in a single pass.


    -1
    mkdir copy{1,2}; gzip -dc file.tar.gz | tee >( tar x -C copy1/ ) | tar x -C copy2/
    depesz · 2011-04-14 17:02:05 5
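
    The same trick extends to more copies by chaining additional process substitutions (bash syntax):

    mkdir copy{1,2,3}; gzip -dc file.tar.gz | tee >( tar x -C copy1/ ) >( tar x -C copy2/ ) | tar x -C copy3/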
  • Simple compressed backup of /etc, with the hostname and a timestamp in the archive name (Linux compatible)


    -1
    tar jcpf /home/[user]/etc-$(hostname)-backup-$(date +%Y%m%d-%H%M%S).tar.bz2 /etc
    mack · 2011-04-29 22:53:11 4
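
    A sketch of the matching restore (tar strips the leading / when archiving, so extract relative to /; <timestamp> stands for the archive's actual date string):

    sudo tar jxpf etc-$(hostname)-backup-<timestamp>.tar.bz2 -C /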
  • Should do the same as command #12875, just shorter: archive the current directory into a tar file in the parent directory, named after the current directory.


    -1
    tar -cf "../${PWD##*/}.tar" .
    joedhon · 2013-11-06 11:15:38 9
  • Back up your files into a tar archive with the backup's timestamp in the file name. A human-readable variant is shown below.


    -1
    tar -cvf bind9-config-`date +%s`.tar *
    Fuonum · 2014-10-29 05:15:15 9
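
    A variant with a human-readable timestamp instead of epoch seconds, using the same date format as the /etc backup above:

    tar -cvf bind9-config-$(date +%Y%m%d-%H%M%S).tar *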
  • This is useful for sending data between two computers that you have shell access to, compressing the stream with gzip during transfer; files are compressed and uncompressed automatically. Note the dash used as the filename on each side, which makes tar read the archive from stdin or write it to stdout.
    On the listening side: sudo nc -lp 2022 | sudo tar -xvzf -
    Explanation: netcat -l listens on -p port 2022 and pipes the stream into tar -x (extract), -v (verbose), -f - (read the archive from stdin).
    On the sending side: tar -cvzf - ./* | nc -w 3 name_of_listening_host 2022
    Explanation: tar -c (create), -v (verbose), -z (gzip), -f - (write the archive to stdout) packs everything in the current directory; nc -w 3 waits 3 seconds after the stream ends, then closes the connection to name_of_listening_host on port 2022.


    -2
    # on the listening side:
    sudo nc -lp 2022 | sudo tar -xvzf -
    # on the sending side:
    tar -cvzf - ./* | nc -w 3 name_of_listening_host 2022
    smcpherson · 2009-03-27 09:59:33 12
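
    To watch progress on the sending side, pv (used in another entry below) can sit in the middle of the pipe:

    tar -cvzf - ./* | pv | nc -w 3 name_of_listening_host 2022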
  • Create a single tar.gz archive. I know it's a very basic one, but it's one I keep forgetting. The matching extract command is below.


    -2
    tar -pczf archive_name.tar.gz /path/to/dir/or/file
    ryuslash · 2009-07-17 19:53:02 30
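
    And the matching extract command, for the same reason it's easy to forget:

    tar -xzf archive_name.tar.gz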
  • Create a tar archive on stdout (-) and pipe it into a compound command that extracts it from stdin at the destination. This is similar to "Copy via tar pipe ...", but copies across file system boundaries. I prefer cp -pr for copying within the same file system.


    -2
    tar cpof - src | (cd des; tar xpof -)
    davidpotter42 · 2009-09-20 20:43:30 3
  • This will list all the files in every tarball found in any folder or subfolder of the provided path. The while loop echoes each tarball's file name before listing its contents, so the files can be attributed to their tarball


    -2
    find <path> -name "*.tgz" -or -name "*.tar.gz" | while read -r file; do echo "$file: "; tar -tzf "$file"; done
    polaco · 2009-11-10 20:39:04 36
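
    A variant that survives spaces in file names, using null-delimited output (assumes GNU find and bash):

    find <path> \( -name "*.tgz" -o -name "*.tar.gz" \) -print0 | while IFS= read -r -d '' file; do echo "$file: "; tar -tzf "$file"; done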
  • The magic is performed by the -t parameter, which lists an archive's contents instead of extracting them


    -2
    find ./ -name "*.tgz" -exec tar -tvzf {} \;
    alchandia · 2009-11-11 00:50:52 3
  • This may seem like a long command, but it is great for making sure all file permissions are kept intact. It streams the files through a pipe, with pv showing progress against the total size, and untars them in the target directory. Please note that the -z flag should not be used for local copies: there is no performance gain, and the extra CPU overhead will actually slow the copy down. You can also keep it simple, without the progress info: cp -rpf /some/directory /other/path


    -2
    dir='path to file'; tar cpf - "$dir" | pv -s $(du -sb "$dir" | awk '{print $1}') | tar xpf - -C /other/path
    starchox · 2010-01-19 19:05:45 3
  • xargs deals badly with special characters (such as space, ' and "). To see the problem, try this:
    touch important_file
    touch 'not important_file'
    ls not* | xargs rm
    This deletes important_file, because xargs splits the quoted name on whitespace. GNU Parallel (https://savannah.nongnu.org/projects/parallel/) does not have this problem.


    -2
    tar -tf <file.tar.gz> | parallel rm
    unixmonkey8046 · 2010-01-28 08:28:16 3
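
    If GNU parallel is not installed, a sketch of the same cleanup with GNU xargs splitting on newlines only (file names that themselves contain newlines would still break it):

    tar -tf <file.tar.gz> | xargs -d '\n' rm -f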