
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions,…).


News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try to put an end to the spamming, new commands now require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.

Terminal - Commands using tar - 203 results
tar xfzO <backup_name>.tar.gz | mysql -u root <database_name>
2011-02-10 22:18:42
User: alecnmk
Functions: tar
-1

`tar xfzO` extracts to STDOUT, which is piped directly into mysql. Really helpful when your hard drive can't fit two copies of the uncompressed database :)
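
For reference, a compatible backup could be created along these lines (a sketch; dump.sql is an arbitrary name, and any .sql file inside the archive works since the O flag simply streams the extracted contents to mysql):

mysqldump -u root <database_name> > dump.sql && tar cfz <backup_name>.tar.gz dump.sql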

tar -cvf - $DIR_TO_BACKUP | tee >(md5sum > backup_md5.txt) > /dev/st0 && mt -f /dev/nst0 bsfm 1 && md5sum -c backup_md5.txt < /dev/st0
2011-01-27 20:57:36
User: bugmenot
Functions: md5sum mt tar tee
0

Backs up $DIR_TO_BACKUP to tape, creating an MD5SUM file of the backup on the fly.

Then rewinds one record on the tape and verifies that the archive was written correctly.
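
To restore (or re-verify) later, a sketch assuming the same tape devices, with /restore/dir as a placeholder destination:

mt -f /dev/nst0 rewind && tar -xvf /dev/st0 -C /restore/dir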

find . -type f -mtime +100 -exec tar rvf my.tar --remove-files {} \;
2011-01-26 06:13:19
User: cp
Functions: find tar
4

tar does not have a -mtime option like find does; here find selects files not modified in the last 100 days, and tar's r flag appends each of them to an existing tar archive (removing the originals via --remove-files).
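
A couple of hedged follow-up steps: list what has accumulated, then compress once appending is finished (r cannot append to a compressed archive, so gzip must come last):

tar tvf my.tar
gzip my.tar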

wget http://URL/FILE.tar.gz -O - | tar xfz -
2011-01-18 12:17:16
Functions: tar wget
16

This uncompresses the file while it is being downloaded, which makes the whole process much faster.
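
An equivalent with curl, if that's what you have installed (same placeholder URL):

curl -sL http://URL/FILE.tar.gz | tar xfz -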

find . -type f -name "*.tar" -printf [%f]\\n -exec tar -tf {} \; | grep -iE "[\[]|<filename>"
2011-01-06 13:01:38
Functions: find grep tar
Tags: find grep tar
1

A quick find command to identify all TAR files in a given path, extract the list of files contained within each, and search that list for a given string. Returns the TAR files found (enclosed in []) followed by any matching files inside each archive. TAR can easily be swapped for JAR if required, as in the sketch below.
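
The JAR swap might look like this (a sketch, assuming the JDK's jar tool is on your PATH; <filename> stays a placeholder):

find . -type f -name "*.jar" -printf [%f]\\n -exec jar -tf {} \; | grep -iE "[\[]|<filename>"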

git archive --format=tar HEAD | (cd /var/www/ && tar xf -)
2010-12-23 05:50:28
User: ox0spy
Functions: cd tar
4

In fact, what I'd like to know is how to export only the modified files.
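
One possible answer to that question, hedged: git archive accepts pathspecs, so feeding it the output of git diff exports just the files that changed between two commits (this breaks on deleted files and on filenames containing spaces):

git archive --format=tar HEAD $(git diff --name-only HEAD~1 HEAD) | (cd /var/www/ && tar xf -)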

tar -cf ~/out.tar --no-recursion --files-from <(find . -type d)
tar -czvf - /src/dir | ssh remotehost "(cd /dst/dir ; tar -xzvf -)"
Server side: while true; do tar cvzf - ./* | nc -l 2000; done, client side: nc localhost 2000 | tar xvzf -
cd /usr/src ; wget http://www.rarlab.com/rar/unrarsrc-4.0.2.tar.gz ; tar xvfz unrarsrc-4.0.2.tar.gz ; cd unrar ; ln -s makefile.unix Makefile ; make clean ; make ; make install
tar --transform 's#.*/\([^/]*\)$#\1#' -xzvf test-archive.tar.gz
2010-11-29 23:16:57
User: alperyilmaz
Functions: tar
Tags: tar
1

If you want to extract the files from an archive into the current directory while stripping all directory paths, use the --transform option to strip the path information. Unfortunately, the --strip-components option only helps when the target files all sit at the same, constant folder depth.

The idea was taken from http://www.unix.com/solaris/145941-how-extract-files-tar-file-without-creating-directories.html
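
To make the difference concrete, a sketch contrasting the two options on the same archive:

tar --strip-components=2 -xzvf test-archive.tar.gz   # only if every file is exactly 2 levels deep
tar --transform 's#.*/\([^/]*\)$#\1#' -xzvf test-archive.tar.gz   # any depth: keep only basenames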

ssh root@host1 "cd /somedir/tocopy/ && tar -cf - ." | ssh root@host2 "cd /samedir/tocopyto/ && tar -xf -"
alias tarred='( ( D=`builtin pwd`; F=$(date +$HOME/`sed "s,[/ ],#,g" <<< ${D/${HOME}/}`#-%F.tgz); tar --ignore-failed-read --transform "s,^${D%/*},`date +${D%/*}.%F`,S" -czPf "$F" "$D" &>/dev/null ) & )'
2010-11-18 06:24:34
User: AskApache
Functions: alias date tar
7

This is freaking sweet!!! Here is the full alias (I didn't want to cause display problems on commandlinefu.com's homepage):

alias tarred='( ( D=`builtin pwd`; F=$(date +$HOME/`sed "s,[/ ],#,g" <<< ${D/${HOME}/}`#-%F.tgz); S=$SECONDS; tar --ignore-failed-read --transform "s,^${D%/*},`date +${D%/*}.%F`,S" -czPf "$F" "$D" && logger -s "Tarred $D to $F in $(($SECONDS-$S)) seconds" ) & )'

Creates a .tgz archive of whatever directory it is run from, in the background, detached from the current shell, so it will still complete even if you log out. You can also run this as many times as you want: if the archive .tgz already exists, it is simply moved to a numbered backup ('--backup=numbered'). The coolest part is the transformation performed by tar and sed, so that the archive file names are created automatically, and extracting the archive file is completely safe thanks to the transform command.

If you archive, let's say, /home/tombdigger/new-stuff-to-backup/, it will create the archive /home/#home#tombdigger#new-stuff-to-backup#-2010-11-18.tgz. Then when you extract it, like tar -xvzf #home#tombdigger#new-stuff-to-backup#-2010-11-18.tgz, instead of overwriting an existing /home/tombdigger/new-stuff-to-backup/ directory, it will extract to /home/tombdigger/new-stuff-to-backup.2010-11-18/

Basically, the tar archive filename is the PWD with all '/' replaced with '#', and the date is appended to the name so that multiple archives are easily managed. This example saves all archives as $HOME/archive-name.tgz, but I have a $BKDIR variable with the backup location for each shell user, so I just replaced HOME with BKDIR in the alias.
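
A minimal sketch of just that name-mangling step in isolation (the path is an example):

D=/opt/askapache/SOURCE/lockfile-progs-0.1.11
sed "s,[/ ],#,g" <<< "$D"
# prints: #opt#askapache#SOURCE#lockfile-progs-0.1.11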

So when I ran this in /opt/askapache/SOURCE/lockfile-progs-0.1.11/ the archive was created at /askapache-bk/#opt#askapache#SOURCE#lockfile-progs-0.1.11#-2010-11-18.tgz

Upon completion, it uses the universal logger tool to report completion to syslog and stderr (printed to your terminal). Just remove that part if you don't want it, or drop the '-s' option from logger to keep the log in syslog only and off your terminal.

Here's how my syslog server recorded this..

2010-11-18T00:44:13-05:00 gravedigger.askapache.com (127.0.0.5) [user] [notice] (logger:) Tarred /opt/askapache/SOURCE/lockfile-progs-0.1.11 to /askapache-bk/tarred/#opt#SOURCE#lockfile-progs-0.1.11#-2010-11-18.tgz in 4 seconds

Caveats

Really, this is very robust and foolproof. The only issue I ever have with it (and I've been using it for years on my web servers) is that if a file changes in the directory while it is being archived, you get a warning message and the archive might have a problem for the changed file. This happens when running it in a logs directory, a temp dir, etc. That's the only issue I've ever had; really nothing more than a heads-up.

Advanced:

This is a simple alias, and very useful, as it works on basically every Linux box with a semi-current tar, GNU coreutils, bash, and sed. But if you want to customize it or pass parameters (like a directory to back up instead of the PWD), check out this function I use. It is what I created the alias from, BTW, replacing my aa_status function with logger and adding the $SECONDS runtime instead of using tar's --totals:

function tarred ()
{
# NOTE: the F= line was truncated in the original post; this completion is an
# assumption that follows the naming pattern of the alias above
local GZIP='--fast' PWD=${1:-`pwd`} F=$(date +${BKDIR}/%m-%d-%g-%H%M-`sed -u 's/[\/\ ]/#/g' <<< ${PWD/${HOME}/}`.tgz);
[[ ! -r "$PWD" ]] && echo "Bad permissions for $PWD" 1>&2 && return 2;
( ( tar --totals --ignore-failed-read --transform "s@^${PWD%/*}@`date +${PWD%/*}.%m-%d-%g`@S" -czPf "$F" "$PWD" && aa_status "Completed Tarp of $PWD to $F" ) & )
}

#From my .bash_profile http://www.askapache.com/linux-unix/bash_profile-functions-advanced-shell.html

tar cfJ tarfile.tar.xz pathnames
2010-11-18 05:34:17
User: jasonjgw
Functions: tar
-1

The J option is a recent addition to GNU tar. The xz compression utility is required as well.
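
If your tar predates the J option, the same archive can be produced by piping through xz directly (a sketch):

tar cf - pathnames | xz > tarfile.tar.xz

and unpacked with: xz -dc tarfile.tar.xz | tar xf -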

tar cfz backup-`date +%F`.tgz somedirs
tar cfz backup-$(date --iso).tar.gz somedirs
tar cfX - exclude_opt_weblogic . | ssh tmp-esxsb044 "cd /opt/weblogic ; tar xf -"
atb() { l=$(tar tf $1); if [ $(echo "$l" | wc -l) -eq $(echo "$l" | grep $(echo "$l" | head -n1) | wc -l) ]; then tar xf $1; else mkdir ${1%.tar.gz} && tar xf $1 -C ${1%.tar.gz}; fi ;}
2010-10-16 05:50:32
User: elfreak
Functions: echo grep head mkdir tar wc
10

This Anti-TarBomb function makes it easy to unpack a .tar.gz without worrying about the possibility that it will "explode" in your current directory. I always used to create a temporary folder and extract the tarball there first, but I got tired of having to reorganize the files afterwards. Just add this function to your .zshrc / .bashrc and use it like this:

atb arch1.tar.gz

and it will create a folder for the extracted files, if they aren't already in a single folder.

This only works for .tar.gz, but it's very easy to edit the function to suit your needs, if you want to extract .tgz, .tar.bz2 or just .tar.

More info about tarbombs at http://www.linfo.org/tarbomb.html

Tested in zsh and bash.
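To eyeball whether an archive would explode before extracting anything, a quick check along the same lines (a sketch):

tar tf arch1.tar.gz | cut -d/ -f1 | sort -u

A single line of output means a single top-level folder, i.e. no tarbomb.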

UPDATE: This function works for .tar.gz, .tar.bz2, .tgz, .tbz and .tar in zsh (it does not work in bash):

atb() { l=$(tar tf $1); if [ $(echo "$l" | wc -l) -eq $(echo "$l" | grep $(echo "$l" | head -n1) | wc -l) ]; then tar xf $1; else mkdir ${1%.t(ar.gz||ar.bz2||gz||bz||ar)} && tar xf $1 -C ${1%.t(ar.gz||ar.bz2||gz||bz||ar)}; fi ;}

UPDATE2: From the comments, bepaald came up with a variant that works for .tar.gz, .tar.bz2, .tgz, .tbz and .tar in bash:

atb() { shopt -s extglob ; l=$(tar tf $1); if [ $(echo "$l" | wc -l) -eq $(echo "$l" | grep $(echo "$l" | head -n1) | wc -l) ]; then tar xf $1; else mkdir ${1%.t@(ar.gz|ar.bz2|gz|bz|ar)} && tar xf $1 -C ${1%.t@(ar.gz|ar.bz2|gz|bz|ar)}; fi ; shopt -u extglob ; }
tar -xfv archive.zip
2010-10-14 08:19:16
User: vxbinaca
Functions: tar
-4

Simplicity tends to win out on commandlinefu.com. Also, why type multiple filenames when range operators work too? Saves finger abuse and time, and reduces the chance of mistakes.

tar --exclude=".??*" -zcvf ./home_backup_2008.tar.gz my_home
wget http://forums.dropbox.com && wget $(cat index.html|grep "Latest Forum Build"|cut -d"\"" -f2) && wget $(cat topic.php*|grep "Linux x86:"|cut -d"\"" -f2|sort -r|head -n1) && rm -rf ~/.dropbox* && rm index.html *.php* && tar zxvf dropbox-*.tar.gz -C ~/
tar -c directory_to_compress/ | pbzip2 -vc > myfile.tar.bz2
pbzip2 -dck <bz2file> | tar xvf -
tar -cf - ./file | lzma -c | ssh user@sshserver "cd /tmp; tar --lzma -xf -"
tar -xi < *.tar
2010-08-06 06:15:15
User: zolden
Functions: tar
1

tar doesn't support wildcards for unpacking (so you can't use tar -xf *.tar), and this is shorter and simpler than

for i in *.tar; do tar -xf $i; done (or even 'for i in *.tar; tar -xf $i' in zsh)

-i tells tar not to stop at the first end-of-file (EOF) marker
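
The same trick spelled out with an explicit pipe, which also works in bash, where 'tar -xi < *.tar' fails with an ambiguous-redirect error when more than one tar matches (a sketch):

cat *.tar | tar -xi -f -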