Terminal - Commands using tar - 196 results
cat 1.tar.gz 2.tar.gz > 3.tar.gz; tar zxvfi 3.tar.gz
ssh user@<source_host> -- tar cz <path> | ssh user@<destination_host> -- tar vxzC <path>
tar pcf - home | pv -s $(du -sb home | awk '{print $1}') --rate-limit 500k | gzip > /mnt/c/home.tar.gz
2010-04-02 15:29:03
User: Sail
Functions: awk du gzip tar
1

Tar a directory and compress it while showing progress and limiting disk I/O. Pipe Viewer (pv) displays the progress of the task and can also rate-limit the throughput, which is especially useful on busy servers.
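
For the reverse direction pv needs no du at all, since it knows the size of a regular file by itself (a sketch reusing the path from above):

pv /mnt/c/home.tar.gz | tar xzpf -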

ls | while read filename; do tar -czvf "$filename".tar.gz "$filename"; rm "$filename"; done
2010-03-29 08:10:38
User: Thingymebob
Functions: ls read rm tar
-2

Compresses each file individually, creating a $filename.tar.gz, and removes the uncompressed version. Useful if you have lots of files and don't want one huge archive containing them all. You could replace ls with ls *.pdf to perform the action only on PDFs, for example.
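
For example, the PDF-only variant (-r added so read doesn't mangle backslashes; filenames containing newlines are still unsafe):

ls *.pdf | while read -r filename; do tar -czvf "$filename".tar.gz "$filename"; rm "$filename"; done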

ssh user@host "cat /path/to/backup/backupfile.tar.bz2" |tar jpxf -
2010-03-24 01:35:28
User: mack
Functions: ssh tar
Tags: ssh tar
8

Here is how to recover the remote backup over ssh.
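
To inspect the backup first without writing anything locally, feed the same stream to tar's -t (list) mode:

ssh user@host "cat /path/to/backup/backupfile.tar.bz2" | tar jtf -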

tar jcpf - [sourceDirs] |ssh user@host "cat > /path/to/backup/backupfile.tar.bz2"
2010-03-24 01:29:25
User: mack
Functions: ssh tar
Tags: ssh tar
13

Execute it from the source host, where the files you wish to back up reside. With the minus '-', tar delivers the compressed output to standard output, which travels through the ssh session to the remote host. The backup host receives the stream on standard input and writes it to /path/to/backup/backupfile.tar.bz2.
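
A quick integrity check after the transfer, assuming bzip2 is installed on the backup host:

ssh user@host "bzip2 -t /path/to/backup/backupfile.tar.bz2"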

Server: nc -l 1234 |tar xvfpz - ;Client: tar zcfp - /path/to/dir | nc localhost 1234
2010-03-02 14:24:04
Functions: tar
2

Create a tarball on the client and send it across the network with netcat on port 1234, where it's extracted on the server in the current directory.
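
When client and server are different machines, point nc at the server's hostname rather than localhost; pv can be dropped in for progress (a sketch):

Server: nc -l 1234 | tar xvfpz - ; Client: tar zcfp - /path/to/dir | pv | nc <server_host> 1234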

ssh user@host 'tar cvzf - -C /path/to/src .' | tar xzf -
2010-03-02 14:15:17
Functions: ssh tar
0

Creates a tarball on stdout on the remote host, which is piped to a local tar reading from stdin, all over ssh.
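
The same pattern with an explicit destination directory on the local side:

ssh user@host 'tar cvzf - -C /path/to/src .' | tar xzf - -C /path/to/dest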

wget -qO - http://example.com/path/to/blah.tar.gz | tar xzf -
tar -zcvpf backup_`date +"%Y%m%d_%H%M%S"`.tar.gz `find <target> -atime +5 -type f` 2> /dev/null | parallel -X rm -f
2010-01-28 12:41:41
Functions: rm tar
-3

This deals nicely with files having special characters in the file name (space, ' or ").

Parallel is from https://savannah.nongnu.org/projects/parallel/
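
If parallel is unavailable, null-delimited names give the deletion the same safety with plain find and xargs (a sketch that deletes straight from find, skipping the archive listing):

find <target> -atime +5 -type f -print0 | xargs -0 rm -f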

svn st | cut -c 9- | parallel -X tar -czvf ../backup.tgz
2010-01-28 11:43:16
Functions: cut tar
-2

xargs deals badly with special characters (such as space, ' and "). Consider, for example, what happens if you have a file called '12" record'.

Parallel https://savannah.nongnu.org/projects/parallel/ does not have this problem.

Both solutions run into trouble if the combined file list exceeds the shell's maximum command-line length.
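
GNU tar can also read the file list from stdin itself, which sidesteps both the quoting and the line-length problem (a sketch; names containing quote characters may additionally need --verbatim-files-from on newer GNU tar):

svn st | cut -c 9- | tar -czf ../backup.tgz -T -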

tar -tf <file.tar.gz> | parallel rm
2010-01-28 08:28:16
Functions: tar
-2

xargs deals badly with special characters (such as space, ' and "). To see the problem try this:

touch important_file

touch 'not important_file'

ls not* | xargs rm

Parallel https://savannah.nongnu.org/projects/parallel/ does not have this problem.
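
With the two files from the example above in place, the parallel equivalent removes only the intended file:

ls not* | parallel rm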

tar --exclude-vcs -cf src.tar src/
dir='path to file'; tar cpf - "$dir" | pv -s $(du -sb "$dir" | awk '{print $1}') | tar xpf - -C /other/path
2010-01-19 19:05:45
User: starchox
Functions: awk dir du tar
Tags: copy tar cp
-2

This may seem like a long command, but it is great for making sure all file permissions are kept intact. It streams the files in a subshell and then untars them in the target directory. Note that the -z option should not be used for local copies: compression only adds CPU overhead with no gain, and will slow down the copy.

You can also keep it simple with cp, but then you don't get the progress info:

cp -rpf /some/directory /other/path
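
rsync offers a middle ground, preserving attributes and reporting overall progress (needs rsync 3.1 or later for --info=progress2):

rsync -a --info=progress2 /some/directory /other/path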
file='path to file'; tar -cf - "$file" | pv -s $(du -sb "$file" | awk '{print $1}') | gzip -c | ssh -c blowfish user@host tar -zxf - -C /opt/games
2010-01-19 16:02:45
User: starchox
Functions: awk du file gzip ssh tar
3

Set the file or directory to transfer in the variable, and the destination path at the end. The command uses Pipe Viewer to show progress, compresses the output on the fly, and switches the ssh cipher to blowfish for speed. Supports dirnames with spaces.

Merged ideas and comments by http://www.commandlinefu.com/commands/view/4379/copy-working-directory-and-compress-it-on-the-fly-while-showing-progress and http://www.commandlinefu.com/commands/view/3177/move-a-lot-of-files-over-ssh
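
Note that recent OpenSSH releases have dropped the blowfish cipher; a hypothetical equivalent with a currently supported cipher:

file='path to file'; tar -cf - "$file" | pv -s $(du -sb "$file" | awk '{print $1}') | gzip -c | ssh -c aes128-ctr user@host tar -zxf - -C /opt/games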

wtzc () { wget "$@"; foo=`echo "$@" | sed 's:.*/::'`; tar xzvf $foo; blah=`echo $foo | sed 's:.*/::'`; bar=`echo $blah | sed -e 's/\(.*\)\..*/\1/' -e 's/\(.*\)\..*/\1/'`; cd $bar; ls; }
2010-01-17 11:25:47
User: oshazard
Functions: cd sed tar wget
-1

Combines a few repetitive tasks when compiling source code. Especially useful when a hyphen in a filename breaks tab completion.

1.) wget source.tar.gz

2.) tar xzvf source.tar.gz

3.) cd source

4.) ls

From there you can run ./configure, make, etc.
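
Typical usage (URL hypothetical), leaving you inside the unpacked source tree:

wtzc http://example.com/source-1.0.tar.gz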

curl http://example.com/a.gz | tar xz
find . \! -type d | rev | sort | rev | tar c --files-from=- --format=ustar | bzip2 --best > a.tar.bz2
2009-12-20 14:04:39
User: pornel
Functions: bzip2 c++ find rev sort tar
2

Avoids creating useless directory entries in archive, and sorts files by (roughly) extension, which is likely to group similar files together for better compression. 1%-5% improvement.

wget -O - http://example.com/a.gz | tar xz
tar -cf - . | pv -s $(du -sb . | awk '{print $1}') | gzip > out.tgz
2009-12-18 17:09:08
User: opertinicy
Functions: awk du gzip tar
25

What happens here is we tell tar to create (-c) an archive of all files in the current dir "." (recursively) and write the data to stdout (-f -). Next we give pv the size (-s) of all files in the current dir: du -sb . | awk '{print $1}' returns the number of bytes, which is fed as the -s parameter to pv. Finally we gzip the whole stream and write the result to out.tgz. This way pv knows how much data is still left to be processed and can show, for example, that another 4 mins 49 secs remain.

Credit: Peteris Krumins http://www.catonmat.net/blog/unix-utilities-pipe-viewer/

tar -cvzf arch.tgz $(find /path/dir -not -type d)
2009-12-15 13:46:54
User: pysquared
Functions: find tar
3

If you give tar a list of filenames, it will not add the directories, so if you don't care about directory ownership or permissions, you can save some space.

Tar will create directories as necessary when extracting.

This command is limited by the maximum supported size of the argument list, so if you are trying to tar up the whole OS for instance, you may just get "Argument list too long".
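
GNU tar can avoid that limit by reading a null-delimited file list from stdin (a sketch):

find /path/dir -not -type d -print0 | tar -czvf arch.tgz --null -T -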

tar -cvf /path/dir.tar /path/dir* --exclude "/path/dir/name" --exclude "/path/dir/opt"
tar -tf /path/to/file.tar
tar cf - <dir>|split -b<max_size>M - <name>.tar.
2009-11-11 01:53:33
User: dinomite
Functions: split tar
17

Create a tar file in multiple parts if it's too large for a single disk, your filesystem, etc.

Rejoin later with `cat <name>.tar.* | tar xf -`
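
For example, in 1 GiB chunks (archive name hypothetical):

tar cf - /path/to/dir | split -b 1024M - backup.tar.

cat backup.tar.* | tar xf -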

for F in $(find ./ -name "*.tgz") ; do tar -tvzf $F ; done
2009-11-11 00:50:52
User: alchandia
Functions: find tar
Tags: tar
-2

The magic is performed by the -t parameter, which lists the contents of each archive instead of extracting them.
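
find can also run tar directly, which copes with spaces in the .tgz paths:

find . -name "*.tgz" -exec tar -tvzf {} \;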