What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes respectively - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).

May 19, 2015 - A Look At The New Commandlinefu
I've put together a short writeup on what kind of newness you can expect from the next iteration of clfu. Check it out here.
March 2, 2015 - New Management
I'm Jon, I'll be maintaining and improving clfu. Thanks to David for building such a great resource!

Psst. Open beta.

Wow, didn't really expect you to read this far down. The latest iteration of the site is in open beta. It's a gentle open beta, not in prime time just yet. It's being hosted over at UpGuard and you are more than welcome to give it a shot. A couple of things:

  • The open beta is running a copy of the database that will not carry over to the final version. Don't post anything you don't mind losing.
  • If you wish to use your user account, you will probably need to reset your password.
Your feedback is appreciated via the form on the beta page. Thanks! -Jon & CLFU Team

Terminal - Commands using tar - 207 results
tar -c directory_to_compress/ | pbzip2 -vc > myfile.tar.bz2
pbzip2 -dck <bz2file> | tar xvf -
tar -cf - ./file | lzma -c | ssh user@host "cd /tmp; tar --lzma -xf -"
tar -xi < *.tar
2010-08-06 06:15:15
User: zolden
Functions: tar

tar doesn't support wildcards when unpacking (so you can't use tar -xf *.tar), and this is shorter and simpler than

for i in *.tar; do tar -xf $i; done (or even 'for i in *.tar; tar -xf $i' in zsh)

-i tells tar not to stop at the first end-of-archive marker (EOF), i.e. at the zero blocks between concatenated archives.
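
In shells where redirecting from a glob is ambiguous when several archives match, a portable sketch of the same idea is to concatenate them and let tar skip the zero blocks in between (--ignore-zeros is GNU tar's long form of -i):

cat *.tar | tar -xvf - --ignore-zeros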

tar cvzf - /wwwdata | ssh user@host "dd of=/backup/wwwdata.tar.gz"
tar tfz filename.tgz | xargs rm -Rf
COPYFILE_DISABLE=true tar cvf newTarFile.tar Directory/
2010-07-01 09:36:48
User: alainkaa
Functions: tar
Tags: macosx

Using the COPYFILE_DISABLE=true environment variable you can prevent tar from adding any AppleDouble (._*) files to your tar file on Mac OS X.
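
A quick way to confirm the archive came out clean is to list its members and search for AppleDouble entries (a sketch, reusing the newTarFile.tar name from above):

COPYFILE_DISABLE=true tar cvf newTarFile.tar Directory/
tar tf newTarFile.tar | grep '\._' || echo "no ._ files in archive"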

find /protocollo/paflow -type f -mtime +5 | xargs tar -cvf /var/dump-protocollo/$(date +%d%m%Y)_archive.tar
2010-06-29 12:43:30
User: 0disse0
Functions: find tar xargs
Tags: find tar dump

The following command finds all files not modified in the last 5 days under the /protocollo/paflow directory and creates an archive file under /var/dump-protocollo named in the format ddmmyyyy_archive.tar.
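
If any of those paths contain spaces, xargs will split them apart. A null-delimited sketch, assuming GNU find and GNU tar, sidesteps that:

find /protocollo/paflow -type f -mtime +5 -print0 | tar -cvf /var/dump-protocollo/$(date +%d%m%Y)_archive.tar --null -T -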

tar -czf ../header.tar.gz $(find . -name '*.h')
2010-06-27 23:44:48
Functions: find tar
Tags: Linux tar

This is a shortcut to tar up all files matching a wildcard. tar doesn't have an --include option (apparently).
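
As with the find example above, unquoted command substitution splits on whitespace; a null-delimited variant of the same shortcut (GNU find and tar assumed) is safer:

find . -name '*.h' -print0 | tar --null -T - -czf ../header.tar.gz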

tar zxvf package.tar.gz --strip 1
2010-05-18 21:01:23
User: voyeg3r
Functions: tar

Useful if I need to extract only script.sh from "folder/script.sh".
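
--strip is GNU tar's abbreviation of --strip-components. A sketch of the effect, assuming the archive contains folder/script.sh:

tar ztf package.tar.gz                          # lists folder/script.sh
tar zxvf package.tar.gz --strip-components=1    # extracts it as just script.sh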

cat 1.tar.gz 2.tar.gz | tar zxvif -
2010-05-09 03:50:00
Functions: cat tar

You don't need to create an intermediate file: just pipe the output directly to the tar command and use stdin as the file (put a dash after the f flag).

cat 1.tar.gz 2.tar.gz > 3.tar.gz; tar zxvfi 3.tar.gz
ssh user@<source_host> -- tar cz <path> | ssh user@<destination_host> -- tar vxzC <path>
tar pcf - home | pv -s $(du -sb home | awk '{print $1}') --rate-limit 500k | gzip > /mnt/c/home.tar.gz
2010-04-02 15:29:03
User: Sail
Functions: awk du gzip tar

Tar a directory and compress it while showing progress and limiting disk I/O. Pipe Viewer (pv) displays the progress of the task and can also rate-limit the stream, which is especially useful on busy servers.
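
The same trick works in reverse when restoring; a sketch, with the restore target path assumed:

pv --rate-limit 500k /mnt/c/home.tar.gz | tar xpzf - -C /restore/target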

ls | while read -r filename; do tar -czvf "$filename".tar.gz "$filename"; rm "$filename"; done
2010-03-29 08:10:38
User: Thingymebob
Functions: ls read rm tar

Compresses each file individually, creating $filename.tar.gz, and removes the uncompressed version. Useful if you have lots of files and don't want one huge archive containing them all. You could replace ls with ls *.pdf to perform the action only on PDFs, for example.
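
A glob-based sketch of the same loop avoids parsing ls output and only deletes each file after its archive was written successfully:

for filename in *; do [ -f "$filename" ] && tar -czf "$filename.tar.gz" "$filename" && rm "$filename"; done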

ssh user@host "cat /path/to/backup/backupfile.tar.bz2" | tar jpxf -
2010-03-24 01:35:28
User: mack
Functions: ssh tar
Tags: ssh tar

Here's how to restore the remote backup over ssh.
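
Before extracting, you can verify that the remote archive is readable end-to-end (a sketch; user@host stands in for your server):

ssh user@host "cat /path/to/backup/backupfile.tar.bz2" | tar jtf - > /dev/null && echo "archive OK"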

tar jcpf - [sourceDirs] | ssh user@host "cat > /path/to/backup/backupfile.tar.bz2"
2010-03-24 01:29:25
User: mack
Functions: ssh tar
Tags: ssh tar

Execute this on the source host, where the files you wish to back up reside. With the minus '-', tar delivers the compressed output to standard output, which travels through the ssh session to the remote host. On the other end, the backup host receives the stream on standard input and writes it to /path/to/backup/backupfile.tar.bz2.
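
If pv is installed, the same pipeline can report throughput on the way out (a sketch):

tar jcpf - [sourceDirs] | pv | ssh user@host "cat > /path/to/backup/backupfile.tar.bz2"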

Server: nc -l 1234 | tar xvfpz - ; Client: tar zcfp - /path/to/dir | nc localhost 1234
2010-03-02 14:24:04
Functions: tar

Create a tarball on the client and send it across the network with netcat on port 1234, where it's extracted on the server in the current directory (replace localhost with the server's address; see the split-out sketch below).
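
Split out, with server.example.com as a placeholder for the server's real address:

# on the receiving server
nc -l 1234 | tar xvfpz -
# on the sending client
tar zcfp - /path/to/dir | nc server.example.com 1234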

ssh user@<host> 'tar cvzf - -C /path/to/src .' | tar xzf -
2010-03-02 14:15:17
Functions: ssh tar

Create a tarball on the remote host, written to stdout and piped over ssh to a local tar reading from stdin.
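
The same idea works in the push direction, sending a local tree to the remote host (host and destination path are placeholders):

tar czf - -C /path/to/src . | ssh user@<host> 'tar xzf - -C /path/to/dest'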

wget -qO - http://example.com/path/to/blah.tar.gz | tar xzf -
tar -zcvpf backup_`date +"%Y%m%d_%H%M%S"`.tar.gz `find <target> -atime +5 -type f` 2> /dev/null | parallel -X rm -f
2010-01-28 12:41:41
Functions: rm tar

This deals nicely with files having special characters in their names (space, ' or ").

Parallel is from https://savannah.nongnu.org/projects/parallel/

svn st | cut -c 9- | parallel -X tar -czvf ../backup.tgz
2010-01-28 11:43:16
Functions: cut tar

xargs deals badly with special characters (such as space, ' and "). In this case if you have a file called '12" record'.

Parallel https://savannah.nongnu.org/projects/parallel/ does not have this problem.

Both solutions work badly if the list of files exceeds the shell's maximum command-line length.
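
GNU tar can read the file list itself via -T -, which avoids the shell's length limit entirely (a sketch; names containing newlines will still break it):

svn st | cut -c 9- | tar -czvf ../backup.tgz -T -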

tar -tf <file.tar.gz> | parallel rm
2010-01-28 08:28:16
Functions: tar

xargs deals badly with special characters (such as space, ' and "). To see the problem try this:

touch important_file

touch 'not important_file'

ls not* | xargs rm

Parallel https://savannah.nongnu.org/projects/parallel/ does not have this problem.
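
With GNU xargs, splitting on newlines only is another way around the problem (a sketch; -d is a GNU extension):

tar -tf <file.tar.gz> | xargs -d '\n' rm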

tar --exclude-vcs -cf src.tar src/
dir='path to directory'; tar cpf - "$dir" | pv -s $(du -sb "$dir" | awk '{print $1}') | tar xpf - -C /other/path
2010-01-19 19:05:45
User: starchox
Functions: awk dir du tar
Tags: copy tar cp

This may seem like a long command, but it is great for making sure all file permissions are kept intact. It streams the files in a sub-shell and then untars them in the target directory. Please note that the -z option should not be used for local copies: the compression adds CPU overhead without any performance benefit and will slow down the copy.

You can also keep it simple with cp, but then you don't get the progress info:

cp -rpf /some/directory /other/path
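
If rsync is available, it also preserves permissions and gives per-file progress, without the double-tar plumbing (a sketch):

rsync -aP /some/directory /other/path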