

Commands using rsync - 69 results
find . -name "whatever.*" -print0 | rsync -av --files-from=- --from0 ./ ./destination/
rsync -az /home/user/test user@sshServer:/tmp/
2009-08-25 10:45:15
User: peshay
Functions: rsync
Tags: ssh file move
14

Copy files to an ssh server with gzip compression.

rsync --partial --progress --rsh=ssh user@host:remote-file local-file
2009-08-25 09:32:07
User: alvinx
Functions: rsync
2

Resume a partial scp file transfer with rsync.

rsync -a --delete --link-dest=../lastbackup $folder $dname/
2009-08-04 07:08:54
User: pamirian
Functions: rsync
6

dname is a directory named something like 20090803 for Aug 3, 2009. lastbackup is a soft link to the last backup made - say 20090802. $folder is the folder being backed up. Because this uses hard linking, files that already exist and haven't changed take up almost no space yet each date directory has a kind of "snapshot" of that day's files. Naturally, lastbackup needs to be updated after this operation. I must say that I can't take credit for this gem; I picked it up from somewhere on the net so long ago I don't remember where from anymore. Ah, well...

Systems that are only somewhat slicker than this cost hundreds or even thousands of dollars - but we're HACKERS! We don't need no steenkin' commercial software... :)
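A rough sketch of the full daily cycle, assuming (hypothetically) that snapshots live under /backup and the folder being saved is /home/user:

# hypothetical layout: snapshots under /backup, data in /home/user
folder=/home/user
dname=/backup/$(date +%Y%m%d)
mkdir -p "$dname"
rsync -a --delete --link-dest=../lastbackup "$folder" "$dname"/
# repoint the lastbackup symlink at the snapshot we just made
ln -sfn "$dname" /backup/lastbackup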

rsync -rtvu --modify-window=1 --progress /media/SOURCE/ /media/TARGET/
2009-07-05 07:40:10
User: 0x2142
Functions: rsync
Tags: backup rsync
12

This will back up the _contents_ of /media/SOURCE to /media/TARGET where TARGET is formatted with NTFS. The --modify-window lets rsync ignore the less accurate timestamps of NTFS.

rsync -e 'ssh -p PORT' user@host:SRC DEST
2009-06-05 16:52:43
Functions: rsync
3

Tested on Cygwin and Fedora 9.

Good to remember for those jobs where you cannot set a site-specific connect option in your ~/.ssh/config file.
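For comparison, the site-specific alternative mentioned above would be a stanza along these lines in ~/.ssh/config (host alias, hostname and port are placeholders), after which the plain form needs no -e option:

Host myhost
    HostName real.hostname.example
    Port 2222

rsync -av user@myhost:SRC DEST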

rsync --partial --progress --rsh=ssh $file_source $user@$host:$destination_file
2009-04-01 13:13:14
User: dr_gogeta86
Functions: rsync
23

It can resume a failed secure copy (useful when you transfer big files like db dumps through a VPN) using rsync.

It requires rsync installed on both hosts.

local -> remote:

rsync --partial --progress --rsh=ssh $file_source $user@$host:$destination_file

or remote -> local:

rsync --partial --progress --rsh=ssh $user@$host:$remote_file $destination_file

curlftpfs ftp://YourUsername:YourPassword@YourFTPServerURL /tmp/remote-website/ && rsync -av /tmp/remote-website/* /usr/local/data_latest && umount /tmp/remote-website
2009-03-31 18:01:00
User: nadavkav
Functions: rsync umount
7

Connect to a remote server using the FTP protocol over a FUSE file system, rsync the remote folder to a local one, and then unmount the remote FTP server (FUSE FS).

It can be divided into 3 separate commands (see the split version below); you need curlftpfs and rsync installed.
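The same operation split into its three steps (same placeholder credentials and paths as above; the mount point must exist first):

mkdir -p /tmp/remote-website
curlftpfs ftp://YourUsername:YourPassword@YourFTPServerURL /tmp/remote-website/
rsync -av /tmp/remote-website/* /usr/local/data_latest
umount /tmp/remote-website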

rsync -avz -e 'ssh -A sshproxy ssh' srcdir remhost:dest/path/
2009-03-25 21:29:07
User: totoro
Functions: rsync
5

If you have lots of remote hosts sitting "behind" an ssh proxy host, there is a special-case use of rsync that lets you copy directories and files across the ssh proxy host without having to do two explicit copies: the '-e' option allows a replacement "rsh" command. We use this option to specify an ssh tunnel command with the '-A' option, which causes authentication agent requests to be forwarded back to the local host. If you have ssh set up correctly, the above command can be done without any passwords being entered.
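If you'd rather keep the tunnelling out of the rsync command line, a roughly equivalent setup (assuming a reasonably recent OpenSSH; host names are placeholders) is a ProxyCommand stanza in ~/.ssh/config, after which a plain rsync over ssh goes through the proxy automatically:

Host remhost
    ProxyCommand ssh -W %h:%p sshproxy

rsync -avz -e ssh srcdir remhost:dest/path/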

rsync --rsync-path 'sudo rsync' username@source:/folder/ /local/
2009-03-25 21:18:55
User: Alioth
Functions: rsync
Tags: rsync
18

If your user has sudo on the remote box, you can rsync data as root without needing to log in as root. This is very helpful if the remote box does not allow root to log in over SSH (which is a common security restriction).
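Note that sudo cannot prompt for a password over rsync's pipe, so this generally assumes the remote account may run rsync via sudo without a password - for example a sudoers entry along these lines (username and rsync path are placeholders; edit with visudo):

username ALL=(root) NOPASSWD: /usr/bin/rsync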

rsync -avz -e ssh --files-from=<(find -mtime +30 -mtime -60) source dest
2009-03-13 12:58:28
User: voyeg3r
Functions: find rsync ssh
6

Rsync from source to dest all files last modified between 30 and 60 days ago.
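A concrete (hypothetical) example: archive files under /var/log last modified between 30 and 60 days ago, feeding rsync paths relative to the source directory so --files-from resolves them correctly:

rsync -avz -e ssh --files-from=<(cd /var/log && find . -type f -mtime +30 -mtime -60) /var/log/ user@backuphost:/archive/logs/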

rsync -vazuK --exclude "*.mp3" --exclude "*.svn*" * user@host:/path
2009-02-27 19:58:02
User: sudopeople
Functions: rsync
Tags: svn exclude
2

Rsyncs files to a server, excluding the listed files.

A file can also be used to hold common exclude rules and/or to exclude a large number of files, like so:

rsync --exclude-from '~/.scripts/exclude.txt'

where exclude.txt has one rule per line:

*.mp3

*.svn*
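A fuller (hypothetical) invocation combining the exclude file with the transfer above; note that a quoted ~ is not expanded by the shell, so spell the path out (or use $HOME):

rsync -vazuK --exclude-from="$HOME/.scripts/exclude.txt" * user@host:/path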

rsync -Pz user@remotehost:/path/file.dat .
rsync -avz -e ssh username@hostname:/path/to/remote/dir/ /path/to/local/dir/
rsync --partial --progress --rsh=ssh SOURCE DESTINATION
2009-02-16 16:22:10
User: episodeiv
Functions: rsync
14

Put it into your sh startup script (I use

alias scpresume='rsync --partial --progress --rsh=ssh'

in bash). When a file transfer via scp has aborted, just use scpresume instead of scp and rsync will copy only the parts of the file that haven't yet been transmitted.
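With the alias in place, restarting a dead transfer is just (paths are placeholders):

scpresume user@host:/path/to/bigfile.iso .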

rsync -e "/usr/bin/ssh -p22" -a --progress --stats --delete -l -z -v -r -p /root/files/ user@remote_server:/root/files/
2009-02-11 02:44:00
User: storm
Functions: rsync
7

Create an exact mirror of the local folder "/root/files" on the remote server 'remote_server' over SSH (listening on port 22).

(any files and folders on the destination that are not present in the source will be deleted)

rsync -zav --progress original_files_directory/ root@host(IP):/path/to/destination/
2009-02-10 17:01:04
User: haithamg
Functions: rsync
1

Copying files from one server to another using rsync. Root access needs to be allowed on the destination.

rsync -av --progress ./file.txt user@host:/path/to/dir
2009-02-06 11:51:51
User: aoiaoi
Functions: rsync
7

Transfer files from localhost to a remote host.

rsync -av -e ssh user@host:/path/to/file.txt .
2009-01-26 13:39:24
User: root
Functions: rsync
1

You will be prompted for a password unless you have your public keys set up.
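A minimal sketch of setting up key-based authentication, assuming OpenSSH on both ends (user and host are placeholders):

ssh-keygen -t rsa        # accept the defaults, optionally set a passphrase
ssh-copy-id user@host    # appends your public key to the remote authorized_keys
rsync -av -e ssh user@host:/path/to/file.txt .    # no password prompt this time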