Commands matching rsync (113)

  • Clones the root filesystem and can simply be re-run if interrupted; useful when moving a running system to a new partition. Also works as a solid backup solution. See the sketch below.


    0
    rsync -aHux --exclude=/proc/* --exclude=/sys/* /* /mnt/target/
    unixmonkey24812 · 2011-08-22 14:26:56 4
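
    A minimal sketch of a full run against the /mnt/target destination used above; explicitly excluding the destination guards against copying it into itself even if -x were dropped:

      # clone / into the mounted target, skipping pseudo-filesystems and the target mount
      rsync -aHux --exclude=/proc/* --exclude=/sys/* --exclude=/mnt/target/* /* /mnt/target/
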
  • -a: keep file permissions (and other attributes); --no-whole-file: use rsync's delta-transfer algorithm; --inplace: write the updated data directly to the destination file. Optional: add --remove-source-files to mv instead of cp. A usage sketch follows below.


    0
    rsync -aP --no-whole-file --inplace
    jlaunay · 2012-01-29 18:39:31 5
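
    A usage sketch for the options above (paths are hypothetical): refresh a large local copy in place, sending only the changed blocks:

      # deltas are computed and written directly into the existing destination file
      rsync -aP --no-whole-file --inplace /data/vm-disk.img /backup/vm-disk.img
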
  • Useful when upgrading my Linux distro and trying to copy only the "settings" (dotfiles) from the old home folder to the new one; see the example below.


    0
    rsync -a /path/from/.[^.]* /path/to
    magbeat · 2012-03-19 22:08:54 7
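
    For example (hypothetical paths), pulling only the dotfiles and dot-directories from a mounted copy of the old home into the new one; the .[^.]* glob deliberately skips '.' and '..':

      rsync -a /mnt/oldhome/user/.[^.]* /home/user/
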
  • The script has to be modified to be executable. SRC and DEST are relative Unix paths; the old:new pairs are the terms to be replaced. Very helpful for syncing source folders kept in different SCMs. If you don't like this one, just use rsync (a shell-only sketch follows below).


    0
    groovy -e "def output=args[0]; def terms = args[1].split(','); terms.each { it -> def keyValues = it.split(':'); output = output.replaceAll(keyValues[0],keyValues[1]); } println output;" "`diff -rq . SRC DEST`" "old1:new1,old2:new2"
    airline · 2012-03-24 02:52:22 4
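
    A rough shell-only sketch of the same idea, assuming GNU sed and using placeholder old/new terms: run the recursive diff and rewrite the terms in its output:

      diff -rq SRC DEST | sed -e 's/old1/new1/g' -e 's/old2/new2/g'
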
  • rsync will copy the source directory into the destination, and any subsequent run will synchronize only the changes from the source. See the trailing-slash note below.


    0
    rsync -avz ~/src ~/des/
    axelabs · 2012-06-01 15:08:12 9
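
    Note the trailing-slash behaviour: without it the source directory itself is created inside the destination; with it only the contents are copied:

      # creates ~/des/src/...
      rsync -avz ~/src ~/des/
      # copies the contents of ~/src directly into ~/des/
      rsync -avz ~/src/ ~/des/
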
  • Copy files and directories in parallel using GNU parallel; it is faster. Two steps: first create the directory structure in /BKP (find Files/ -type d | parallel 'mkdir -p /BKP/{}'), then copy the files into that structure (find Files/ -type f | parallel 'rsync -a {} /BKP/$(dirname {})'). Great for backups; either rsync or cp can be used. Compare it with a plain rsync or cp. A single-pass alternative is sketched below.


    0
    find Files/ -type d | parallel 'mkdir -p /BKP/{}' && find Files/ -type f | parallel 'rsync -a {} /BKP/$(dirname {})'
    phribbr · 2012-08-08 21:01:37 17
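
    A single-pass sketch of the same backup using rsync's --files-from, which preserves the listed paths relative to the source directory:

      # recreate the Files/ tree under /BKP in one rsync invocation
      find Files/ -type f | rsync -a --files-from=- . /BKP/
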

  • 0
    rsync -arl --rsh=ssh --progress --exclude-from=/etc/localbin/exclude_files.txt /var/www/html/source/* <user>@<server>:/var/www/html/source/
    szimbaro · 2013-02-27 13:45:43 5
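
    The exclude file used above holds one pattern per line; a hypothetical /etc/localbin/exclude_files.txt might look like:

      *.log
      cache/
      tmp/
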
  • This line unbuffers the interactive output of rsync's --progress flag, creating a new line for every update. That output can then be used within a script to trigger actions (or piped into a GUI generator for a progress bar); a parsing sketch follows below.


    0
    rsync --progress user@host:/path/to/source /path/to/target/ | stdbuf -oL tr '\r' '\n' >> rsyncprogress.txt
    stew_rt · 2013-03-26 11:06:45 6
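
    A rough follow-up sketch (paths are placeholders): pull just the percentage column out of the unbuffered stream so a script can act on it:

      rsync --progress user@host:/path/to/source /path/to/target/ | stdbuf -oL tr '\r' '\n' | stdbuf -oL awk '{for (i=1; i<=NF; i++) if ($i ~ /%$/) print $i}'
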
  • While editing a project under git, it is sometimes nice to sync changes immediately to a test machine. This command takes care of that if you have inotifywait installed on the development machine. Note the -R (relative) option to rsync: with rsync -R foo/bar/baz user@host:dest/dir/ it will put 'baz' in dest/dir/foo/bar/, which is what we want. This can be turned into a function for additional flexibility:

    function gitwatch() {
      if [ -z $1 ]; then
        echo "You must provide a rsync destination"
        return
      fi
      while true; do
        rsync -vR $(git ls-files | inotifywait -q -e modify -e attrib -e close_write --fromfile - --format '%w') $1
      done
    }


    0
    while true; do rsync -vR $(git ls-files | inotifywait -q -e modify -e attrib -e close_write --fromfile - --format '%w') user@host:dest/dir/; done
    leucos · 2014-01-21 10:31:41 6
  • This command allows you to mirror folders or files with rsync over a secure SSH channel with a forced HMAC integrity algorithm. Use this if you are absolutely adamant about preserving data integrity while mirroring a set of files. --partial is for resumability. See the note below about MAC names on newer OpenSSH versions.


    0
    rsync -av -e "ssh -o MACs=hmac-ripemd160" --progress --partial user@remotehost://path/to/remote/stuff .
    RAKK · 2014-02-01 00:46:38 6
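
    Newer OpenSSH releases have dropped the ripemd160 MACs, so the exact MAC name may need updating; a similar sketch with a MAC current versions still offer (assuming both ends support it):

      rsync -av -e "ssh -o MACs=hmac-sha2-512-etm@openssh.com" --progress --partial user@remotehost:/path/to/remote/stuff .
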
  • Forgot to use pv or rsync and want to know how much has been copied? Watch the destination folder's listing refresh (see also the sketch below).


    0
    watch ls -lh /path/to/folder
    vonElfensenf · 2014-03-27 10:51:36 8
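
    A related sketch: watch the total size of the destination grow instead of the listing (path is a placeholder):

      watch -n 2 du -sh /path/to/folder
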
  • Copy a local directory to a remote server using ssh+tar (assume server is lame and does not have rsync).


    0
    tar -vzc /path/to/cool/directory | ssh -q my_server 'tar -vzx -C /'
    regulatre · 2014-07-31 18:42:57 4
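
    The same trick in the pull direction (hypothetical host and paths): run tar remotely and unpack locally:

      ssh -q my_server 'tar -vzc /path/to/cool/directory' | tar -vzx -C /
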
  • Useful when you need to back up, copy or sync a folder over SSH with a non-standard port number.


    0
    rsync -arvz -e 'ssh -p 2233' --progress --delete remote-user@remote-server.org:/path/to/folder /path/to/local/folder
    nadavkav · 2014-09-26 10:42:26 8
  • Functionally the same as the Microsoft Robocopy (https://en.wikipedia.org/wiki/Robocopy) command below, but with the benefits of compression and of optionally specifying a user: robocopy /e [//host]/source/path [//host]/destination/path. Options: -a: archive mode (recursive, copy symlinks as symlinks, preserve permissions, modification times, group, owner, device and special files); -hh: numbers in human-readable K=1024 format (a single h gives K=1000); -m: don't copy empty directories; -z: use compression (if both source and destination are local it is faster to omit this); --progress: show progress during the transfer (implies --verbose); --stats: print a summary when the transfer finishes. A usage sketch follows below.


    0
    rsync -ahhmz --progress --stats [[user@]host:]/source/path/ [[user@]host:]/destination/path/
    juangmorales · 2014-11-13 18:52:45 20
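
    A usage sketch with concrete (hypothetical) endpoints, mirroring a local tree to a remote host as a given user:

      rsync -ahhmz --progress --stats /srv/data/ backup@nas:/volume1/data/
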
  • Include in your .bashrc


    0
    alias smv="rsync --remove-source-files -varP"
    debuti · 2014-11-17 08:07:59 8
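
    With the alias in place, usage looks like a "move" that shows progress and survives interruption (paths are hypothetical); the source file is removed only after it has been fully transferred:

      smv ~/Downloads/big.iso /mnt/archive/
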
  • To help store and keep important but rarely used commands I resorted to this: a basic for loop which, fed a list of commands, searches the history and appends any reference to each command or string to a file named [command name]_hist.txt. Revising it to also include root/sudo'd commands is probably critical; the revised version is:

    for i in docker elinks ufw fail2ban awk sed grep diff nginx apt bash for function bower github rsync sshfs who scp sftp tugboat aws pip npm ssh mysql php 8000 8080 3000 python serve s3ql s3cmd s3api s3 bash init wget; do cat /home/ray/.bash_history | grep -i "$i" >> /home/ray/histories/"${i}"_hist.txt; sudo cat /root/.bash_history | grep -i "$i" >> /home/ray/histories/"${i}"_sudo_hist.txt; done

    Then a simple more to look through a particular result:

    more -s -40 -p -f -d tugboat*txt

    Simple; it solved my problem and alerted me to a lack of certain appearances of commands, which signals a bit of an issue. Not so sold on the usefulness as to warrant a bash function or further convenience or logic; we shall see. It could use some tweaking, but what command doesn't!


    0
    for i in [enter list of commands]; do history |grep -i "$i" >> ~/histories/"${i}"_hist.txt;done
    rayanthony · 2014-12-16 03:37:02 8

  • 0
    rsync -v --ignore-existing `ls | head -n 40` root@localhost:/location
    zluyuer · 2014-12-16 04:08:59 8
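
    For the entry above: the backtick expansion breaks on filenames containing spaces; a sketch of a more tolerant variant feeds the names to rsync on stdin instead (directories in the list are created but not descended into unless -r is added):

      ls | head -n 40 | rsync -av --ignore-existing --files-from=- . root@localhost:/location
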
  • With these options rsync won't waste time copying files that are already present and will show you what's going on. Optionally add --dry-run to see what would be changed without actually changing anything (sketch below).


    0
    sudo rsync -aXS --ignore-existing --update --progress
    hobopowers · 2015-05-06 13:19:16 12
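
    A dry-run sketch (hypothetical paths): preview what would be transferred without changing anything:

      sudo rsync -aXS --ignore-existing --update --progress --dry-run /source/dir/ /dest/dir/
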
  • See the last commands executed by users, here filtered to rsync; lastcomm is part of the process-accounting tools (acct/psacct) and requires process accounting to be enabled.


    0
    lastcomm rsync
    seca · 2015-05-21 15:18:42 9
  • Run rsync over SSH where root login is not allowed. You need "ForwardAgent yes" in your ssh config (sketched below) and must run ssh-add before you log in to the machine where you want to sync the files to.


    0
    rsync -rlptgozP -e "ssh" --rsync-path="sudo rsync" user@nodename:/folder/ /folder
    Raboo · 2015-10-08 09:57:57 11
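
    A sketch of the matching ~/.ssh/config entry mentioned above (the host name is a placeholder); run ssh-add first so your key is loaded in the agent:

      Host nodename
          ForwardAgent yes
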
  • Especially useful when syncing to an Amazon EC2 instance; -avz stands for archive, verbose, compress.


    0
    rsync -e 'ssh -i /root/my.pem' -avz /mysql/db/data_summary.* ec2-1-2-4-9.compute-1.amazonaws.com:/mysql/test/
    shantanuo · 2020-02-11 13:45:32 114
  • This command will back up the entire / directory, excluding the /dev, /proc, /sys, /tmp, /run, /mnt, /media, /home and /lost+found directories. Breaking the command down: rsync: a fast, versatile, local and remote file-copying utility; -aAXv: transfer in "archive" mode, which ensures that symbolic links, devices, permissions, ownerships, modification times, ACLs and extended attributes are preserved; /: the source directory; --exclude: excludes the given directories from the backup; <backup path>: the backup destination folder. Be mindful that you must exclude the destination directory if it exists on the local system, to avoid an infinite loop. To restore the backup, just reverse the source and destination paths (sketch below).


    0
    rsync -aAXv / --exclude={"/dev/*","/proc/*","/sys/*","/tmp/*","/run/*","/mnt/*","/media/*","/home/*","/lost+found/*"} <backup path> > <path_of log file>
    vinabb · 2017-07-26 13:33:50 18
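
    To restore, reverse the source and destination as described above (a sketch; <backup path> stays a placeholder):

      rsync -aAXv <backup path>/ /
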

  • 0
    rsync -rP --exclude=x source/ target/
    ctcrnitv · 2017-08-03 22:08:01 17
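
    For the entry above: -r recurses, -P shows progress and keeps partial transfers, and --exclude skips anything matching the pattern; the same idea with several (placeholder) excludes:

      rsync -rP --exclude=x --exclude='*.tmp' source/ target/
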

  • 0
    rsync -avz -e "ssh -p $portNumber" user@remote.host:/path/to/copy /local/path
    aysadk · 2017-09-08 17:51:55 26

  • 0
    rsync -chavzP --stats user@remote.host:/path/to/copy /path/to/local/storage
    aysadk · 2017-09-08 17:52:15 19

