Backup entire system

cd / ; tar -cvpzf backup.tar.gz --exclude=/backup.tar.gz --exclude=/proc --exclude=/lost+found --exclude=/sys --exclude=/mnt --exclude=/media --exclude=/dev /
Back up your entire system as a gzipped tarball, excluding pseudo-filesystems (/proc, /sys, /dev) and mount points such as /mnt and /media.

0
By: strzel_a
2011-07-20 15:44:07
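
To restore from this archive, a minimal sketch could look like the following; the /mnt/target mount point is an assumption (any path where the target root filesystem is mounted, e.g. from a live/rescue system, will do):

# Assumption: target root is mounted at /mnt/target and backup.tar.gz was copied there.
tar -xvpzf /mnt/target/backup.tar.gz -C /mnt/target --numeric-owner
# Recreate the mount points that were excluded from the archive:
mkdir -p /mnt/target/{proc,sys,mnt,media,dev,lost+found}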

These Might Interest You

  • This will connect to your hosted website service through the cPanel interface and use its backup tool to back up and download the entire website locally. (Do not forget to replace YourUsername, YourPassword and YourWebsiteUrl for it to work.)


    4
    wget --http-user=YourUsername --http-password=YourPassword http://YourWebsiteUrl:2082/getbackup/backup-YourWebsiteUrl-`date +"%-m-%d-%Y"`.tar.gz
    nadavkav · 2009-03-31 17:50:41 1
  • This command can be added to crontab to run a nightly backup of directories, keeping only the last 10 backup files (a sample crontab entry is shown after this list).


    0
    for file in $(find /var/backup -name "backup*" -type f |sort -r | tail -n +10); do rm -f $file; done ; tar czf /var/backup/backup-system-$(date "+\%Y\%m\%d\%H\%M-\%N").tgz --exclude /home/dummy /etc /home /opt 2>&- && echo "system backup ok"
    akiuni · 2014-09-24 14:04:11 0
  • This command will back up the entire / directory, excluding the /dev, /proc, /sys, /tmp, /run, /mnt, /media, /home and /lost+found directories. Breaking the command down: rsync is a fast, versatile, local and remote file-copying utility; -aAXv transfers the files in "archive" mode, which ensures that symbolic links, devices, permissions, ownerships, modification times, ACLs and extended attributes are preserved; / is the source directory; --exclude excludes the given directories from the backup; <backup path> is the backup destination. Be mindful that you must exclude the destination directory if it exists on the local system, to avoid an infinite loop. To restore the backup, just reverse the source and destination paths in the command (a restore sketch is shown after this list).


    0
    rsync -aAXv / --exclude={"/dev/*","/proc/*","/sys/*","/tmp/*","/run/*","/mnt/*","/media/*","/home/*","/lost+found/*"} <backup path> > <path_of log file>
    vinabb · 2017-07-26 13:33:50 0
  • 'data' is the directory to back up; 'backup' is the directory that stores the snapshots. Back up files on a regular basis using hard links: very efficient and quick, and the backup data is directly accessible. This is the approach explained at http://blog.interlinked.org/tutorials/rsync_time_machine.html, in one line. When using du to check the size of your backups, the first backup accounts for all the space, and subsequent backups only for the files that have changed.


    1
    rsync -av --link-dest=$(ls -1d /backup/*/ | tail -1) /data/ /backup/$(date +%Y%m%d%H%M)/
    dooblem · 2010-08-05 19:36:24 0
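
For the nightly crontab backup above, a sample crontab entry might look like this; the 02:30 schedule is an assumption, and the \% escapes stay as written because cron treats an unescaped % as a newline:

30 2 * * * for file in $(find /var/backup -name "backup*" -type f |sort -r | tail -n +10); do rm -f $file; done ; tar czf /var/backup/backup-system-$(date "+\%Y\%m\%d\%H\%M-\%N").tgz --exclude /home/dummy /etc /home /opt 2>&- && echo "system backup ok"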

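To restore the rsync-based full-system backup above by reversing source and destination, a minimal sketch could be as follows; both /mnt/backup and /mnt/target are assumed example paths:

# Assumption: the backup was written to /mnt/backup and the target root is mounted at /mnt/target.
rsync -aAXv /mnt/backup/ /mnt/target/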