Commands tagged backup (65)

  • Remember to back up everything before changing it, so you can restore it all to normal.


    16
    cp httpd.conf{,.bk}
    ideivid · 2011-08-15 16:43:53 4
  • This will create an exact duplicate image of your hard drive, which you can later restore by simply reversing the "if" and "of" locations:

        sudo dd if=/media/disk/backup/sda.backup of=/dev/sda

    Alternatively, you can use an SSH connection to do your backups:

        dd if=/dev/sda | ssh user@ssh.server.com dd of=~/backup/sda.backup


    15
    sudo dd if=/dev/sda of=/media/disk/backup/sda.backup
    bandit36 · 2009-02-27 20:23:37 2

  • 14
    curl -u username -o bookmarks.xml https://api.del.icio.us/v1/posts/all
    avi4now · 2009-04-06 13:54:15 3
  • This will back up the _contents_ of /media/SOURCE to /media/TARGET, where TARGET is formatted with NTFS. The --modify-window option lets rsync ignore NTFS's less precise timestamps.


    13
    rsync -rtvu --modify-window=1 --progress /media/SOURCE/ /media/TARGET/
    0x2142 · 2009-07-05 07:40:10 0
  • Use `zless` to read the content of the resulting *rss.gz file:

        zless commandlinefu-contribs-backup-2009-08-10-07.40.39.rss.gz


    10
    curl http://www.commandlinefu.com/commands/by/<your username>/rss | gzip - > commandlinefu-contribs-backup-$(date +%Y-%m-%d-%H.%M.%S).rss.gz
    linuxrawkstar · 2009-08-10 12:43:33 0
  • Connect to a remote server over the FTP protocol using a FUSE file system, rsync the remote folder to a local one, and then unmount the remote FTP server (FUSE FS). It can be divided into three separate commands (see the sketch below), and you need curlftpfs and rsync installed.


    9
    curlftpfs ftp://YourUsername:YourPassword@YourFTPServerURL /tmp/remote-website/ && rsync -av /tmp/remote-website/* /usr/local/data_latest && umount /tmp/remote-website
    nadavkav · 2009-03-31 18:01:00 3
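
    The same operation split into its three separate commands, one per step (a sketch; the credentials, URL and paths are the same placeholders as above):

        curlftpfs ftp://YourUsername:YourPassword@YourFTPServerURL /tmp/remote-website/
        rsync -av /tmp/remote-website/* /usr/local/data_latest
        umount /tmp/remote-website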
  • A dear friend of mine asked me how to copy a DVD to your hard drive. If you want to make a copy of the ISO image that was burned to a CD or DVD, insert that medium into your CD/DVD drive and (assuming /dev/cdrom is associated with your computer's CD drive) type the following command.


    9
    dd if=/dev/cdrom of=whatever.iso
    0disse0 · 2009-09-05 09:19:41 3
  • Suppose you made a backup of your hard disk with dd:

        dd if=/dev/sda of=/mnt/disk/backup.img

    This command enables you to mount a partition from inside this image, so you can access your files directly. Substitute PARTITION=1 with the number of the partition you want to mount (as reported by sfdisk -d yourfile.img). A modern alternative that skips the offset arithmetic is sketched below.


    8
    INFILE=/path/to/your/backup.img; MOUNTPT=/mnt/foo; PARTITION=1; mount "$INFILE" "$MOUNTPT" -o loop,offset=$[ `/sbin/sfdisk -d "$INFILE" | grep "start=" | head -n $PARTITION | tail -n1 | sed 's/.*start=[ ]*//' | sed 's/,.*//'` * 512 ]
    Alanceil · 2009-03-06 21:29:13 3
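
    If your util-linux provides losetup with partition scanning (the -P/--partscan flag; an assumption about your system), the offset arithmetic can be skipped entirely. A minimal sketch:

        sudo losetup --find --show --partscan "$INFILE"   # prints the loop device, e.g. /dev/loop0
        sudo mount /dev/loop0p1 "$MOUNTPT"                # partition 1 appears as loop0p1
        sudo umount "$MOUNTPT" && sudo losetup -d /dev/loop0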
  • This is freaking sweet!!! Here is the full alias (I didn't want to cause display problems on commandlinefu.com's homepage):

        alias tarred='( ( D=`builtin pwd`; F=$(date +$HOME/`sed "s,[/ ],#,g" <<< ${D/${HOME}/}`#-%F.tgz); S=$SECONDS; tar --ignore-failed-read --transform "s,^${D%/*},`date +${D%/*}.%F`,S" -czPf "$F" "$D" && logger -s "Tarred $D to $F in $(($SECONDS-$S)) seconds" ) & )'

    It creates a .tgz archive of whatever directory it is run from, in the background, detached from the current shell, so it will still complete if you log out. You can also run it as many times as you want: if the .tgz archive already exists, it is simply moved to a numbered backup ('--backup=numbered').

    The coolest part is the transformation performed by tar and sed: the archive file names are created automatically, and extracting the archive is completely safe thanks to the transform command. If you archive, say, /home/tombdigger/new-stuff-to-backup/, it will create the archive /home/#home#tombdigger#new-stuff-to-backup#-2010-11-18.tgz. When you extract it, like tar -xvzf #home#tombdigger#new-stuff-to-backup#-2010-11-18.tgz, instead of overwriting an existing /home/tombdigger/new-stuff-to-backup/ directory it will extract to /home/tombdigger/new-stuff-to-backup.2010-11-18/. Basically, the tar archive filename is the PWD with all '/' replaced by '#', and the date is appended to the name so that multiple archives are easily managed. This example saves all archives to $HOME/archive-name.tgz, but I have a $BKDIR variable with my backup location for each shell user, so I just replaced HOME with BKDIR in the alias. So when I ran this in /opt/askapache/SOURCE/lockfile-progs-0.1.11/, the archive was created at /askapache-bk/#opt#askapache#SOURCE#lockfile-progs-0.1.11#-2010-11-18.tgz.

    Upon completion, it uses the universal logger tool to report its completion to syslog and stderr (printed to your terminal); just remove that part if you don't want it, or remove the '-s' option from logger to keep the logs only in syslog and off your terminal. Here's how my syslog server recorded this:

        2010-11-18T00:44:13-05:00 gravedigger.askapache.com (127.0.0.5) [user] [notice] (logger:) Tarred /opt/askapache/SOURCE/lockfile-progs-0.1.11 to /askapache-bk/tarred/#opt#SOURCE#lockfile-progs-0.1.11#-2010-11-18.tgz in 4 seconds

    Caveats: Really, this is very robust and foolproof. The only issue I ever have with it (I've been using it for years on my web servers) is if you run it in a directory and a file in that directory then changes: you get a warning message, and your archive might have a problem for the changed file. This happens when running it in a logs directory, a temp dir, etc. That's the only issue I've ever had; it's really nothing more than a heads-up.

    Advanced: This is a simple alias, and very useful, as it works on basically every Linux box with a semi-current tar plus GNU coreutils, bash, and sed. But if you want to customize it or pass parameters (like a directory to back up instead of the PWD), check out this function I use; it is what I created the alias from, BTW, replacing my aa_status function with logger and adding the $SECONDS runtime instead of using tar's --totals (the filename expression here is a reconstruction to match the alias; the posting was garbled at that point):

        function tarred () { local GZIP='--fast' PWD=${1:-`pwd`} F=$(date +${BKDIR}/%m-%d-%g-%H%M-`sed -u 's/[\/\ ]/#/g' <<< ${PWD}`.tgz); [[ ! -r "$PWD" ]] && echo "Bad permissions for $PWD" 1>&2 && return 2; ( ( tar --totals --ignore-failed-read --transform "s@^${PWD%/*}@`date +${PWD%/*}.%m-%d-%g`@S" -czPf $F $PWD && aa_status "Completed Tarp of $PWD to $F" ) & ) } # From my .bash_profile http://www.askapache.com/linux-unix/bash_profile-functions-advanced-shell.html


    8
    alias tarred='( ( D=`builtin pwd`; F=$(date +$HOME/`sed "s,[/ ],#,g" <<< ${D/${HOME}/}`#-%F.tgz); tar --ignore-failed-read --transform "s,^${D%/*},`date +${D%/*}.%F`,S" -czPf "$F" "$D" &>/dev/null ) & )'
    AskApache · 2010-11-18 06:24:34 0
  • Create an archive of files with access time older than 5 days, and remove the original files. (A variant that is safer with spaces in file names is sketched below.)


    7
    tar -zcvpf backup_`date +"%Y%m%d_%H%M%S"`.tar.gz `find <target> -atime +5` 2> /dev/null | xargs rm -fr ;
    angleto · 2009-05-26 17:15:52 2
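
    The backtick expansion above breaks on file names that contain spaces. A minimal sketch of a more robust variant, assuming GNU tar (for --null, -T - and --remove-files); <target> is the same placeholder as above:

        find <target> -type f -atime +5 -print0 | tar --null -T - --remove-files -zcvpf backup_`date +"%Y%m%d_%H%M%S"`.tar.gz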
  • This is a BASH feature. The above command will create a backup of "filename" called "filename.DATE", where DATE is the current day in %Y%m%d format (year, month and day numbers run together). A quick illustration of the expansion follows below.


    6
    cp filename{,.`date +%Y%m%d`}
    fernandomerces · 2011-04-02 06:41:26 0
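
    Brace expansion happens before the command runs, so you can preview what will actually be executed with echo (the date shown is just an example):

        $ echo cp filename{,.`date +%Y%m%d`}
        cp filename filename.20110402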
  • This will connect to your hosted website service through the cPanel interface and use its backup tool to back up and download the entire website locally. (Do not forget to replace YourUsername, YourPassword and YourWebsiteUrl for it to work.)


    5
    wget --http-user=YourUsername --http-password=YourPassword http://YourWebsiteUrl:2082/getbackup/backup-YourWebsiteUrl-`date +"%-m-%d-%Y"`.tar.gz
    nadavkav · 2009-03-31 17:50:41 1
  • This command clones the first partition of the primary master IDE drive to the second partition of the primary slave IDE drive. (!!! Back up all data before trying anything like this !!!)


    5
    sudo dd if=/dev/hda1 of=/dev/hdb2
    0disse0 · 2009-09-05 09:16:52 0

  • 5
    mysqldump -uUSERNAME -pPASSWORD database | gzip > /path/to/db/files/db-backup-`date +%Y-%m-%d`.sql.gz ;find /path/to/db/files/* -mtime +5 -exec rm {} \;
    nadavkav · 2009-10-28 19:49:39 1
  • This script creates date-based backups of the given files. It copies each file to the same place as the original, but with an additional extension: the timestamp of the copy, in YearMonthDay-HourMinuteSecond format. A usage example follows below.


    5
    backup() { for i in "$@"; do cp -va $i $i.$(date +%Y%m%d-%H%M%S); done }
    polaco · 2009-11-10 20:59:45 2
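
    A quick usage illustration (the file names and the timestamp in the output are just examples; cp -v prints each copy it makes):

        $ backup notes.txt todo.txt
        'notes.txt' -> 'notes.txt.20091110-205945'
        'todo.txt' -> 'todo.txt.20091110-205945'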
  • Sometimes, in a hurry, you may move or copy a file using an already existing file name. If you have aliased the cp and mv commands with the -i option, you are prompted for confirmation before overwriting, but if your aliases aren't there you will lose the target file! The -b option forces the mv command to check whether the destination file already exists; if it does, a backup copy with a trailing ~ is created (see the example below).


    5
    mv -b old_file_name new_and_already_existent_file_name
    ztank1013 · 2011-09-08 23:57:15 0
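
    A quick illustration of the behavior (the file names are hypothetical):

        $ ls
        draft.txt  report.txt
        $ mv -b draft.txt report.txt
        $ ls
        report.txt  report.txt~     # the old report.txt survives as report.txt~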
  • Apart from an exact copy of your recent contents, this also keeps all earlier versions of files and folders that were modified or deleted. Inspired by the excellent EVACopy http://evacopy.sourceforge.net


    5
    backup() { source=$1 ; rsync --relative --force --ignore-errors --no-perms --chmod=ugo=rwX --delete --backup --backup-dir=$(date +%Y%m%d-%H%M%S)_Backup --whole-file -a -v $source/ ~/Backup ; } ; backup /source_folder_to_backup
    pascalv · 2018-08-02 21:27:29 0
  • Instead of calculating the offset and providing an offset option to mount, let lomount do the job for you by just providing the partition number you would like to loop mount.


    4
    lomount -diskimage /path/to/your/backup.img -partition 1 /mnt/foo
    olorin · 2009-07-22 11:32:52 1
  • The coolest way I've found to back up a WordPress MySQL database using encryption, with local variables created directly from the wp-config.php file so that you don't have to type them, which would allow someone sniffing your terminal or viewing your shell history to see your info. I use a variation of this for my servers that have hundreds of WordPress installs and databases, using a find command for the wp-config.php files and passing them to my function; a sketch of that variation follows below.


    4
    eval $(sed -n "s/^d[^D]*DB_\([NUPH]\)[ASO].*',[^']*'\([^']*\)'.*/_\1='\2'/p" wp-config.php) && mysqldump --opt --add-drop-table -u$_U -p$_P -h$_H $_N | gpg -er AskApache >`date +%m%d%y-%H%M.$_N.sqls`
    AskApache · 2009-08-18 07:03:08 0
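
    A minimal sketch of that multi-install variation. The wpbackup helper name and the /var/www search root are assumptions, and a while-read loop stands in for xargs, since xargs cannot call shell functions directly:

        wpbackup() { ( cd "$(dirname "$1")" && eval $(sed -n "s/^d[^D]*DB_\([NUPH]\)[ASO].*',[^']*'\([^']*\)'.*/_\1='\2'/p" wp-config.php) && mysqldump --opt --add-drop-table -u$_U -p$_P -h$_H $_N | gpg -er AskApache >`date +%m%d%y-%H%M.$_N.sqls` ); }
        find /var/www -name wp-config.php -print0 | while IFS= read -r -d '' f; do wpbackup "$f"; done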
  • Use if you have pictures all over the place and you want to copy them to a central location. Synopsis:
    - find jpg files
    - translate all file names to lowercase
    - back up existing files, don't overwrite, preserve mode, ownership and timestamps
    - copy to a central location
    A per-file variant is sketched below.


    4
    find . -iname "*.jpg" -print0 | tr '[A-Z]' '[a-z]' | xargs -0 cp --backup=numbered -dp -u --target-directory {location} &
    oracular · 2009-12-10 08:47:04 0
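
    Note that tr in the pipeline above lowercases the entire path, not just the file name, so cp may be asked to copy from paths that don't exist. A minimal per-file sketch that lowercases only the destination name; {location} is the same placeholder as above:

        find . -iname "*.jpg" -print0 | while IFS= read -r -d '' f; do
            cp --backup=numbered -dp -u "$f" "{location}/$(basename "$f" | tr '[A-Z]' '[a-z]')"
        done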
  • Prior to working on/modifying a file, use the 'install -m' command, which can copy files, create directories, and set their permissions at the same time. Useful when you are working in the public_html folder and need to keep the copied file hidden.


    4
    install -m 0400 foo bar/
    op4 · 2015-03-02 13:20:38 0
  • 1. You don't need to prepend the year with 20 - just use %Y instead of %y. 2. You may want to make your function a bit more secure:

        buf () { cp ${1?filename not specified}{,$(date +%Y%m%d_%H%M%S)}; }


    3
    buf () { cp $1{,$(date +%Y%m%d_%H%M%S)}; }
    unefunge · 2010-12-14 14:02:03 0
  • This will compress the root directory to an external hard drive and split the dump into parts once it reaches the 4 GB file-system limit. You can simply restore it with:

        restore ivf /media/My\ Passport/Fedora10bckup/root_dump_fedora


    2
    dump -0 -M -B 4000000 -f /media/My\ Passport/Fedora10bckup/root_dump_fedora -z2 /
    luqmanux · 2009-07-02 20:25:22 0
  • Opens a snapshot of a live UFS2 filesystem, runs dump to generate a full filesystem backup, and runs it through gzip. The filesystem must support snapshots and have a .snap directory in the filesystem root. To restore the backup, one can do:

        zcat /path/to/adXsYz.dump.gz | restore -rf -


    2
    dump -0Lauf - /dev/adXsYz | gzip > /path/to/adXsYz.dump.gz
    tensorpudding · 2010-07-19 00:54:40 2
  • 'data' is the directory to back up; 'backup' is the directory to store the snapshots in. Back up files on a regular basis using hard links. Very efficient and quick; backup data is directly available. Same as explained here: http://blog.interlinked.org/tutorials/rsync_time_machine.html in one line. Using du to check the size of your backups, the first backup accounts for all the space, and each later backup only for the files that changed. (A wrapper that handles the very first run is sketched below.)


    2
    rsync -av --link-dest=$(ls -1d /backup/*/ | tail -1) /data/ /backup/$(date +%Y%m%d%H%M)/
    dooblem · 2010-08-05 19:36:24 0
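
    On the very first run there is no previous snapshot, so the $(ls ...) substitution for --link-dest comes up empty. A minimal wrapper sketch, assuming the same /data and /backup paths:

        prev=$(ls -1d /backup/*/ 2>/dev/null | tail -1)
        new=/backup/$(date +%Y%m%d%H%M)/
        if [ -n "$prev" ]; then rsync -av --link-dest="$prev" /data/ "$new"; else rsync -av /data/ "$new"; fi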