
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).



News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try and put an end to the spamming, new commands require moderation before they will appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.

Terminal - Commands tagged backup - 55 results
7zr a -mx=9 -ms=on -mhc=on -mtc=off db_backup.sql.7z db_dump.sql
2010-10-22 21:05:58
User: mandx
1

This is the best option combination I've found for compressing my database dumps. It's possible that other options or values might improve the compression ratio, but these are the ones that worked; the syntax for 7zr is a little messy...
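
To get the dump back out later, 7zr's l and x modes list and extract the archive:

7zr l db_backup.sql.7z

7zr x db_backup.sql.7z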

mksnap_ffs /var /var/.snap/snap_var_`date "+%Y-%m-%d"` ; mdconfig -a -t vnode -f /var/.snap/snap_var_`date "+%Y-%m-%d"` -u 1; mount -r /dev/md1 /mnt
2010-09-18 11:37:03
User: bugmenot
Functions: mount
0

(FreeBSD)

Once you've made the snapshot you can resume any stopped services and then back up the file system (using the snapshot) without having to worry about changed files.
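
For example, the mounted snapshot can be archived with tar (a sketch; the destination path is an assumption):

tar -czf /backup/var-`date "+%Y-%m-%d"`.tar.gz -C /mnt .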

When finished, the snapshot can be removed:

umount /mnt

mdconfig -d -u 1

rm /var/.snap/snap_var_`date "+%Y-%m-%d"`

rsync -av --link-dest=$(ls -1d /backup/*/ | tail -1) /data/ /backup/$(date +%Y%m%d%H%M)/
2010-08-05 19:36:24
User: dooblem
Functions: date ls rsync tail
Tags: backup rsync
1

'data' is the directory to back up; 'backup' is the directory where snapshots are stored.

Backup files on a regular basis using hard links. Very efficient, quick. Backup data is directly available.

Same as explained here:

http://blog.interlinked.org/tutorials/rsync_time_machine.html

in one line.

If you use du to check the size of your backups, the first backup accounts for all the space; each later backup is only charged for the files that changed.
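
A minimal cron-ready sketch of the same idea, with hypothetical paths and a guard for the very first run:

#!/bin/sh
# snapshot.sh - hard-link incremental backups via rsync --link-dest
SRC=/data/
DST=/backup
NEW="$DST/$(date +%Y%m%d%H%M)"
LAST=$(ls -1d "$DST"/*/ 2>/dev/null | tail -1)
if [ -n "$LAST" ]; then
    rsync -av --link-dest="$LAST" "$SRC" "$NEW/"
else
    rsync -av "$SRC" "$NEW/"   # first run: full copy, nothing to link against
fi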

dump -0Lauf - /dev/adXsYz | gzip > /path/to/adXsYz.dump.gz
2010-07-19 00:54:40
Functions: dump gzip
2

Opens a snapshot of a live UFS2 filesystem, runs dump to generate a full filesystem backup which is run through gzip. The filesystem must support snapshots and have a .snap directory in the filesystem root.

To restore the backup, one can do

zcat /path/to/adXsYz.dump.gz | restore -rf -
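
To sanity-check the archive without restoring anything, restore's t mode prints the dump's table of contents:

zcat /path/to/adXsYz.dump.gz | restore -tf -
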
rsync --delete -az -e 'ssh -c blowfish -i /your/.ssh/backup_key -ax' /path/to/backup remote-host:/dest/path/
split -b4m file.tgz file.tgz. ; for i in file.tgz.*; do SUBJ="Backup Archive"; MSG="Archive File Attached"; echo "$MSG" | mutt -a "$i" -s "$SUBJ" YourEmail@(E)mail.com; done
2010-03-20 16:49:19
User: tboulay
Functions: echo split
-1

This is just a little snippet to split a large file into smaller chunks (4 MB in this example) and then send the chunks off to (e)mail for archival using mutt.

I usually encrypt the file before splitting it using openssl:

openssl des3 -salt -k <password> -in file.tgz -out file.tgz.des3

To restore, simply save attachments and rejoin them using:

cat file.tgz.* > output_name.tgz

and if encrypted, decrypt using:

openssl des3 -d -salt -k <password> -in file.tgz.des3 -out file.tgz
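
One extra precaution worth adding (a suggestion, not part of the original): record a checksum before splitting, then verify after rejoining under the original name:

md5sum file.tgz > file.tgz.md5

md5sum -c file.tgz.md5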

edit: (changed "g" to "e" for political correctness)

(svnadmin dump /path/to/repo | gzip --best > /tmp/svn-backup.gz) 2>&1 | mutt -s "SVN backup `date +\%m/\%d/\%Y`" -a /tmp/svn-backup.gz emailaddress
2010-03-08 05:49:01
User: max
Functions: dump gzip
1

Dumps a compressed svn backup to a file, and emails the file, with any error messages included as the body of the email.
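
To restore from such a dump into a fresh repository (a sketch; the new repository path is an assumption):

svnadmin create /path/to/new-repo

gunzip -c /tmp/svn-backup.gz | svnadmin load /path/to/new-repo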

tar -zcvpf backup_`date +"%Y%m%d_%H%M%S"`.tar.gz `find <target> -atime +5 -type f` 2> /dev/null | parallel -X rm -f
2010-01-28 12:41:41
Functions: rm tar
-3

This deals nicely with files that have special characters in their names (spaces, ' or ").

Parallel is from https://savannah.nongnu.org/projects/parallel/

find . -iname "*.jpg" -print0 | tr '[A-Z]' '[a-z]' | xargs -0 cp --backup=numbered -dp -u --target-directory {location} &
2009-12-10 08:47:04
User: oracular
Functions: cp find tr xargs
4

Use if you have pictures all over the place and you want to copy them to a central location

Synopsis:

Find jpg files

translate all file names to lowercase

backup existing, don't overwrite, preserve mode ownership and timestamps

copy to a central location
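
Note that with --backup=numbered, GNU cp renames any file it would otherwise overwrite using numbered suffixes, so clashing names accumulate side by side (names illustrative):

photo.jpg photo.jpg.~1~ photo.jpg.~2~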

backup() { for i in "$@"; do cp -va "$i" "$i.$(date +%Y%m%d-%H%M%S)"; done }
2009-11-10 20:59:45
User: polaco
Functions: cp date
Tags: backup copy date
4

This function creates date-based backups of files. It copies each file to the same place as the original, but with an additional extension that is the timestamp of the copy, in the format YearMonthDay-HourMinuteSecond.
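
Usage (the resulting name is illustrative):

backup httpd.conf

This leaves the original untouched and creates a copy named like httpd.conf.20091110-205945.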

tar pzcvf /result_path/result.tar.gz /target_path/target_folder
2009-11-10 11:17:00
User: CafeNinja
Functions: tar
0

The command as given creates the file "/result_path/result.tar.gz" with the contents of the target folder, preserving permissions and the sub-folder structure.
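
To restore with permissions intact (run as root; extracting relative to / is an assumption based on how the archive was created):

tar xzpf /result_path/result.tar.gz -C /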

mysqldump -uUSERNAME -pPASSWORD database | gzip > /path/to/db/files/db-backup-`date +%Y-%m-%d`.sql.gz ;find /path/to/db/files/* -mtime +5 -exec rm {} \;
python fsrecovery.py -P 0 -f <path-to-instance>/Data.fs <path-to-instance-destination>/Data.fs.packed
tar czf /path/archive_of_foo.`date -I`.tgz /path/foo
2009-09-07 05:45:33
Functions: tar
Tags: backup tar
1

Creates a compressed tar archive of the files in /path/foo and writes it to a timestamped filename in /path.

tar --create --file /path/$HOSTNAME-my_name_file-$(date -I).tar.gz --atime-preserve -p -P --same-owner -z /path/
2009-09-07 04:52:12
User: Odin_sv
Functions: date tar
Tags: backup tar
1

Use the tar command to create a backup whose file name includes its creation date.

dd if=/dev/cdrom of=whatever.iso
2009-09-05 09:19:41
User: 0disse0
Functions: dd
Tags: backup dd iso dvd
7

A dear friend of mine asked me how to copy a DVD to a hard drive. If you want to make a copy of the ISO image that was burned to a CD or DVD, insert that medium into your CD/DVD drive and (assuming /dev/cdrom is associated with your computer's CD drive) type the following command
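
To double-check the image afterwards, it can be loop-mounted read-only and browsed (the mount point is an assumption):

mount -o loop,ro whatever.iso /mnt/iso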

sudo dd if=/dev/hda1 of=/dev/hdb2
2009-09-05 09:16:52
User: 0disse0
Functions: dd sudo
5

This command clones the first partition of the primary master IDE drive to the second partition of the primary slave IDE drive (!!! back up all data before trying anything like this !!!)
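
A small optional tweak: dd defaults to 512-byte blocks, so specifying a larger block size usually speeds the copy up considerably without changing the result:

sudo dd if=/dev/hda1 of=/dev/hdb2 bs=4M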

eval $(sed -n "s/^d[^D]*DB_\([NUPH]\)[ASO].*',[^']*'\([^']*\)'.*/_\1='\2'/p" wp-config.php) && mysqldump --opt --add-drop-table -u$_U -p$_P -h$_H $_N | gpg -er AskApache >`date +%m%d%y-%H%M.$_N.sqls`
2009-08-18 07:03:08
User: AskApache
Functions: eval gpg sed
3

The coolest way I've found to back up a wordpress mysql database using encryption, with local variables created directly from the wp-config.php file so that you don't have to type them (typing them would allow someone sniffing your terminal or viewing your shell history to see your info).

I use a variation of this for my servers that have hundreds of wordpress installs and databases by using a find command for the wp-config.php file and passing that through xargs to my function.
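
A sketch of that variation (the wrapper name and web root are hypothetical; the subshell keeps each cd contained):

wpdump() { cd "$(dirname "$1")" && eval $(sed -n "s/^d[^D]*DB_\([NUPH]\)[ASO].*',[^']*'\([^']*\)'.*/_\1='\2'/p" wp-config.php) && mysqldump --opt --add-drop-table -u$_U -p$_P -h$_H $_N | gpg -er AskApache >`date +%m%d%y-%H%M.$_N.sqls`; }

find /var/www -name wp-config.php | while read -r f; do (wpdump "$f"); done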

curl http://www.commandlinefu.com/commands/by/<your username>/rss | gzip - > commandlinefu-contribs-backup-$(date +%Y-%m-%d-%H.%M.%S).rss.gz
2009-08-10 12:43:33
Functions: date gzip
10

Use `zless` to read the content of your *rss.gz file:

zless commandlinefu-contribs-backup-2009-08-10-07.40.39.rss.gz
lomount -diskimage /path/to/your/backup.img -partition 1 /mnt/foo
4

Instead of calculating the offset and providing an offset option to mount, let lomount do the job for you by just providing the partition number you would like to loop mount.
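
For comparison, the manual route means reading the partition's start sector from fdisk -lu and passing the byte offset yourself (the 63-sector start here is only illustrative):

mount -o loop,offset=$((63*512)) /path/to/your/backup.img /mnt/foo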

/opt/psa/bin/pleskbackup server -v --output-file=plesk_server.bak
rsync -rtvu --modify-window=1 --progress /media/SOURCE/ /media/TARGET/
2009-07-05 07:40:10
User: 0x2142
Functions: rsync
Tags: backup rsync
12

This will back up the _contents_ of /media/SOURCE to /media/TARGET, where TARGET is formatted with NTFS. The --modify-window option lets rsync ignore the less accurate timestamps of NTFS.
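
A cautious habit before the real run (a suggestion, not part of the original): add -n (--dry-run) to preview what would be transferred:

rsync -rtvun --modify-window=1 /media/SOURCE/ /media/TARGET/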

dump -0 -M -B 4000000 -f /media/My\ Passport/Fedora10bckup/root_dump_fedora -z2 /
2009-07-02 20:25:22
User: luqmanux
Functions: dump
Tags: backup
2

This will compress the root directory to an external hard drive, splitting the dump into parts once a part reaches the 4 GB file-size limit of the drive's file system.

You can simply restore it with:

restore ivf /media/My\ Passport/Fedora10bckup/root_dump_fedora
cd <YOUR_DIRECTORY>; for i in *; do tar czvf "$i".tar.gz "$i" ; done
2009-06-11 18:33:27
User: ElAlecs
Functions: cd tar
-3

Very simple and useful: replace <YOUR_DIRECTORY> with your directory, and every item inside it gets its own .tar.gz archive.

tar -zcvpf backup_`date +"%Y%m%d_%H%M%S"`.tar.gz `find <target> -atime +5` 2> /dev/null | xargs rm -fr ;
2009-05-26 17:15:52
User: angleto
Functions: rm tar xargs
Tags: backup
7

Creates an archive of files with access times older than 5 days, and removes the original files.