#1 2013-11-08 08:44:05

pentago
Member
From: Serbia
Registered: 2013-10-31
Posts: 29

[SOLVED] File backup strategy

Hi,

I have a cronjob running a backup script every 8 hours to make timestamped tar archives of important stuff.
Basically, I just wanted to know if there's some more elegant way of achieving the same or a better effect, as, for some reason, I don't feel this is the right way of doing it.

0 */8 * * * sh /home/pentago/.backup/cron-backup > /dev/null 2>&1

Backup script:

#!/bin/zsh

cd /home/pentago/.backup && \
tar cvpf backup-$(date +%d.%m.%y--%I.%M%p).tar $(cat /home/pentago/.backup/files) \
--exclude="/home/pentago/Public" \
--exclude="/home/pentago/.irssi/logs" \
--exclude="/home/pentago/.config/google-chrome" \
\
--exclude="/home/pentago/.config/sublime-text-3/Cache" \
--exclude="/home/pentago/.config/sublime-text-3/Backup" \
--exclude="/home/pentago/.config/sublime-text-3/Index" \
\
--exclude="/home/pentago/.Skype/shared_dynco" \
--exclude="/home/pentago/.Skype/shared_httpfe" \
--exclude="/home/pentago/.Skype/DbTemp" \
--exclude="/home/pentago/.Skype/shared.lck" \
--exclude="/home/pentago/.Skype/user.name/chatsync" \
--exclude="/home/pentago/.Skype/user.name/httpfe" \
--exclude="/home/pentago/.Skype/user.name/voicemail" \
--exclude="/home/pentago/.Skype/user.name/*.db" \
--exclude="/home/pentago/.Skype/user.name/*.db-journal" \
--exclude="/home/pentago/.Skype/user.name/*.lock" \
--exclude="/home/pentago/.Skype/user.name/*.lck" \
\
--exclude="/home/pentago/.Skype/user.name2/chatsync" \
--exclude="/home/pentago/.Skype/user.name2/httpfe" \
--exclude="/home/pentago/.Skype/user.name2/voicemail" \
--exclude="/home/pentago/.Skype/user.name2/*.db" \
--exclude="/home/pentago/.Skype/user.name2/*.db-journal" \
--exclude="/home/pentago/.Skype/user.name2/*.lock" \
--exclude="/home/pentago/.Skype/user.name2/*.lck"

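For comparison, here is a hedged sketch of the same script with a timestamp format that sorts chronologically and a simple rotation step, so old tarballs don't pile up. The demo paths (a temp directory and a tiny file list) are assumptions so the sketch is self-contained; point `BACKUP_DIR` and the file list at the real ~/.backup to use it for real.

```shell
#!/bin/sh
# Sketch only: sortable timestamps plus simple rotation.
# The demo data below is an assumption so the script runs standalone.
BACKUP_DIR=$(mktemp -d)                  # stand-in for /home/pentago/.backup
KEEP=3                                   # number of archives to retain

mkdir -p "$BACKUP_DIR/data"
echo hello > "$BACKUP_DIR/data/note.txt"
printf '%s\n' "$BACKUP_DIR/data" > "$BACKUP_DIR/files"

cd "$BACKUP_DIR" || exit 1
# %Y-%m-%d--%H.%M.%S sorts chronologically, unlike %d.%m.%y--%I.%M%p
stamp=$(date +%Y-%m-%d--%H.%M.%S)
tar cpf "backup-$stamp.tar" -T files     # -T reads the file list; add --exclude=... as before

# keep only the newest $KEEP archives (names now sort chronologically)
ls backup-*.tar | head -n -"$KEEP" | xargs -r rm --
```

Because the archive names sort chronologically, the rotation is a one-liner instead of date arithmetic.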
Backed up files:

/home/pentago/Downloads
/home/pentago/Music
/home/pentago/Pictures
/home/pentago/Public
/home/pentago/Videos
/home/pentago/.config
/home/pentago/.asoundrc
/home/pentago/.elinks
/home/pentago/.filezilla
/home/pentago/.fonts
/home/pentago/.gimp-2.8
/home/pentago/.irssi
/home/pentago/.mpd
/home/pentago/.mpdconf
/home/pentago/.packages
/home/pentago/.share-credentials
/home/pentago/.Skype
/home/pentago/.ssh
/home/pentago/.vim
/home/pentago/.gtk-bookmarks
/home/pentago/.tmux.conf
/home/pentago/.vimrc
/home/pentago/.xinitrc
/home/pentago/.Xresources
/home/pentago/.zshrc
/etc/fstab
/etc/hostname
/etc/locale.conf
/etc/locale.gen
/etc/locale.nopurge
/etc/localtime
/etc/ntp.conf
/etc/ssh/sshd_config
/etc/pacman.conf
/etc/makepkg.conf
/etc/netctl/wlp3s0
/etc/netctl/enp4s0
/etc/pacman.d/mirrorlist
/etc/samba/smb.conf
/etc/ufw/ufw.conf
/etc/vconsole.conf
/etc/X11/xorg.conf.d/50-synaptics.conf
/usr/lib/ufw/user.rules
/usr/lib/ufw/user6.rules

Thanks.

Last edited by pentago (2013-11-22 18:29:52)


#2 2013-11-08 10:51:49

graysky
Member
From: /run/user/1000
Registered: 2008-12-01
Posts: 8,444

Re: [SOLVED] File backup strategy

Keep in mind that if you're putting the backup tar files on the same disk, you risk losing them in a failure.  True backups are on different machines located in different physical locations (safe guarding against fires, theft, etc.).


CPU-optimized Linux-ck packages @ Repo-ck • AUR packages • Zsh and other configs


#3 2013-11-08 11:15:56

mr.MikyMaus
Member
From: +3600 UT
Registered: 2006-03-31
Posts: 270

Re: [SOLVED] File backup strategy

You may want to consider using another utility, such as rdiff-backup, for making incremental backups. Doing a full backup every 8 hours takes a humongous amount of precious hard drive space, not to mention I/O and thus time.

Also, as graysky pointed out, use a different device for backups; ideally something off-site.

-m.


What happened to Arch's KISS? systemd sure is stupid but I must have missed the simple part ...

... and who is general Failure and why is he reading my harddisk?


#4 2013-11-08 11:24:23

graysky
Member
From: /run/user/1000
Registered: 2008-12-01
Posts: 8,444

Re: [SOLVED] File backup strategy

backintime in the AUR is a nice alternative to rdiff-backup as well.




#5 2013-11-08 11:51:52

mr.MikyMaus
Member
From: +3600 UT
Registered: 2006-03-31
Posts: 270

Re: [SOLVED] File backup strategy

Forgot to mention encryption. When backing up off-site to untrusted storage (say Dropbox or another proprietary cloud, a VPS, etc.), encryption should be in place if you value your privacy and abide by copyright laws (some may prohibit you from making copies of multimedia and/or software - I know, stupid, but...)




#6 2013-11-08 12:05:19

bulletmark
Member
From: Brisbane, Australia
Registered: 2013-10-22
Posts: 241

Re: [SOLVED] File backup strategy

I've used rsnapshot for backups for years. Still not seen anything better.


#7 2013-11-08 12:39:01

stqn
Member
Registered: 2010-03-19
Posts: 1,189

Re: [SOLVED] File backup strategy

I use rsync first to copy changed stuff to another disk, then tar the copy to keep a few backups at different points in time.

I used to use rdiff-backup but didn't find it very practical (it was also very slow, but I can't say whether my current method is faster).


#8 2013-11-12 16:53:59

pentago
Member
From: Serbia
Registered: 2013-10-31
Posts: 29

Re: [SOLVED] File backup strategy

rsnapshot is nice, but I prefer my backups small and full every time.
As my backups are only /home config files and a couple of /etc system configs, they are usually very small, ~30 MB, which is far more convenient than backing up the whole disk, which is way too big to back up entirely. I keep important stuff like my music and photo collections on my laptop as well; I don't care about other stuff like software, etc.

These tarballs are made in a directory which is synced to Dropbox, so it's basically off-site (thanks for worrying, @graysky) :)

My original post had something else in mind: to figure out whether the script could be written better, but when I saw that rsnapshot, which is clearly an awesome project, uses similar logic, I gave up on it.
Tar is the way to go for me due to its convenience and flexibility.


#9 2013-11-22 18:29:30

pentago
Member
From: Serbia
Registered: 2013-10-31
Posts: 29

Re: [SOLVED] File backup strategy

This is what I came up with, and it works beautifully:
https://coderwall.com/p/s0x1ga?i=1&p=1& … 5D=pentago


#10 2013-12-05 05:28:26

dgbaley27
Member
Registered: 2011-07-22
Posts: 47

Re: [SOLVED] File backup strategy

As graysky said, this isn't exactly a backup, it's more a history of your files. If you're willing to use BTRFS (I am, on many computers, without a problem) you might be better off using subvolumes and snapshots. I set up a script which takes snapshots of my home directory and is called through cron like:

   */10    *      *       *       *           snap  10m  12
     0     *      *       *       *           snap  1h   24
     0    0,12    *       *       *           snap  hd   14
     0     13     *       *       *           snap  1d   30

The first argument to snap is an arbitrary label and the second is the number of copies of that label to keep (first in, first out). So I keep 2 hours of 10-minute snapshots and 2 weeks of half-day snapshots of my home directory. Also, I have ~/Local and ~/.cache as separate subvolumes so that they are not included.
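A rough, hypothetical sketch of what such a `snap <label> <keep>` helper might look like. The btrfs-specific calls are only indicated in comments (`btrfs subvolume snapshot -r` / `btrfs subvolume delete`); plain directories stand in for subvolumes so the sketch is self-contained, and `SNAP_ROOT` is an assumed location.

```shell
#!/bin/sh
# Hypothetical "snap <label> <keep>" helper with FIFO pruning.
# Plain directories stand in for btrfs subvolumes for illustration.
SNAP_ROOT=$(mktemp -d)                   # where snapshots live (assumption)

snap() {
    label=$1 keep=$2
    dest="$SNAP_ROOT/@$(date '+%Y-%m-%d %H:%M:%S.%N') - $label"
    mkdir -p "$dest"                     # btrfs: subvolume snapshot -r "$HOME" "$dest"
    # FIFO pruning: drop the oldest snapshots of this label beyond $keep
    ls -d "$SNAP_ROOT/@"*" - $label" 2>/dev/null | head -n -"$keep" |
        while IFS= read -r old; do rm -rf "$old"; done   # btrfs: subvolume delete
}

snap 10m 2
snap 10m 2
snap 10m 2                               # third call prunes the oldest "10m"
ls "$SNAP_ROOT"
```

Because snapshot names start with a sortable timestamp, "oldest first" is just the default `ls` order.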

I also take snapshots in two other places:
- Whenever I log in I run "snap profile 5" so that I have a snapshot created for my previous 5 logins (tty, X11, ssh, tmux, etc).
- Whenever I run my personal rsync wrapper to another host.

Elaborating on the second item: I have 5 machines which are configured in the same way. I manually run my rsync wrapper to sync among them (so I usually need to keep track of my most recently used machine to keep my files from diverging). The rsync wrapper calls "snap <source-hostname> 3" so that I also have snapshots created before I sync files between two hosts.

So you get the idea, I currently have all of these snapshots:

@2013-11-17 00:00:01 MST - hd               @2013-11-25 18:58:13 MST - profile  @2013-12-03 17:00:01 MST - 1h      @2013-12-04 20:00:01 MST - 1h
@2013-11-17 12:00:01 MST - hd               @2013-11-25 19:01:41 MST - spruce   @2013-12-03 18:00:01 MST - 1h      @2013-12-04 20:30:01 MST - 10m
@2013-11-17 13:00:01 MST - 1d               @2013-11-26 10:32:28 MST - profile  @2013-12-03 19:00:01 MST - 1h      @2013-12-04 20:40:01 MST - 10m
@2013-11-18 00:00:01 MST - hd               @2013-11-26 10:33:19 MST - profile  @2013-12-03 20:00:01 MST - 1h      @2013-12-04 20:50:01 MST - 10m
@2013-11-19 00:00:01 MST - hd               @2013-11-26 12:26:14 MST - profile  @2013-12-03 22:00:01 MST - 1h      @2013-12-04 21:00:01 MST - 10m
@2013-11-22 00:00:01 MST - hd               @2013-11-26 13:00:01 MST - 1d       @2013-12-03 23:00:01 MST - 1h      @2013-12-04 21:00:01 MST - 1h
@2013-11-22 09:53:13 MST - sunflower        @2013-11-26 22:52:20 MST - profile  @2013-12-04 07:00:01 MST - 1h      @2013-12-04 21:10:01 MST - 10m
@2013-11-22 09:59:24 MST - sunflower        @2013-12-02 14:02:55 MST - profile  @2013-12-04 08:00:01 MST - 1h      @2013-12-04 21:20:01 MST - 10m
@2013-11-22 12:00:01 MST - hd               @2013-12-02 14:16:07 MST - profile  @2013-12-04 09:00:01 MST - 1h      @2013-12-04 21:30:01 MST - 10m
@2013-11-23 00:00:01 MST - hd               @2013-12-03 09:00:01 MST - 1h       @2013-12-04 10:00:01 MST - 1h      @2013-12-04 21:40:01 MST - 10m
@2013-11-24 13:00:26 MST - 1d               @2013-12-03 10:00:01 MST - 1h       @2013-12-04 11:00:01 MST - 1h      @2013-12-04 21:50:01 MST - 10m
@2013-11-24 20:30:27 MST - sunflower        @2013-12-03 11:00:01 MST - 1h       @2013-12-04 12:00:01 MST - 1h      @2013-12-04 22:00:01 MST - 10m
@2013-11-25 09:34:19 MST - profile          @2013-12-03 12:00:01 MST - 1h       @2013-12-04 12:00:01 MST - hd      @2013-12-04 22:00:01 MST - 1h
@2013-11-25 09:43:12 MST - old mutt config  @2013-12-03 12:00:01 MST - hd       @2013-12-04 15:53:35 MST - spruce  @2013-12-04 22:10:01 MST - 10m
@2013-11-25 09:43:51 MST - old mutt config  @2013-12-03 13:00:01 MST - 1d       @2013-12-04 16:00:01 MST - 1h      @2013-12-04 22:20:01 MST - 10m
@2013-11-25 12:00:01 MST - hd               @2013-12-03 13:00:01 MST - 1h       @2013-12-04 17:00:01 MST - 1h      @head
@2013-11-25 13:00:01 MST - 1d               @2013-12-03 16:31:21 MST - profile  @2013-12-04 18:00:01 MST - 1h
@2013-11-25 14:54:05 MST - profile          @2013-12-03 16:31:32 MST - spruce   @2013-12-04 19:00:01 MST - 1h

It seems like a lot, but they barely take up any space.

@head is my actual home directory and is the only read/write subvolume, all of the other snapshots are immutable.

Last edited by dgbaley27 (2013-12-05 05:30:27)


#11 2013-12-05 06:35:51

fukawi2
Forum Moderator
From: .vic.au
Registered: 2007-09-28
Posts: 5,275

Re: [SOLVED] File backup strategy

dgbaley27 wrote:

As graysky said, this isn't exactly a backup, it's more a history of your files.

This is also not a backup, just a history of your files... There's no file recovery if your hard drive dies or your house burns down.


#12 2013-12-05 10:23:12

pentago
Member
From: Serbia
Registered: 2013-10-31
Posts: 29

Re: [SOLVED] File backup strategy

Thanks dgbaley27, looks like btrfs is becoming more and more interesting these days, but I somehow settled on daily tar backups; a really simple and almost effortless solution, plus you always have your data (backed up to a Dropbox folder) even if the drive dies.


#13 2013-12-05 15:11:14

dgbaley27
Member
Registered: 2011-07-22
Posts: 47

Re: [SOLVED] File backup strategy

fukawi2 wrote:
dgbaley27 wrote:

As graysky said, this isn't exactly a backup, it's more a history of your files.

This is also not a backup, just a history of your files... There's no file recovery if your hard drive dies or your house burns down.

Two things. 1) I was suggesting a nicer history-keeping system than the OP's tarball scheme. 2) My solution is a backup, because I sync among multiple computers in 4 different states across the US (including both coasts and the Rockies). My only single point of failure (if you want to call it that) is that I'm entirely using btrfs; I don't have any more ext4 volumes with my data. However, I do occasionally burn a DVD.

Last edited by dgbaley27 (2013-12-05 15:11:29)


#14 2013-12-12 10:47:53

Vamp898
Member
From: ドイツではまだ住んでいる
Registered: 2009-01-03
Posts: 866

Re: [SOLVED] File backup strategy

bulletmark wrote:

I've used rsnapshot for backups for years. Still not seen anything better.

That's quite sad. I also use rsnapshot, and I had to write a lot of wrapper scripts around it and customize it heavily.

If you do backups of multiple machines, rsnapshot is just horrible.

It's not possible to back up multiple machines in parallel, and if one of the machines is not reachable, none of the machines actually get backed up!

So rsnapshot only works if all machines are online and reachable, and you have enough time to back them up one after another.

After using rsnapshot for about 6 months, I had customised it so heavily that I just replaced it with my own backup script.

In the end, cp -al and rsync are all you need.

cd /backup
cp -al `ls -r | head -n1` `date "+%Y%m%d"`
cd `ls -r | head -n1`
rsync whatever

Maybe some additional code to check errors, some email reporting and other little stuff, and you're done :D Incremental backups, just like that.

If your old HDD is replaced with a new one, just go into the directory with the latest backup, rsync it to the new HDD, and you're done.
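Fleshed out a little, the cp -al scheme above could look like the following sketch. The seeded directory, the demo path, and the simulated change are assumptions for illustration; the commented-out rsync line marks where the real sync step from the post would go.

```shell
#!/bin/sh
# Sketch of the cp -al scheme: unchanged files are shared as hard links
# between dated directories, so each "full" backup only costs the space
# of what actually changed. Demo paths are assumptions.
BACKUP=$(mktemp -d)                      # demo stand-in for /backup

# seed a first backup so there is something to hard-link against
mkdir -p "$BACKUP/20131201"
echo v1 > "$BACKUP/20131201/file.txt"
echo same > "$BACKUP/20131201/keep.txt"

cd "$BACKUP" || exit 1
prev=$(ls -r | head -n1)                 # newest existing backup
today=$(date +%Y%m%d)
cp -al "$prev" "$today"                  # hard-link copy: near-instant, near-free
# rsync -a --delete remote:/home/ "$today"/   # the real sync step goes here
# rsync replaces a changed file by writing a new one (breaking the hard
# link), which is what keeps the old backup intact; simulate that here:
rm "$today/file.txt"
echo v2 > "$today/file.txt"
```

Note the unlink-then-recreate step: overwriting a hard-linked file in place would silently rewrite every older backup that shares the link.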


#15 2013-12-12 12:40:19

bulletmark
Member
From: Brisbane, Australia
Registered: 2013-10-22
Posts: 241

Re: [SOLVED] File backup strategy

@Vamp898, rsnapshot implements a rollback strategy if a host is not available, which uses the previous backup for that host. Read the HOWTO. It has always worked for me, as often some of the Windows PCs in my backup are not around.

Your only other complaint is that rsnapshot does not run in parallel, but neither does your home-grown script! The point is that rsnapshot, i.e. rsync under the covers, only copies what changes, so it is pretty fast.


#16 2013-12-12 23:45:13

fukawi2
Forum Moderator
From: .vic.au
Registered: 2007-09-28
Posts: 5,275

Re: [SOLVED] File backup strategy

I have to +1 rsnapshot. Works well here, including only rolling back the one host if it's not available.


#17 2013-12-16 15:22:14

Vamp898
Member
From: ドイツではまだ住んでいる
Registered: 2009-01-03
Posts: 866

Re: [SOLVED] File backup strategy

bulletmark wrote:

@Vamp898, rsnapshot implements a rollback strategy if a host is not available which uses the previous backup for that host. Read the HOWTO. Has always worked for me as often some of the windows pc's in my backup are not around.

Your only other complaint is that rsnapshot does not run in parallel, but neither does your home grown script! The point is that rsnapshot, i.e. rsync under the covers, only copies what changes so it is pretty fast.

Seems like we're talking about different things.

If a host is not available, it cannot be mounted using CIFS (for example). If it cannot be mounted, rsnapshot complains that the directory to be backed up does not exist (because it is not mounted), says you most likely have an error in your configuration, and just completely stops all backups.

My home-grown script runs in parallel just fine. It's just a very small excerpt (as mentioned), but writing a for-loop that runs this for every machine in a list, with some kind of check that not too many run in parallel, is not really much work.

It's possible with 3-6 additional lines of code, where it's nearly impossible with rsnapshot. I think that is a difference worth mentioning.

If you have a network with 50 machines and one machine only gets 1-2 MB/s because there is a huge load on that machine and a lot of network I/O, all backups have to wait for that single machine. If that machine and its backup hang, the whole backup of all machines hangs. Since the latest Windows 7 updates, I have the issue that a single machine causes rsnapshot to hang endlessly on one backup.

If you start a backup on Friday evening and on Monday morning see that it has only worked through 3 computers and then hung, that's really nothing you want to use.

Last edited by Vamp898 (2013-12-16 15:25:45)


#18 2013-12-16 22:08:53

fukawi2
Forum Moderator
From: .vic.au
Registered: 2007-09-28
Posts: 5,275

Re: [SOLVED] File backup strategy

Vamp898 wrote:

If a host is not available, it cannot be mounted using CIFS (for example). If it cannot be mounted, rsnapshot complains that the directory to be backed up does not exist (because it is not mounted), says you most likely have an error in your configuration, and just completely stops all backups.

I've never used rsnapshot that way - I always use rsync over SSH to back up remote machines. If the host is unavailable, SSH times out and rsnapshot moves on to the next host.

Your point about slow transfers from one heavily loaded machine is valid, but it affects more backup solutions than not, I'd imagine.


#19 2013-12-16 22:28:24

bulletmark
Member
From: Brisbane, Australia
Registered: 2013-10-22
Posts: 241

Re: [SOLVED] File backup strategy

Rsnapshot is just a wrapper around rsync, and it is ridiculous to use the mighty and efficient rsync on a network-mounted file system, because all files then have to be traversed over the network. No wonder such a backup arrangement is slow! Rsync should be installed on all client machines, and then rsnapshot can pull the backups (i.e. the delta changes) from those clients. I install the free DeltaCopy (a Windows rsync client) on my Windows PCs for this.

BTW, if you require mounted directories etc. for your backup, then your invocation (e.g. cron) script should check those mounts before starting rsnapshot.
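That pre-flight check might be sketched like this, assuming `mountpoint(8)` from util-linux is available; the mount list and the commented-out rsnapshot call are placeholders.

```shell
#!/bin/sh
# Guard sketch: verify required mounts exist before letting rsnapshot run.
# The mount list and the rsnapshot invocation are assumptions.
REQUIRED_MOUNTS="/"                      # e.g. "/mnt/backup /mnt/clients"

for m in $REQUIRED_MOUNTS; do
    if ! mountpoint -q "$m"; then
        echo "backup aborted: $m is not mounted" >&2
        exit 1
    fi
done
echo "all mounts present"
# rsnapshot daily                        # the real invocation would go here
```

Failing loudly before rsnapshot starts avoids the "directory does not exist" configuration error described earlier in the thread.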


#20 2013-12-17 08:49:45

Vamp898
Member
From: ドイツではまだ住んでいる
Registered: 2009-01-03
Posts: 866

Re: [SOLVED] File backup strategy

> I always use rsync over SSH to back up remote machines.

Sadly, the Windows IT people here don't want to install the badly patched and hacked Windows port of rsync, nor any SSH server, on the client machines.

But as I said, rsnapshot is just cp -al and rsync. Why should I want to hack around in rsnapshot, or build scripts around rsnapshot, when I can write my own backup script in less time that works exactly the way I want?

In my personal opinion, I don't see any use case for a Perl script wrapped around rsync and cp, as those two tools are easy enough to use without such an inflexible piece of software.

- Create a list of clients
- Create a script which creates directories with cp -al for all those clients in a for-loop
- Run rsync on the newly created directories
- You're done

> Your point about slow transfers from one heavily loaded machine is valid, but it affects more backup solutions than not, I'd imagine.

Which is even worse, as a <100-line shell script doesn't have this problem.



Powered by FluxBB