It's become increasingly clear to me that I need some way to recover my system in the event of hardware failure. I've been using BTRFS for its subvolume and snapshot features, but I've been burned several times by my lack of external backups when my main drive started dying. I've struggled to find clear instructions explaining how to back up, test, and restore a BTRFS system.
Ideally, I would be able to back up my entire system to an external drive so that when SHTF I can just restore a new drive from my backup; and ideally the backup is bootable or easily restorable so I can test the recovery process without risking my data. Many tools out there aren't great choices, from what I can tell (dd clones of a full disk carry duplicate filesystem UUIDs and are tied to the source disk's exact size, and other backup utilities aren't aware of subvolume structures, which is suboptimal). Programs do exist that appear to help with producing external backups of a BTRFS filesystem with snapshots (btrfs-clone, https://github.com/mwilck/btrfs-clone; snap-sync, https://github.com/baodrate/snap-sync and its fork dsnap-sync; btrbk, https://github.com/digint/btrbk), but it's not clear to me how I would use the results they produce to restore a system. Their use of btrfs-send also means my filesystem must be offline during the process, as far as I can tell.
Does anyone want to share a system they've come up with and used for producing full system backups of a BTRFS system and restoring from them? Am I even on the right track here looking for a bootable/recoverable backup workflow for BTRFS, considering I'd also have an EFI partition to restore and an fstab to fix either way? I'm just looking for something more than backing up files, so I don't have to spend weeks trying to remember every little tweak and fix I've applied to my system since I built it.
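For reference, the btrfs send/receive round trip that tools like btrbk and snap-sync build on looks roughly like this. The mount points and snapshot names below are made-up examples, not anything those tools mandate; note also that send operates on a read-only snapshot, so the source filesystem stays mounted and in use while the backup runs:

```shell
# Take a read-only snapshot of the root subvolume (send requires read-only):
btrfs subvolume snapshot -r / /.snapshots/root-backup
# Ship it to a btrfs filesystem on the external drive:
btrfs send /.snapshots/root-backup | btrfs receive /mnt/external/
# Restore later by sending it onto a fresh btrfs filesystem on the new disk:
btrfs send /mnt/external/root-backup | btrfs receive /mnt/newroot/
# Received subvolumes arrive read-only; snapshot a writable copy to boot from:
btrfs subvolume snapshot /mnt/newroot/root-backup /mnt/newroot/@
```

After that you would still need to point fstab and the bootloader at the new subvolume and filesystem UUID, which matches your instinct that the EFI partition and fstab need separate handling either way.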
This may not be the answer you're looking for, but given this is 3 days old and no one else has answered it, I will at least post my two cents.
I use github (or gitlab, if you prefer) to back up all of the fixes and dotfiles that I make, plus 3 separate text files:
1. Notes on commands, file locations, and weird workarounds I may forget over time.
2. A list of all the extra software I use, so I can simply copy and paste it into a
pacman -S <pasted contents>
3. A readme with instructions for getting it all going, should I for any reason forget.
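For point 2, pacman can generate and replay that list itself. The pacman lines below are shown as comments so the demo runs anywhere; the filename is just an example:

```shell
# On the old system:   pacman -Qqe > pkglist.txt        (explicitly installed)
# On the new install:  pacman -S --needed - < pkglist.txt
# Demo of the list format itself: one package name per line.
printf 'git\nrsync\nvim\n' > /tmp/pkglist.txt
wc -l < /tmp/pkglist.txt
```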
If I hit an issue that forces a new storage device or what have you, I just install Arch using the installer via USB, git clone my repo, put the files where they belong, and reboot. All good.
With decent internet this takes about 5 minutes, so not that bad.
You could even go so far as to write a small shell script that does all of that FOR you; I've seen that a lot as well. Though my repo is a revolving door of software and random snippets, so I don't feel like updating the parameters of an install script every time. Of course there are ways around that too, like scripting smarter and structuring your git repo in a more modular way.
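A toy version of such a script, with a throwaway local repo standing in for the GitHub/GitLab remote so it can run anywhere (the repo name, file names, and paths are all invented for the sketch; in real use you'd clone your remote URL and add a pacman line for the package list):

```shell
set -eu
work=$(mktemp -d)
# Build a throwaway local repo that stands in for your remote dotfiles repo:
git init -q "$work/dotfiles"
printf 'set number\n' > "$work/dotfiles/.vimrc"
git -C "$work/dotfiles" add .vimrc
git -C "$work/dotfiles" -c user.name=demo -c user.email=demo@example.com \
    commit -qm 'initial dotfiles'
# The restore steps: clone the repo and put files where they belong.
git clone -q "$work/dotfiles" "$work/newhome"
cp "$work/newhome/.vimrc" "$work/restored-vimrc"
cat "$work/restored-vimrc"
```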
As for your precious data (pictures, sounds, documents, whatever), you could just make a copy of your entire disk image somewhere else, simple as that. Though I hate doing that because recovery takes ages. I prefer hand-selecting the things I explicitly want backed up and putting those on a separate hard drive. If it's encrypted and all that jazz, well, I don't know; I never deal with encrypting my stuff, as I'm just your average user. So for me it's literally as simple as plugging the drive in, mounting it, moving the data over, unplugging it, and putting it back in safe storage.
More specifically, I have a 5 TB hard drive formatted as NTFS and not encrypted, so Windows can access it (I do game dev and like to make sure things do in fact run on Windows as well, so naturally I want to back things up from it occasionally too).
There may be much better ways of going about this, or even a proper/official way to back it up, but this is how I do it. I've been doing it for years now and it works for me and my workflow, mostly because it's not very often that I actually need to back up my data; more often I just need to back up my operating system with its fixes and programs (for the most part I keep my data on a LAN server anyway).
If that isn't the case for you, and you just want a single backup containing all of your stuff, I'd go with the back-everything-up-on-a-hard-drive method. It's a bit old school, but it certainly works.
Another benefit to doing it via git and saving your data elsewhere is that you can take those dotfiles, config files, and various fixes with you everywhere you go, without needing to pull the rest of the data along. A small perk, to me personally, as it seems like I have a new storage device every other month. Not to mention you can hand-select pieces of it to use. So if, for example, you have one PC with an NVIDIA card and applied tons of fixes for Wayland, but have another PC with an AMD card and want the same experience on both without having to manually recreate it, just clone everything except the NVIDIA fixes.
It really is quite awesome.
Again, I don't know if this really answers your question but it's what I do.
Last edited by SamuTheFrog (2024-06-29 18:45:53)
I don't use BTRFS, but I have found rsync filesystem cloning very useful for backups and for migrating to new hardware.
One time the SATA chipset of my 4-HDD LVM system broke down, making the system unusable.
I bought USB-to-SATA enclosures and a USB hub with enough ports, and accessed the drives from my old laptop.
I had also gotten a 2 TB USB drive (large enough to hold everything) and used rsync filesystem cloning to copy the content of the 4 HDDs to the new drive.
I had to change the bootloader and fstab values; then I could boot from the new drive and had my complete installation back.
The laptop was about 12 years old and had only USB 1.1 ports, so this took a lot of patience.
The system was very slow but functional.
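The "change the bootloader and fstab values" step can be sketched like this. The UUIDs below are made up; on a real system you'd read the new one with blkid and then reinstall or reconfigure your bootloader (e.g. grub-install plus grub-mkconfig from a chroot):

```shell
# Toy /etc/fstab stand-in carrying the old drive's UUID:
printf 'UUID=1111-2222 / ext4 defaults 0 1\n' > /tmp/fstab.demo
# After cloning, swap in the new drive's UUID (found via blkid):
sed -i 's/UUID=1111-2222/UUID=3333-4444/' /tmp/fstab.demo
cat /tmp/fstab.demo
```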
When I moved from one SSD to another I just rsync'd to an external hard drive, then rsync'd back to the new SSD. I wasn't very knowledgeable about the process so I didn't exclude /tmp, /sys, and so on, but it worked perfectly anyway (after adjusting the partition/drive ID). You do need to make sure that you preserve permissions and extended attributes though. The wiki recommends "rsync -aAXHv"; I add a lowercase x to also avoid crossing filesystem boundaries, which you might or might not want for a backup.
I am on ext4 but I don't think rsync would be any different with btrfs.