I want to write a backup script, and the pretty much unanimous recommendation I've read while searching for the best solution is to use rsync. The problem I have is that rsync doesn't seem to compress the destination. There is a -z option, but it appears to apply only to compression during the file transfer, not to the destination. The second part of the problem is that I can't simply compress the resulting directory afterwards, because I really like the approach of incremental backups that hardlink unchanged files, and those hardlinks would be destroyed by compressing and re-extracting the tree.

So, thinking about it, I thought I might create some mountable file whose filesystem would preserve the hardlinks and which could be compressed while unmounted. The remaining problem is that decompressing and recompressing around every mount/backup seems really time consuming, so some on-the-fly solution would be best.
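For reference, the hardlinked incremental scheme I mean is the usual rsync --link-dest approach, roughly like this (the paths and the "latest" symlink are just placeholders for however the snapshots are laid out):

TODAY=$(date +%F)
# files unchanged since the previous snapshot are hardlinked instead of copied
rsync -a --delete --link-dest=/backup/latest /home/user_name/ "/backup/$TODAY/"
# repoint "latest" at the snapshot that was just made
ln -snf "/backup/$TODAY" /backup/latest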
The question I wanted to ask is: is this way of doing backups practical, recommended, good, ...?
It could be summed up as: "How do I do hardlinked incremental backups with rsync while compressing the destination directory?"
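To make the compression half of that concrete: what I'm imagining is keeping the snapshots inside a filesystem image that compresses transparently, so nothing has to be unpacked and repacked around each run. A rough sketch of the idea, assuming something like btrfs and with the size and paths chosen arbitrarily:

# create a sparse image file and format it with a compressing filesystem
truncate -s 100G /srv/backup.img
mkfs.btrfs /srv/backup.img
# mount it over loop with on-the-fly zstd compression;
# hardlinks between snapshots keep working inside it
mount -o loop,compress=zstd /srv/backup.img /mnt/backup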
Any advice would be appreciated...
Something like this perhaps? Script one runs on the computer(s) to be backed up and script two runs on your backup server to compress and archive them when there are changes. It's not perfect but it works for me. You might wish to do a "true" incremental backup to save some space but I worry about one of the daisy-chained files becoming corrupted and ruining a whole string of backups.
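(If you did want the "true" incremental route, GNU tar can do it with --listed-incremental; purely as a sketch, with the snapshot-file path made up:

tar -cJf /home/backup/archive/host-level1.txz \
    --listed-incremental=/home/backup/snar/host.snar \
    /home/backup/current/host

Each run then only stores what changed since the snapshot file was last updated, which is exactly the daisy-chaining I'd rather not depend on.)

Anyway, here is script one, which runs on each client: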
#!/bin/bash
REMOTE='my_other_machine.local'
HOST_NAME=$(uname -n)
BACKUP="/home/backup/current/$HOST_NAME"
FLAGS="/home/backup/flags"
# File transfer and syncing options:
RSYNC_OPTS='-aqz --delete-after --delay-updates'
# Log of completed directory backups and completion time:
LOG_FILE='/var/log/lab-backup.log'
# Include locations (within /home) to be backed up here:
# *** THIS IS THE SECTION WHICH MUST BE CUSTOMIZED FOR EACH COMPUTER ***
# ************** INCLUDE CUSTOM LOCATIONS TO BACKUP BELOW **************
HOME_DIR="/home/user_name"
LOCATIONS='Desktop
Documents
Downloads
Pictures'
# Wait for archiving to finish if it was already in progress:
ssh $REMOTE 'while [ -e "/home/backup/flags/archiving" ] ; do
    sleep 5
done'
# Inform the backup server that a file sync is in progress:
ssh $REMOTE touch "$FLAGS/$HOST_NAME"
# Make sure that the backup location for this machine exists on the server:
ssh $REMOTE mkdir -p "$BACKUP"
# Perform backup of all listed locations in archive mode:
for LOC in $LOCATIONS ; do
    rsync $RSYNC_OPTS "$HOME_DIR/$LOC" "$REMOTE:$BACKUP"
done
# Inform the backup server that the file sync is complete:
ssh $REMOTE rm -f "$FLAGS/$HOST_NAME"
echo "$(date -u) --- Backup Complete" >> $LOG_FILE
#!/bin/bash
BACKUP_PATH='/home/backup'
LOG_FILE="$BACKUP_PATH/archive/backup.log"
TIME=$(date +%s)
# Flag that the archiver process is starting:
touch "$BACKUP_PATH/flags/archiving"
cd "$BACKUP_PATH/current"
for COMPUTER in * ; do
    # Do not start archiving if there is a file sync in progress:
    while [ -e "$BACKUP_PATH/flags/$COMPUTER" ] ; do
        sleep 5
    done
    CURRENT_FILE="$BACKUP_PATH/tmp/$COMPUTER-$TIME.txz"
    tar -cJf "$CURRENT_FILE" "$COMPUTER"
    CURRENT_HASH=$(sha256sum "$CURRENT_FILE")
    CURRENT_HASH=${CURRENT_HASH%%\ *}
    PREVIOUS_FILE="$BACKUP_PATH/recent/$COMPUTER.txz"
    if [ -e "$PREVIOUS_FILE" ] ; then
        PREVIOUS_HASH=$(sha256sum "$PREVIOUS_FILE")
        PREVIOUS_HASH=${PREVIOUS_HASH%%\ *}
    else
        PREVIOUS_HASH='0'
    fi
    if [ "$CURRENT_HASH" != "$PREVIOUS_HASH" ] ; then
        rm -f "$PREVIOUS_FILE"
        mv "$CURRENT_FILE" "$BACKUP_PATH/archive/"
        ln "$BACKUP_PATH/archive/$COMPUTER-$TIME.txz" "$PREVIOUS_FILE"
        CHANGES=true
        echo "$(date -u) --- $COMPUTER - $TIME --- $CURRENT_HASH" >> "$LOG_FILE"
    else
        rm -f "$CURRENT_FILE"
    fi
done
if [ -z "$CHANGES" ] ; then
    echo "$(date -u) --- No Changes" >> "$LOG_FILE"
fi
# Clean up flags:
rm -f "$BACKUP_PATH/flags/archiving"