I've made a script which backs up my multiple Subversion repos, compresses them (tar.gz or 7z) and sends them to multiple computers via rsync or scp. It might interest somebody...
Each user may have any number of repos. I have a directory "/svn" which contains symbolic links to each user's folder containing their repos. For example:
> ls /svn/
lrwxrwxrwx 1 root root 33 2007-10-03 12:26 user1 -> /home/user1/Documents/svn_repos
lrwxrwxrwx 1 root root 22 2007-11-26 14:58 user2 -> /home/user2/svn_repos/
lrwxrwxrwx 1 root root 25 2008-01-09 17:23 user3 -> /home/user3/MyFiles/Code/svn_repos/
> ls /svn/user2/
drwxrwxr-x 7 user2 svn 4.0K 2008-01-09 17:46 repo1
drwxrwxr-x 7 user2 svn 4.0K 2008-01-09 17:00 repo2
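The links themselves are nothing special; to add a new user you just create a symlink (the paths here are only examples):
# Create the /svn directory once, then add one symlink per user,
# pointing at the folder holding that user's repositories.
mkdir -p /svn
ln -s /home/user2/svn_repos /svn/user2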
The script will find each repo of each user, dump it with "svnadmin dump" to a file named after the repo with the repo's last revision as a suffix, compress the dump (using 7zip or tar.gz) and send the resulting directory structure to the different servers.
Note that I put the script in /etc/cron.daily/ to get daily backups. Because the dump files are named "<repo name>_r<last revision>.dump", you won't waste disk space: if you did not commit to a repo since the last run, the new backup gets the same name as the old one and simply overwrites it.
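For example, with 7z compression, what ends up on each server (inside the path given in servers_hosts) looks roughly like this; user names, repo names and revision numbers are of course made up:
svn_repos_op/optimusprime_svn_backup/backup_of_20080110_03h15.log
svn_repos_op/optimusprime_svn_backup/<copy of this script>
svn_repos_op/optimusprime_svn_backup/user2/repo1/repo1_r42.dump.7z
svn_repos_op/optimusprime_svn_backup/user2/repo1/repo1_r42.dump.7z.md5
svn_repos_op/optimusprime_svn_backup/user2/repo2/repo2_r7.dump.7z
svn_repos_op/optimusprime_svn_backup/user2/repo2/repo2_r7.dump.7z.md5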
Also, note that since the script runs as a cron job, and therefore as root, root's /root/.ssh/config needs to have all the options necessary to connect to "server1" and the others. To connect without a password, you should also add the /root/.ssh/id_dsa.pub of the machine running the script to /home/$me/.ssh/authorized_keys on each server.
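For example, root's /root/.ssh/config could contain an entry like this for each server (host name, user and key path are only placeholders for your own setup):
Host server1
    HostName server1.example.com
    User big_gie
    IdentityFile /root/.ssh/id_dsa
and the public key can be pushed to each server with something like:
ssh-copy-id -i /root/.ssh/id_dsa.pub big_gie@server1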
v0.3 - 2008 01 10
-Now uses rsync to save bandwidth and time. rsync needs to be installed on the servers.
-The tmp files are no longer deleted by default. So if the dump file still exists and is at the same revision the next time the script is run, the dump won't be redone, to save time.
-The compressed dump is checked with md5 to see if it is still valid; if not, it is redone.
v0.1 - 2008 01 09
-Initial release
-No rsync, only scp.
-Does everything every time. Time consuming (compression + bandwidth).
Here it is:
#!/bin/bash
# Written by Nicolas Bigaouette
# nbigaouette@gmail.com
# January 10th 2008
# v0.3
# http://bbs.archlinux.org/viewtopic.php?id=42083
# To restore :
# svnadmin load /path/to/reponame < /path/to/repo.dump
# http://wiki.archlinux.org/index.php/Subversion_backup_and_restore
now=`date +%Y%m%d_%Hh%M`
# Path containing links to folder containing repos
repos_path="/svn"
# Name of folder that will be created on remote hosts
svn_backup=optimusprime_svn_backup
# Logfile
logfile=/tmp/${svn_backup}/backup_of_${now}.log
# Main user (for permissions)
me=big_gie
local_backup=/home/${me}/fichiers/backups/svn_repos
# Compression type (tar.gz or 7z)
compression="7z"
# Sending type (scp or rsync)
sending="rsync"
# SCP options
scp_options="-C -r -p"
# Rsync options
rsync_options="-rvzthP"
# Server hosts and their respective path to send
servers_hosts=(
"$me@server1:/home/$me/fichiers/backups/svn_repos_op/"
"$me@server2:/home/$me/backups/svn_repos_op/"
"$me@server3:/home/$me/fichiers/backup/svn/"
)
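# Determine the absolute path of this script so a copy of it can be
# included in the backup (see the "cp -f $script" further down).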
if [ `printf '%c' $0` == "/" ]; then
script=`dirname $0`
else
script=`pwd`"/"`dirname $0`
fi
script="$script/`basename $0`"
backup() {
mkdir -p /tmp/${svn_backup} || return 2
echo ${now} > ${logfile}
cd ${repos_path}
for fullname in *; do
# Iteration through users
cd /tmp/${svn_backup} || return 3
username=`basename ${fullname}`
# Search for svnserve.conf files, and extract the repo name from its path
repos=`find -L ${repos_path}/${username} -name svnserve.conf | sed "s|/conf/svnserve.conf||g"`
mkdir -p ${username}
cd ${username}
echo "*********************************************************" | tee -a ${logfile}
echo "*********************************************************" | tee -a ${logfile}
echo "Backing up ${username}'s repos..." | tee -a ${logfile}
echo "*********************************************************" | tee -a ${logfile}
for repo in ${repos}; do
# Iteration through user's repo
# Get last repo revision
repo_last_rev=`svn log file://${repo} --limit 1 | sed -n 's/^r\([^ ]*\) .*$/\1/p'`
repo_basename=`basename ${repo}`
mkdir -p ${repo_basename}
cd ${repo_basename}
# Name of dump file
dump_name="${repo_basename}_r${repo_last_rev}" || return 4
# Verify if dump is to be done
if [ -s ${dump_name}.dump.*.md5 ]; then
# With --status, md5sum prints nothing; its exit status tells whether
# the existing compressed dump still matches its checksum.
if md5sum --status -c ${dump_name}.dump.*.md5; then
do_dump=0 # Md5sum is ok, don't re-dump.
else
do_dump=1 # Md5sum is bad, re-dump.
fi
else
do_dump=1 # Md5sum file is not present, re-dump.
fi
if [[ "$do_dump" == "1" ]]; then
# Dump the repo and md5sum it
echo "*********************************************************" | tee -a ${logfile}
echo " (1/2) Dumping ${username}'s ${repo_basename}..." | tee -a ${logfile}
# Extract repo info and save it to log file
#svn info file://${repo} >> ${logfile}
svn info svn+ssh://optimusprime.selfip.net${repo} | tee -a ${logfile}
# svnadmin dump writes the dump data to stdout and its progress messages to stderr
svnadmin dump ${repo} > ${dump_name}.dump 2>> ${logfile} || return 5
#touch ${dump_name}.dump | tee -a ${logfile} || return 5
chown ${me}:users ${dump_name}.dump
md5sum ${dump_name}.dump > ${dump_name}.dump.md5
echo " (2/2) Compressing ${username}'s ${repo_basename}..." | tee -a ${logfile}
if [[ "$compression" == "tar.gz" ]]; then
# Tar.gz compression
tar --remove-files -zcvf ${dump_name}.dump.tar.gz \
${dump_name}.dump ${dump_name}.dump.md5 || return 6
md5sum ${dump_name}.dump.tar.gz > ${dump_name}.dump.tar.gz.md5
elif [[ "$compression" == "7z" ]]; then
# 7zip compression
7z a ${dump_name}.dump.7z ${dump_name}.dump ${dump_name}.dump.md5
rm -f ${dump_name}.dump ${dump_name}.dump.md5
md5sum ${dump_name}.dump.7z > ${dump_name}.dump.7z.md5
fi
else
echo " ${repo_basename}: No dump needed."
fi
cd ..
done
echo "*********************************************************" | tee -a ${logfile}
echo "Done backing up ${username}'s repos..." | tee -a ${logfile}
done
echo "*********************************************************" | tee -a ${logfile}
echo "*********************************************************" | tee -a ${logfile}
# Copy of the script
cp -f $script /tmp/${svn_backup}/
# Local copy
cmd_local="cp -R /tmp/${svn_backup} ${local_backup}"
echo "Copying files locally..." | tee -a ${logfile}
echo ${cmd_local} | tee -a ${logfile}
${cmd_local}
chown -R ${me}:users ${local_backup}
return 0
}
sending_scp() {
cd /tmp || return 7
# SCP commands
for i in `seq ${#servers_hosts[*]}`; do
scp_cmd="scp ${scp_options} ${svn_backup} ${servers_hosts[i-1]}"
echo "Sending archive to ${servers_hosts}..." | tee -a ${logfile}
echo ${scp_cmd} | tee -a ${logfile}
${scp_cmd} || return 8
done
return 0
}
sending_rsync() {
cd /tmp || return 7
# Rsync commands
for i in `seq ${#servers_hosts[*]}`; do
rsync_cmd="rsync ${rsync_options} ${svn_backup} ${servers_hosts[i-1]}"
echo "Sending archive to ${servers_hosts}..." | tee -a ${logfile}
echo ${rsync_cmd} | tee -a ${logfile}
${rsync_cmd} || return 9
done
return 0
}
clean_tmp() {
rm -fr /tmp/${svn_backup} || return 10
return 0
}
# Call the backup function
backup
# Check for errors
return_value=$?
if [[ "${return_value}" != 0 ]]; then
echo "ERROR: The backup failed at ${return_value}"
fi
# Send the files
if [[ "$sending" == "scp" ]]; then
sending_scp
elif [[ "$sending" == "rsync" ]]; then
sending_rsync
fi
# Check for errors
return_value=$?
if [[ "${return_value}" != 0 ]]; then
echo "ERROR: The backup failed at ${return_value}"
else
echo "Backup of subversion repos successful"
# Clean tmp
#clean_tmp
fi
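To install it as the daily cron job mentioned above, something like this should do (the file name svn_backup is only an example):
cp svn_backup /etc/cron.daily/svn_backup
chmod +x /etc/cron.daily/svn_backup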