Optimize your DirectAdmin backups with rSync

DirectAdmin offers great backup functions that make it possible to create a backup file for each user or hosting account. The control panel provides backup functions for the different access levels:

  • Admin level: Create system or whole server backups
  • Admin level: Create backups for all users and resellers
  • Reseller level: Create backups for all users owned by the reseller
  • User level: Create backups for different services (website, e-mail, database…)

Create complete and compact backups

While the backup files created by a user remain on the local server, the backups created by the admin or reseller can be copied to a remote backup location. Each backup file (created by the admin or reseller) contains the website files, a database dump file, the e-mail accounts (including data), the DNS records, and all other user-related settings and statistics. Using DirectAdmin’s backup/restore function, it’s very easy to move complete websites:

  1. Create a backup, store the file on your remote backup storage platform, and wait for the confirmation e-mail that your backup is ready.
  2. Log in to the new/second DirectAdmin server and restore the account from the remote location (don’t forget to change the server’s IP address).
  3. After you get the confirmation e-mail that your account is restored, just change the A records or the name servers for your domain name.

With this functionality it’s possible to move websites between servers without a second of downtime! Optionally, you can change the A records on the first server to point them to the new server’s IP address.

High server load on backup creation

The backup function is great for moving websites and also for your daily backups if the accounts are not too big. However, if you need to create several bigger backup files (e.g. 1GB each), your server will become slow until the whole backup process is finished.
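One general mitigation for this (not a DirectAdmin feature, just standard Unix tooling) is to run heavy backup steps at the lowest CPU scheduling priority with nice, so the web server is scheduled first. A minimal sketch with a placeholder command:

```shell
# Run a (placeholder) backup step at the lowest CPU priority (19),
# so interactive services keep getting CPU time first.
nice -n 19 sh -c 'echo "low-priority backup step"'
```

On Linux you can additionally lower the disk I/O priority of such a step with ionice.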

Backup plan to lower the server load

Using rsync it’s possible to create incremental backups of your whole server, but since good backup storage is often not cheap, we want to back up only the important data. The whole backup process consists of:

  • Create a weekly or monthly backup using the native DirectAdmin functions
  • Create a script that exports MySQL dump files for all databases
  • Create a script that copies all files to a remote location using rsync
  • Create a CRON job for both scripts
  • Check the backups’ integrity once a week or month

It’s up to you whether you create a monthly or weekly backup within DirectAdmin (the same goes for how often you check your backups). Run all backup tasks in a time frame when most of your visitors are offline.

Complete DirectAdmin user account backups once a week or month

Log in to DirectAdmin as the admin, follow the link “Admin Backup/Transfer” and select the users you would like to back up. For the option “CRON Schedule”, choose a moment when your visitors are offline and enter the remote backup location (EVBackup from ExaVault is a service we use) in the FTP login form. Click the submit button and your backup will be created once a month or week. You get an e-mail message when a backup is created.

Create MySQL dump files from all databases

There is a global admin account for the MySQL service on your DA server; you can find the user name and password in /usr/local/directadmin/conf/ (in the file mysql.conf).
Using this login and password, it’s possible to create MySQL dumps of all databases on your server. Create a file with the following code and name it dbbackup.sh.

#!/bin/sh

DBUSER="da_admin"
DBHOST="localhost"
DBPASS="thepassword"
BACKUPDIR="/home/admin/dbbackup"

# make sure the backup directory exists
mkdir -p "$BACKUPDIR"

# -N skips the column header, so the "Database" line is not in the output
DBS=$(mysql -u"$DBUSER" -h"$DBHOST" -p"$DBPASS" -N -e "show databases")

for DATABASE in $DBS
do
    # skip MySQL's internal schemas
    case "$DATABASE" in
        information_schema|performance_schema) continue ;;
    esac
    FILENAME="$DATABASE.gz"
    mysqldump -u"$DBUSER" -h"$DBHOST" -p"$DBPASS" "$DATABASE" | gzip --best > "$BACKUPDIR/$FILENAME"
done

Save the file in the /home/admin/ directory and create a directory called “dbbackup” in the same location. If you run this script from the command line, a gzipped MySQL dump file for each database is created inside the newly created directory.
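If you want to sanity-check the loop’s filter logic without touching a real MySQL server, you can feed it a fake “show databases” listing. The database names below are made up; the first line simulates the “Database” column header that the script must skip:

```shell
#!/bin/sh
# Simulated output of `mysql -e "show databases"` (header + two databases).
DBS="Database
mysql
userdb1"

for DATABASE in $DBS
do
    if [ "$DATABASE" != "Database" ]; then
        echo "would dump: $DATABASE"
    fi
done
# prints:
#   would dump: mysql
#   would dump: userdb1
```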

Copy your files using rsync

Before you can start using rsync, you need to set up a private key for the SSH connection. Enter the following commands on your server’s command line:

sudo ssh-keygen -f /home/admin/ssh_key -t rsa -N ''
sudo rsync -e ssh /home/admin/ssh_key.pub username@username.exavault.com:ssh_keys/key1.pub
ssh username@username.exavault.com addkeys

The commands are based on the instructions from the ExaVault website; check their tutorial for further information. Next, we need to create a script that synchronizes your files with the files on your remote backup location. Create a file named databackup.sh inside the admin’s home directory with this code:

#!/bin/sh

rsync -avz --delete --one-file-system -e "ssh -i /home/admin/ssh_key" "/home" USERNAME@USERNAME.exavault.com:backup-1
rsync -avz --delete --one-file-system -e "ssh -i /home/admin/ssh_key" "/home/admin/dbbackup" USERNAME@USERNAME.exavault.com:backup-1

The first command copies all user directories to the remote backup location and the second one copies all MySQL dump files.
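Before letting cron run these commands against the remote host, it’s worth previewing what rsync would do with --dry-run. The sketch below syncs between two temporary directories instead of the ExaVault account, so nothing is uploaded and no files are actually copied:

```shell
#!/bin/sh
# Build a tiny source tree to test the flags against.
mkdir -p /tmp/rsync-demo/src /tmp/rsync-demo/dst
echo "hello" > /tmp/rsync-demo/src/a.txt

# --dry-run lists the planned transfers without copying anything
rsync -avz --delete --dry-run /tmp/rsync-demo/src/ /tmp/rsync-demo/dst/
```

Once the file list looks right, drop --dry-run and swap the destination for your remote backup location.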

Create CRON jobs for your daily backups

The last step is to set up the CRON jobs. Log in to the console and edit root’s crontab file with: sudo crontab -e
Next, add these two CRON jobs to the list; the CRON daemon will create the MySQL dump files at 5 a.m. and run the data backup script 30 minutes later:

0 5 * * * sh /home/admin/dbbackup.sh
30 5 * * * sh /home/admin/databackup.sh

That’s all. Don’t forget to test all your backups frequently!

Published in: Web Hosting

4 Comments

  1. There is a very good script for rsync backups with error reporting and even an RSS feed of your backups. It is called synbak; google it.

    1. Thanks for sharing, but synbak is more of a backup system you would use on your “desktop”. In this article we need to run rsync as a cron job.

  2. It seems that EVBackup doesn’t accept the remote backup when I try to back up my files using port 22. Things work fine with port 21 though.

    1. I don’t think this problem is related to the backup software; I think port 22 isn’t open and you have to use port 21 for all SSH traffic.
      Great that it works for you, Francis :)

Comments are closed.