My Apache and MySQL backup setup

Having witnessed the carnage of not having adequate backup strategies, I've more than recognised the importance of simple and reliable backups. There are no backup products (as such) in use here and no data deduplication, just some bash scripts, a cloud storage provider and rclone. The end result is a local and remote copy of tarred and gzipped backups, stored by day, by week and by month, and retained for a definable period. The backups contain the web folders, the Apache virtual host configs and the MySQL dumps.

Here's the setup:

The cloud storage: Backblaze

Backblaze provides relatively cheap storage and 10 GB free for new accounts. That's actually enough for my needs for now, but Backblaze is a big provider and I'd have no worries about sending money their way.

Anyway, the first task is to create an account and a new bucket. Make a note of the account ID and application key as they're needed for the rclone setup below.

Install rclone

Install rclone as per the docs:

curl -O https://downloads.rclone.org/rclone-current-linux-amd64.zip
unzip rclone-current-linux-amd64.zip
cd rclone-*-linux-amd64

sudo cp rclone /usr/bin/
sudo chown root:root /usr/bin/rclone
sudo chmod 755 /usr/bin/rclone

sudo mkdir -p /usr/local/share/man/man1
sudo cp rclone.1 /usr/local/share/man/man1/
sudo mandb 

rclone config #create the config for backblaze
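
For reference, the B2 entry that ends up in rclone.conf looks something like this (backblaze is just the remote name I'll use in the examples below; the account and key values come from the Backblaze keys page):

[backblaze]
type = b2
account = your_account_or_key_id
key = your_application_key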

One extra step from me: the backup script will run as root to ensure there are no permission issues, so we need to copy the rclone config from your user's home directory to root's (assuming you didn't run the setup with sudo):

sudo mkdir -p /root/.config/rclone
sudo cp /home/[YOU]/.config/rclone/rclone.conf /root/.config/rclone/rclone.conf
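
A quick sanity check that root can actually reach the remote (remote name assumed to be backblaze, as per the config above) is to list the buckets:

sudo rclone lsd backblaze: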

Apache and MySQL backup script

The Apache and MySQL backup script is a bash script. It does the following:

  • Backs up each database into its own dumped file, putting them in $backupdir/mysql
  • Backs up each website from /var/www into its own tarred file, putting it in $backupdir/www
  • Backs up all the virtual host configs, putting them in $backupdir/virtual_hosts
  • Tars all of the above into one final archive and encrypts it with GPG, leaving the encrypted file in $backuproot

The config for the script is defined in variables at the top of the script.

Here's the script:

#!/bin/bash

dbuser="dbrootuser"
dbpassword="dbrootpass"
backuproot="/backups/incoming/"
backupdir="${backuproot}backups$(date +%Y%m%d)"
webdir="/var/www/"
virtualhostdir="/etc/apache2/sites-available/"

mkdir -p $backupdir/{mysql,www,virtual_hosts}

# MySQL backup
databases=`mysql -u $dbuser -p$dbpassword -e "SHOW DATABASES;" | tr -d "| " | grep -v Database`

for db in $databases; do
    if [[ "$db" != "information_schema" ]] && [[ "$db" != "performance_schema" ]] && [[ "$db" != _* ]] ; then
        echo "Dumping database: $db"
        mysqldump -u $dbuser -p$dbpassword --databases $db > $backupdir/mysql/$db.sql
        gzip $backupdir/mysql/$db.sql
    fi
done

# Web files backup
webfiles=$(find "$webdir" -maxdepth 1 -type d)

for location in $webfiles; do
    if [[ $location != $webdir ]]; then
        site=${location#$webdir}
        echo "Backing up: $site"
        tar --exclude="$location/logs" -zcf $backupdir/www/${site}.tar.gz $location
    fi
done

#Virtual hosts backup
cp -r $virtualhostdir* $backupdir/virtual_hosts

#tar backup dir and remove it
tar --remove-files -zcf $backupdir.tar.gz $backupdir

#encrypt tar
gpg --encrypt --recipient 'Your encryption key' $backupdir.tar.gz
rm $backupdir.tar.gz
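
Restoring is just the reverse: decrypt with the corresponding private key, then untar. Roughly like this (the filename is an example of what the script produces):

gpg --output backups20200601.tar.gz --decrypt backups20200601.tar.gz.gpg
tar -xzf backups20200601.tar.gz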

Now this script can be run as is and, since the backup directory is date-stamped by day, it would generate up to one backup per day. But the real fun is in the backup rotation.

Backup rotation script

The backup rotation script is a separate script mainly based on this.

The backup rotation script is responsible for:

  • Running the backup script above
  • Creating the relevant destination_folder which is either monthly, weekly or daily, depending on the current day of the month and the day of the week
  • Emailing an error message if the source folder is empty
  • Moving the backup to its final destination away from incoming dir where it is currently (see config above)
  • Deleting old backups
  • Using rclone to sync the backups to Backblaze. The advantage of the sync command is that it mirrors local with remote, so remote will delete the old backups too!

Here's the script:

#!/bin/bash

/path/to/backup/script.sh

backuproot="/backups"
source_folder=$backuproot/incoming

current_month_day=`date +"%d"`
current_week_day=`date +"%u"`

destination_folder=`date +"%d-%m-%Y"`

if [ -z "$(ls -A $source_folder)" ]; then
   echo "backup failed" | mail -s "backup failed" you@email.com
fi

#on first day of month
if [ "$current_month_day" -eq 1 ]; then
    final_destination=$backuproot/monthly/$destination_folder
else
    #saturdays
    if [ "$current_week_day" -eq 6 ]; then
        final_destination=$backuproot/weekly/$destination_folder
    else
        #any normal day
        final_destination=$backuproot/daily/$destination_folder
    fi
fi

mkdir -p $final_destination
mv -v $source_folder/* $final_destination

#keep daily backups for 14 days
find $backuproot/daily/ -maxdepth 1 -mtime +14 -type d -exec rm -rv {} \;

#keep weekly backups for 60 days
find $backuproot/weekly/ -maxdepth 1 -mtime +60 -type d -exec rm -rv {} \;

#keep monthly backups for 300 days
find $backuproot/monthly/ -maxdepth 1 -mtime +300 -type d -exec rm -rv {} \;

rclone sync $backuproot backblaze:bucket
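
After it's been running for a while, the local structure (and, thanks to the sync, the remote one) ends up looking roughly like this, with dates invented for illustration:

/backups
├── daily
│   ├── 02-06-2020
│   │   └── backups20200602.tar.gz.gpg
│   └── 03-06-2020
│       └── backups20200603.tar.gz.gpg
├── weekly
│   └── 30-05-2020
│       └── backups20200530.tar.gz.gpg
├── monthly
│   └── 01-05-2020
│       └── backups20200501.tar.gz.gpg
└── incoming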

Cron it

As mentioned, we run the backup rotation script as root to avoid any permission errors.

sudo crontab -e

Add:

0 5 * * * /path/to/backup_rotation.sh

This will run the backup rotation script daily at 5AM.
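
If you'd rather keep a log of what the scripts print (handy when chasing failures), redirect the output in the cron line, for example:

0 5 * * * /path/to/backup_rotation.sh >> /var/log/backup_rotation.log 2>&1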

Things for the future

There are a lot of additional things I'd like to add to this in the future, including:

  • Getting that DB password out of the script, probably using environment variables (rough sketch below)
  • More failsafes/alerts for backup generation failure
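
On the first point, a minimal sketch of one way to do it (the file path and variable names here are placeholders I've made up): keep the credentials in a root-only file and source it from the backup script instead of hardcoding the values.

# /root/.backup_env -- chmod 600 so only root can read it
export DB_BACKUP_USER="dbrootuser"
export DB_BACKUP_PASSWORD="changeme"

Then at the top of the backup script:

# pull the DB credentials in from the environment file
. /root/.backup_env
dbuser="$DB_BACKUP_USER"
dbpassword="$DB_BACKUP_PASSWORD"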