Setting Up Automated Backups for WordPress and Other Sites to S3 with ServerPilot

ServerPilot is an automated server management solution, similar to Forge from the creators of Laravel.  These are great solutions for those of us who want to focus on building applications and programming rather than on server maintenance.  While server setup is not difficult, particularly for one or a few sites, if you have many sites on your server and many development environments with multiple SSL certificates, these tools can save a lot of time and headaches.

While this is written for ServerPilot, most of it applies regardless of what solution you use to manage your server, even if you set everything up yourself.  ServerPilot will also run on Vultr, DigitalOcean, or other VPS providers, so some details may need tailoring for your host. Additionally, while ServerPilot is often used with WordPress sites, this script will back up all databases and all files, so it will work for Drupal, ProcessWire, or anything else you have on your server.

While ServerPilot handles many aspects of managing your web sites, it doesn't provide any backup services.  And while Vultr and DigitalOcean, for example, offer server-wide backups, these are impractical for many use cases, particularly on servers hosting multiple sites for different clients.

On to the steps:

  1. ssh into your server
    ssh username@servername.com
  2. su to root on your server and cd /tmp
    su - root
    cd /tmp
  3. Now download and setup the command line tools for amazon web services (AWS)
    curl "https://s3.amazonaws.com/aws-cli/awscli-bundle.zip" -o "awscli-bundle.zip"
    unzip awscli-bundle.zip ** note that on ServerPilot-administered servers unzip is not available by default, so you will need to apt-get install unzip first
    ./awscli-bundle/install -b ~/bin/aws
  4. Now you need to configure the CLI with your keys
    aws configure ** if you don’t know how to get your keys, read this.
  5. Now you should be able to run a command like aws s3 ls (a directory listing of your S3 buckets). If you cannot, you need to give the user associated with the keys you entered during configuration the appropriate S3 permissions; a quick verification sketch follows this list.
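
If you want to double-check that the CLI installed correctly and that your keys actually have S3 access, the commands below will confirm it. This is only a sketch; the bucket name is a placeholder for your own.

    aws --version                      # confirm the CLI is installed and on your PATH
    aws sts get-caller-identity        # show which IAM user/account the configured keys belong to
    aws s3 ls                          # list buckets; this fails if the user lacks S3 list permissions
    aws s3 ls s3://your-backup-bucket  # check access to the specific bucket you plan to use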

Now we are going to set up the database backup script.  On a ServerPilot server you have, by default, a .my.cnf file which contains your root MySQL password.  This means that commands like mysqldump do not require a password when run by this user.  You can create this file manually or, better, use mysql_config_editor (MySQL 5.6+) to create a .mylogin.cnf, which stores the password in obfuscated form rather than plain text.
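
For reference, a minimal .my.cnf looks something like the sketch below; the username and password are placeholders, and the mysql_config_editor alternative follows it. Either way, mysql and mysqldump will pick the credentials up automatically when run as this user.

    # ~/.my.cnf -- keep it readable only by its owner (chmod 600 ~/.my.cnf)
    [client]
    user=mysqladminuser
    password=yourpassword

    # Or, with MySQL 5.6+, store the credentials obfuscated in ~/.mylogin.cnf instead:
    mysql_config_editor set --login-path=client --user=mysqladminuser --password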

  1. cd ~
  2. touch sitesbackup.sh && chmod 700 sitesbackup.sh && nano sitesbackup.sh
  3. Enter the following into this file (modify as needed).  For the most part this simply means changing the "ExcludeDatabases" line to include any other databases you don't want to back up.  If you only have one database you can drop all of that and just use the mysqldump command for that one database; this script assumes you have multiple sites set up, otherwise you probably wouldn't be using ServerPilot anyhow. A quick restore check follows the script.
    #!/bin/bash
    ExcludeDatabases="Database|information_schema|performance_schema|mysql"
    databases=`mysql -u mysqladminuser -e "SHOW DATABASES;" | tr -d "| " | egrep -v "$ExcludeDatabases"`
    for db in $databases; do
        echo "Dumping database: $db"
        mysqldump --add-drop-table -u mysqladminuser $db | bzip2 -c > /whereyouwantthem/backups/`date +%Y%m%d`.$db.sql.bz2
    done
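
To confirm a dump is usable, or to restore one later, you can pipe it back into mysql. The date and database name below are placeholders for one of your own dump files.

    # Restore a single database from one of the dumps (the filename is an example)
    bunzip2 -c /whereyouwantthem/backups/20240101.exampledb.sql.bz2 | mysql -u mysqladminuser exampledb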

Now we will open the script again, clean it up a bit, add the tar backup of the site files, and then copy everything off to S3.  So nano sitesbackup.sh; the entire script is below:


  1. #!/bin/bash
    ExcludeDatabases="Database|information_schema|performance_schema|mysql"
    databases=`/usr/bin/mysql -u mysqladminuser -e "SHOW DATABASES;" | tr -d "| " | /bin/egrep -v "$ExcludeDatabases"`
    BackUpDir="/whereyouwantthem/backups"
    TodaysDate=`date +%Y%m%d`

    # Setup or clear the backups directory
    if [ -d "$BackUpDir" ]; then
        rm $BackUpDir/*.bz2
    fi
    if [ ! -d "$BackUpDir" ]; then
        mkdir $BackUpDir
    fi

    # Dump all the databases into the backup directory
    for db in $databases; do
        echo "Dumping database: $db"
        /usr/bin/mysqldump --add-drop-table -u mysqladminuser $db | bzip2 -c > $BackUpDir/$TodaysDate.$db.sql.bz2
    done

    ExcludeDirectories="--exclude=/directory/to/exclude/*.jpg --exclude=/directory/to/exclude/logs"

    # This will copy all ServerPilot user accounts and all of their files for a complete backup
    tar -cvpjf $BackUpDir/$TodaysDate.allsites.tar.bz2 $ExcludeDirectories /srv/users/

    # Copy everything in the directory to S3 for offsite backup
    # Occasionally you should delete some of these on AWS to save space
    /usr/local/bin/aws s3 cp $BackUpDir/ s3://bucketname/$TodaysDate --recursive --include "*.bz2" --region your-region-1

  2. Then add this to the root (or admin user) crontab (crontab -e) to run, perhaps, weekly on Saturdays at 2am: 0 2 * * sun /root/sitesbackup.sh. A manual test run and a version with logging are sketched below.
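
Before relying on the schedule, it is worth running the script by hand once and capturing cron's output somewhere you can check later. The log path below is just a suggestion.

    # Run once manually as root to confirm the dumps, the tar archive, and the S3 upload all succeed
    /root/sitesbackup.sh

    # Crontab entry with output redirected to a log file for troubleshooting
    0 2 * * sun /root/sitesbackup.sh >> /var/log/sitesbackup.log 2>&1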

That's about it. Obviously you should tailor this for your own needs and also clean up your AWS buckets from time to time so that you are not paying for storage you don't need; a couple of commands for that follow.  I hope this is helpful for others.
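
For the cleanup, the aws CLI can list and remove old backup sets directly; the bucket name and date below are placeholders.

    aws s3 ls s3://bucketname/                      # see which dated backup sets exist
    aws s3 rm s3://bucketname/20240101 --recursive  # delete an old backup set you no longer need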
