Automating Backups to Amazon S3

My Setup & Prerequisites

I’ve got a LEMP stack, that is Linux (Ubuntu), Nginx, MySQL, and PHP. It’s a pretty common setup, and this should work just fine for a LAMP (Apache) stack too.

I want to back up some databases as well as web files, including some non-WordPress stuff.

There are tools out there like BackupBuddy that might be perfectly sufficient for your needs, especially if you want access to support forums and such.

But I like to dig into server-y stuff. Doing it this way will require a few things:

  • SSH access to your server
  • sudo permissions (usually…)
  • an Amazon S3 account
  • some basic familiarity with the command line

Amazon S3

Since this setup is rather dependent on S3, make sure you’re signed up for that.
(There’s a free tier, but you’ll still need a valid credit card.)

If you don’t have a bucket already, go to your S3 admin page and create one. Take note of which region it’s in; you’ll need it when configuring the CLI.

Similarly, if you don’t have a user, head over to the IAM admin and create one: click Users in the sidebar, then the blue Create New Users button, enter a username, hit Create, and be sure to grab those credentials. While there, I gave my user Administrator access from the Permissions tab. You may wish to give the options a thorough review and choose something more restrictive that still grants the necessary access (I’m going to go back and do this :))
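
If you do scope things down, a policy limited to the one backup bucket is enough for everything in this post. Here’s a rough sketch; BUCKET-NAME is a placeholder, and this is my best guess at a minimal policy for the sync command, not something lifted from Amazon’s docs:

# Sketch of a minimal bucket-scoped policy (BUCKET-NAME is a placeholder).
# Save it somewhere, then paste/attach it via the IAM policy editor.
cat > s3-backup-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:ListBucket"],
      "Resource": "arn:aws:s3:::BUCKET-NAME"
    },
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:PutObject", "s3:DeleteObject"],
      "Resource": "arn:aws:s3:::BUCKET-NAME/*"
    }
  ]
}
EOF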

Install the Amazon CLI tools

You’ll need to be SSH’d into your server for this part. I’m mostly reiterating the official AWS install guide here.

First, check that you have Python installed by running python --version
If not, take a break from this post and get that installed.
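
On Ubuntu, installing it is typically a one-liner, though the package name has shifted between releases (python vs. python3), so adjust as needed:

sudo apt-get install python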

Now let’s download the installer

wget https://s3.amazonaws.com/aws-cli/awscli-bundle.zip

Unzip it

unzip awscli-bundle.zip

And now run the install command (if you don’t have sudo permissions, or don’t want to use them, see the official guide for an alternative)

sudo ./awscli-bundle/install -i /usr/local/aws -b /usr/local/bin/aws

That should get you where you need to be. Test the command:

aws help

If you don’t get something that looks like help output, you’ll want to retrace your steps, or perhaps go back to the official guide for an alternative install method.

Otherwise, you can now configure it. You’ll need the credentials (access key and secret access key) for the user created earlier.

aws configure

That command will prompt you for four bits of info.

  1. access key
  2. secret access key
  3. default region (I went with us-west-2 since my bucket’s region is Oregon; the AWS docs have the full list of region codes)
  4. (optional) output format. I entered json, which is also the default if left blank

This will go into a config file in your current user’s home directory, ~/.aws/config
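
For reference, the file ends up looking something like this (the values are placeholders; note that newer versions of the CLI store the two keys in a separate ~/.aws/credentials file instead):

[default]
aws_access_key_id = YOUR-ACCESS-KEY
aws_secret_access_key = YOUR-SECRET-KEY
region = us-west-2
output = json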

Note: If you’re going to be running the aws command as a different user, you’ll want to copy/create a config file for them too.
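
Something along these lines should do it (OTHER-USER is a placeholder; adjust the home directory to match your system):

# Copy the config over and make sure the other user owns it
sudo mkdir -p /home/OTHER-USER/.aws
sudo cp ~/.aws/config /home/OTHER-USER/.aws/config
sudo chown -R OTHER-USER:OTHER-USER /home/OTHER-USER/.aws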

We should be all set for AWS!

To view your buckets, try this command

aws s3 ls

As usual, hit up the docs for more info or run aws s3 help.

We’re going to use the sync subcommand.
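
A handy way to get a feel for sync before trusting it with anything: the --dryrun flag prints what would be transferred without actually doing it. For example (the paths here are placeholders):

aws s3 sync /some/local/dir s3://BUCKET-NAME --dryrun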

The Backups

There’s a lot of room for customization here. In this case, I just want a couple things to happen:

  • back up the database(s)
  • back up the web files (could be WordPress plugins, themes, uploads, and even non-WordPress stuff *gasp*)
  • only maintain backups for the past few days

First, I need a place to keep the backups. For my server’s setup, /var/www/_backups/ makes sense.
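
If that directory doesn’t exist yet, create it first. I’m assuming sudo here since /var/www is usually root-owned; chown it afterwards if your backup script won’t run as root:

sudo mkdir -p /var/www/_backups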

Then, the commands for making the actual backups. You don’t need to run these just yet, though you can; just take note for now. (I borrowed a lot from this Lifehacker article.)

For the database, mysqldump

THEDB="DATABASE-NAME"
THEDBUSER="DATABASE-USER"
THEDBPW="DATABASE-USER-PASSWORD"
THEDATE=`date +%d%m%y%H%M`

mysqldump -u $THEDBUSER -p${THEDBPW} $THEDB | gzip > /var/www/_backups/dbbackup_${THEDB}_${THEDATE}.bak.gz

That exports the given database, gzips it, and puts it in the backup directory.
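
As an aside: if you have several databases and a user with access to all of them, mysqldump’s --all-databases flag grabs everything in one shot. Something like this, where the root credentials are placeholders:

# Dump every database at once (ROOTPW is a placeholder; adjust to taste)
mysqldump -u root -p${ROOTPW} --all-databases | gzip > /var/www/_backups/dbbackup_all_${THEDATE}.bak.gz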

For the web files, we can tar up the directory

THESITE="SITE-NAME"
tar czf /var/www/_backups/sitebackup_${THESITE}_${THEDATE}.tar.gz -C / var/www/$THESITE/html

This assumes your web files are in /var/www/domain.tld/html, so edit as needed for your setup. (That space in -C / var/www/… is deliberate: tar changes to / first and archives the rest as a relative path, so files are stored without a leading slash.)
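
To sanity-check the archive, you can list its contents without extracting anything (the t flag just prints the table of contents):

tar tzf /var/www/_backups/sitebackup_${THESITE}_${THEDATE}.tar.gz | head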

To delete older backups, we can use find plus the -exec option

find /var/www/_backups/site* -mtime +5 -exec rm {} \;
find /var/www/_backups/db* -mtime +5 -exec rm {} \;

That says to find all the site*- and db*-prefixed files in /var/www/_backups that were last modified more than 5 days ago and remove them.
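
If you’re nervous about handing rm to find, swap -exec rm for -print first to preview exactly what would be deleted:

find /var/www/_backups/site* -mtime +5 -print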

Lastly, we need to sync these backups to S3

/usr/local/bin/aws s3 sync /var/www/_backups s3://BUCKET-NAME

I find using the full path to aws to be more reliable, especially when the script runs from cron with a minimal PATH. sync with no options will only add or update files; passing --delete will remove files from the S3 bucket that are no longer in your backup directory. Add that if you’re concerned about using too much space.
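
sync also takes --exclude (and --include) patterns if there’s anything in the backup directory you’d rather not ship to S3. The pattern below is purely illustrative:

/usr/local/bin/aws s3 sync /var/www/_backups s3://BUCKET-NAME --exclude "*.tmp"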

Now, we can take all these commands and wrap them up in a single script. I created backup.sh in my home directory and put it all together.

#!/bin/sh

THESITE="SITE-NAME"
THEDB="DATABASE-NAME"
THEDBUSER="DATABASE-USER"
THEDBPW="DATABASE-USER-PASSWORD"
THEDATE=`date +%d%m%y%H%M`

# export database
mysqldump -u $THEDBUSER -p${THEDBPW} $THEDB | gzip > /var/www/_backups/dbbackup_${THEDB}_${THEDATE}.bak.gz

# export files
tar czf /var/www/_backups/sitebackup_${THESITE}_${THEDATE}.tar.gz -C / var/www/$THESITE/html

# remove backups older than 5 days
find /var/www/_backups/site* -mtime +5 -exec rm {} \;
find /var/www/_backups/db* -mtime +5 -exec rm {} \;

# sync to amazon
/usr/local/bin/aws s3 sync /var/www/_backups s3://BUCKET-NAME --delete

Make that script executable with chmod +x backup.sh and test it. If you’re in your home directory and that’s where you created the script, just run

./backup.sh

If all is well, add it to your crontab: run crontab -e and append something like

0 3 * * * /path/to/backup.sh > /dev/null 2>&1

That runs every day at 3am (according to my server’s clock) and sends any output into nothingland.
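
If you’d rather keep a record than discard the output, append it to a log file instead (the log path is just a suggestion):

0 3 * * * /path/to/backup.sh >> /path/to/backup.log 2>&1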
