My Setup & Prerequisites
I’ve got a LEMP stack: Linux (Ubuntu), Nginx, MySQL, and PHP. It’s pretty common, but this should also work just fine on a LAMP (Apache) stack.
I want to back up some databases as well as web files, including some non-WordPress stuff.
There are tools out there like BackupBuddy that might be perfectly sufficient for your needs, especially if you want access to support forums and such.
But I like to dig into server-y stuff. Doing it this way will require a few things:
- SSH access to your server
- sudo permissions (usually…)
- an Amazon S3 account
- some basic familiarity with the command line
Amazon S3
Since this setup is rather dependent on S3, make sure you go get signed up for that.
(There’s a free tier, but you’ll still need a valid credit card)
If you don’t have a bucket already, then go to your S3 admin page and create one. You may wish to note which region.
Similarly, if you don’t have a user, you’ll want to head over to the IAM admin and create one. Click Users in the sidebar, then the blue Create New Users button, enter a username, hit Create, and be sure to grab those credentials. While there I gave my user Administrator access from the permissions tab. You may wish to give the options a thorough review and choose something more restrictive that still gives you the necessary access (I’m going to go back and do this :))
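If you do go the more restrictive route, here’s a minimal sketch of a custom policy you could attach instead of Administrator access. BUCKET-NAME is a placeholder, and the exact action list is my assumption about what the sync command used later needs (list the bucket; write, read, and delete objects):
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:ListBucket"],
      "Resource": "arn:aws:s3:::BUCKET-NAME"
    },
    {
      "Effect": "Allow",
      "Action": ["s3:PutObject", "s3:GetObject", "s3:DeleteObject"],
      "Resource": "arn:aws:s3:::BUCKET-NAME/*"
    }
  ]
}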
Install the Amazon CLI tools
You’ll need to be SSH’d into your server for this part. (I’m largely reiterating this guide.)
First, check that you have python installed by running python --version
If not, please take a break from this post and get that installed.
Now let’s download the installer
wget https://s3.amazonaws.com/aws-cli/awscli-bundle.zip
Unzip it
unzip awscli-bundle.zip
And run the install command (if you don’t have sudo permissions, or don’t want to use them, see the guide linked above for an alternative)
sudo ./awscli-bundle/install -i /usr/local/aws -b /usr/local/bin/aws
That should get you where you need to be. Test the command:
aws help
If you don’t get something that looks like help, you’ll want to retrace your steps, or perhaps go back to the official guide for an alternative install method.
Otherwise, you can now configure it. You’ll need the credentials (access key and secret access key) for the user that was created earlier.
aws configure
That command will prompt you for four bits of info.
- access key
- secret access key
- default region (I went with us-west-2 since my bucket region is Oregon. Here’s a list)
- (optional) output format. I entered json, which is also the default if left blank. You can read more here
This will go into a config file in your current user’s home directory, ~/.aws/config
Note: If you’re going to be running the aws command as a different user, you’ll want to copy/create a config file for them too. More on that here
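For reference, that file looks something like this. This is just a sketch; depending on your CLI version, the two keys may end up in ~/.aws/credentials instead:
[default]
aws_access_key_id = YOUR-ACCESS-KEY
aws_secret_access_key = YOUR-SECRET-KEY
region = us-west-2
output = json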
We should be all set for AWS!
To view your buckets, try this command
aws s3 ls
As usual, hit up the docs for more info or run aws s3 help. We’re going to use the sync subcommand.
The Backups
There’s a lot of room for customization here. In this case, I just want a couple things to happen:
- back up the database(s)
- back up the web files (could be WordPress plugins, themes, uploads, and even non-WordPress stuff *gasp*)
- only maintain backups for the past few days
First, I need a place to keep the backups. For my server’s setup, /var/www/_backups/ makes sense.
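If that directory doesn’t exist yet, create it first. The chown is an assumption that you’ll run the backup script as your regular user rather than root; adjust to taste:
sudo mkdir -p /var/www/_backups
sudo chown $USER: /var/www/_backups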
Then, the commands for making the actual backups. You don’t need to run these just yet (though you can); just take note for now. (I borrowed a lot from this Lifehacker article.)
For the database, use mysqldump
THEDB="DATABASE-NAME"
THEDBUSER="DATABASE-USER"
THEDBPW="DATABASE-USER-PASSWORD"
THEDATE=`date +%d%m%y%H%M`
mysqldump -u $THEDBUSER -p${THEDBPW} $THEDB | gzip > /var/www/_backups/dbbackup_${THEDB}_${THEDATE}.bak.gz
That exports the given database, gzips it, and puts it in the backup directory.
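Before trusting the dump, it’s worth a quick sanity check that it isn’t empty. Here’s a sketch of that, plus the matching restore (which assumes the target database already exists):
# peek at the first few lines without unpacking the whole file
gunzip -c /var/www/_backups/dbbackup_${THEDB}_${THEDATE}.bak.gz | head -n 5
# restore: feed the dump back into mysql
gunzip -c /var/www/_backups/dbbackup_${THEDB}_${THEDATE}.bak.gz | mysql -u $THEDBUSER -p${THEDBPW} $THEDB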
For the web files, we can tar up the directory
THESITE="SITE-NAME"
tar czf /var/www/_backups/sitebackup_${THESITE}_${THEDATE}.tar.gz -C / var/www/$THESITE/html
This assumes your web files are in /var/www/domain.tld/html, so edit as needed for your setup. (The -C / followed by a relative path is intentional: it keeps absolute paths out of the archive.)
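To double-check what actually went into the archive, you can list its contents without extracting anything (paths will be relative, thanks to the -C / trick):
tar tzf /var/www/_backups/sitebackup_${THESITE}_${THEDATE}.tar.gz | head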
To delete older backups, we can use find with the -exec option
find /var/www/_backups/site* -mtime +5 -exec rm {} \;
find /var/www/_backups/db* -mtime +5 -exec rm {} \;
That says to find all the site* or db* prefixed files in /var/www/_backups that were last modified more than 5 days ago and remove them.
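Since -exec rm is unforgiving, you may want to preview what would match before putting this in a script. Swapping -exec rm for -print just lists the candidates:
# dry run: print what would be deleted, without deleting anything
find /var/www/_backups/site* /var/www/_backups/db* -mtime +5 -print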
Lastly, we need to sync these backups to S3
/usr/local/bin/aws s3 sync /var/www/_backups s3://BUCKET-NAME
I find using the full path to aws to be more reliable, since cron jobs often run with a minimal PATH. sync with no options will only add or update files; passing --delete will also remove files from the S3 bucket that are no longer in your backup directory. Add that if you’re concerned about using too much space.
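If you’re nervous about --delete, the s3 subcommands accept a --dryrun flag that prints what would happen without actually touching the bucket:
/usr/local/bin/aws s3 sync /var/www/_backups s3://BUCKET-NAME --delete --dryrun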
Now, we can take all these commands and wrap them up in a single script. I created backup.sh in my home directory and put it all together.
#!/bin/sh
THESITE="SITE-NAME"
THEDB="DATABASE-NAME"
THEDBUSER="DATABASE-USER"
THEDBPW="DATABASE-USER-PASSWORD"
THEDATE=`date +%d%m%y%H%M`
# export database
mysqldump -u $THEDBUSER -p${THEDBPW} $THEDB | gzip > /var/www/_backups/dbbackup_${THEDB}_${THEDATE}.bak.gz
# export files
tar czf /var/www/_backups/sitebackup_${THESITE}_${THEDATE}.tar.gz -C / var/www/$THESITE/html
# remove backups older than 5 days
find /var/www/_backups/site* -mtime +5 -exec rm {} \;
find /var/www/_backups/db* -mtime +5 -exec rm {} \;
# sync to amazon
/usr/local/bin/aws s3 sync /var/www/_backups s3://BUCKET-NAME --delete
Make that script executable with chmod +x backup.sh and test it. If you’re in your home directory and that’s where you created the script, just run
./backup.sh
If all is well, add it to your crontab. crontab -e
and append something like
0 3 * * * /path/to/backup.sh > /dev/null 2>&1
That runs every day at 3am (according to my server’s clock) and pipes any output into nothingland.
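If you’d rather keep a record than toss the output, point it at a log file instead (the log path here is just an example):
0 3 * * * /path/to/backup.sh >> /home/YOU/backup.log 2>&1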
Comments
I set up a similar script recently that makes backup copies to the root directory of my server, and then an automated process on my desktop PC pulls the files down to my local computer for safekeeping.
This bash script is amazing:
https://github.com/dlabey/Simple-Linux-Bash-Rotating-Backup-Script
I rolled my own for the files backup portion.
Nice. My next step is to create those rotating backups, so that’s a really handy link 🙂
Nice guide. It has come in handy as a good addition/redundancy to my daily backup cron.
Also, the S3 bucket region for the AWS config can be found from the ‘All Buckets’ dashboard by right-clicking the bucket name and choosing ‘Properties’. It shows a lot of useful stuff, like the bucket name, region, creation date, and owner, along with user permissions.
Is there a way we can write directly to S3 rather than syncing to S3 from disk? Something like
… | gzip | aws s3 cp - s3://s3bucket/dbbackup_${THEDB}_${THEDATE}.bak.gz