Howto: Setup S3 Backups

[SIZE="1"]Install the S3 Tools[/SIZE]
Choose the repository you need. I'm going with RHEL 6, since that's what's running under the hood. Check the list here.

cd /etc/yum.repos.d
wget <insert your repo URL here>
yum install s3cmd

After that you'll need to run the bouncing-ball interactive configuration. Make sure you have your AWS access key and secret key handy:

s3cmd --configure
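The wizard writes your settings to ~/.s3cfg. A minimal version looks roughly like this (the key values are placeholders, not real credentials):

```
[default]
access_key = YOUR_ACCESS_KEY
secret_key = YOUR_SECRET_KEY
```

You can then verify the setup with s3cmd ls, which should list your buckets.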

Once configured, we need a server backup script. I put mine in /usr/sbin/FullBackuptoS3:


#!/bin/bash
LOGFILE=/var/log/FullBackuptoS3.log  # log location is an assumption; adjust to taste
echo "$(date)" >> "$LOGFILE"
/home/interworx/bin/backup.pex --domains all --filename-format %Y-%m-%D --quiet --output-dir /var/backup
s3cmd sync /var/backup s3://sitebackup/ >> "$LOGFILE"

What that does is dump all the site backups to /var/backup and then sync that directory with the S3 bucket of your choosing. You don't need to use the full S3 URI; just your bucket name will do, and the S3 tools translate it.

Now make the script executable:

chmod +x /usr/sbin/FullBackuptoS3

Then create a cron job in the GUI or on the command line.
I set mine up in the GUI and simply used

/usr/sbin/FullBackuptoS3

as the command, and it's working great.
If you need to brush up on cron, here's the documentation.
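If you prefer the command line, the whole schedule boils down to one crontab entry. A sketch, assuming a nightly 2 a.m. run and a hypothetical log path (adjust both to taste):

```
# Run the full S3 backup every night at 2:00 a.m.
0 2 * * * /usr/sbin/FullBackuptoS3 >> /var/log/FullBackuptoS3.cron.log 2>&1
```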

Very cool Ryan, thanks for posting this tutorial!

Building on Ryan’s tutorial, I’ve created a bash script to reduce the overhead of creating new backups if you already run regular local backups - it does the following:

  • Searches for all iworx-backup folders and caches the list in a file (if you add new backup locations, simply delete the generated file and it will be recreated on the next run)
  • copies the latest backup from each folder (any backup modified less than 24 hours ago) to a temp folder
  • syncs the temp folder to the S3 bucket
  • cleans up the temp files
  • emails the log to the admin
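The steps above can be sketched in bash roughly as follows. This is a hypothetical outline, not the attached script: the backup root, cache file, staging folder, bucket name, and admin address are all assumptions you'd adjust for your own server.

```shell
#!/usr/bin/env bash
# Sketch of the backup flow described above; all paths/names are assumptions.
set -euo pipefail

BACKUP_ROOT="${BACKUP_ROOT:-/home}"                            # where iworx-backup dirs live
FOLDER_LIST="${FOLDER_LIST:-/root/iworx-backup-folders.txt}"   # cached folder list
STAGING="${STAGING:-/tmp/s3-staging}"                          # temp folder for fresh backups
BUCKET="${BUCKET:-s3://sitebackup/}"                           # hypothetical bucket
ADMIN_EMAIL="${ADMIN_EMAIL:-admin@example.com}"                # hypothetical address
LOGFILE="${LOGFILE:-/var/log/BackupToS3.log}"

# Step 1: find all iworx-backup folders, caching the list in a file;
# delete the cache file to force a re-scan after adding new backup locations.
find_backup_folders() {
  [[ -f "$FOLDER_LIST" ]] || find "$BACKUP_ROOT" -type d -name iworx-backup > "$FOLDER_LIST"
  cat "$FOLDER_LIST"
}

# Step 2: copy backups modified in the last 24 hours into the staging folder.
stage_recent_backups() {
  mkdir -p "$STAGING"
  while IFS= read -r dir; do
    find "$dir" -maxdepth 1 -type f -mtime -1 -exec cp {} "$STAGING/" \;
  done < <(find_backup_folders)
}

main() {
  stage_recent_backups
  s3cmd sync "$STAGING/" "$BUCKET" >> "$LOGFILE" 2>&1   # step 3: sync to S3
  rm -rf "$STAGING"                                     # step 4: clean up temp files
  mail -s "S3 backup log" "$ADMIN_EMAIL" < "$LOGFILE"   # step 5: email log to admin
}

# Run the full flow only when invoked with "run", so the helper
# functions can be sourced and exercised on their own.
if [[ "${1:-}" == "run" ]]; then
  main
fi
```

Keeping the folder scan cached in a file mirrors the script's behavior above: the expensive find over the whole server runs once, and a stale cache is fixed by deleting one file.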

It works well for me; I'm sharing it in case someone else finds it useful (it's attached, as the forum won't allow me to post the whole script).

BackupToS3.txt (1.45 KB)