[GET] The Best Cloud Auto-Backup Solution For Interworx

I’ll make this as simple as possible. If you want the most secure and reliable way of backing up your accounts and Interworx configs, here it is…

Step 1) Sign up for SpiderOak (It’s a cloud backup solution). Their zero-knowledge encryption model is awesome and so are their software and prices. I think they even offer a few free gigs. I have a TB and it’s super cheap.

Step 2) Go to the SpiderOak download page and grab their Fedora RPM for your architecture (it will work with CentOS and the like). Upload it to your server. Don’t run rpm from the command line against their download URL, or you’ll wind up with the i386 build (which is fine if you’re on a 32-bit OS, but it’s safer to just upload the right RPM yourself).
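
If you’re not sure how to get the RPM onto the box, a plain scp from your desktop is enough (the filename and server address below are just examples; use your actual RPM name and host):

scp SpiderOak-x.x.x_x.x.rpm root@your.server.example:/root/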

Step 3) In SSH as root, navigate to where you uploaded the SpiderOak RPM and run “rpm -Uvh SpiderOak-x.x.x_x.x.rpm” (without the quotes), substituting whatever your SpiderOak RPM’s filename actually is.

Step 4) Run “SpiderOak --setup=-” (Without quotes) and follow the instructions to add your login and set your device name.

Step 5) Run the following commands:

mkdir /backups
mkdir /backups/daily
mkdir /backups/weekly
mkdir /backups/monthly
chown -R iworx:iworx /backups
su iworx
nano /backups/RemoveOldBackups.sh

Step 6) Paste the following code into nano then hit ctrl+x and agree to save the file:

#!/bin/sh
# Remove old backups
# Note: find's -mtime +N matches files whose age, in whole 24-hour periods, is greater than N,
# so +0 = older than 1 day, +6 = older than 7 days, +13 = older than 14 days.

find /backups/daily -name "*.tgz" -mtime +0 -type f -delete
find /backups/weekly -name "*.tgz" -mtime +6 -type f -delete
find /backups/monthly -name "*.tgz" -mtime +13 -type f -delete
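
If you want to sanity-check what the script will remove before trusting it with -delete, you can run the same find commands with -print instead and just look at the output (the paths match the script above):

find /backups/daily -name "*.tgz" -mtime +0 -type f -print
find /backups/weekly -name "*.tgz" -mtime +6 -type f -print
find /backups/monthly -name "*.tgz" -mtime +13 -type f -print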

Step 7) Run the following commands…

chmod +x /backups/RemoveOldBackups.sh
exit
SpiderOak -v --include-dir=/backups
SpiderOak -v --include-dir=/home/interworx/var/backups/iworxdb

Step 8) Log in to Interworx, go to the “Cron” section and add the following cron jobs under the iworx account (the iworx account is the one that’s already selected when you get there)…

Each day at midnight (0): /home/interworx/bin/backup.pex -b all -o /backups/daily --domains all --email=your@email.com --compression=9 --quiet

Each day at 4am (4): sh /backups/RemoveOldBackups.sh > /dev/null 2>&1

On the first day of each week: /home/interworx/bin/backup.pex -b all -o /backups/weekly --domains all --email=your@email.com --compression=9 --quiet

On the first day of each month: /home/interworx/bin/backup.pex -b all -o /backups/monthly --domains all --email=your@email.com --compression=9 --quiet
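
For reference, if you’d rather put these straight into the iworx user’s crontab instead of using the UI, the entries would look roughly like this. I’m assuming “first day of each week” maps to Sunday and “first day of each month” to the 1st; adjust the schedule fields to taste:

0 0 * * * /home/interworx/bin/backup.pex -b all -o /backups/daily --domains all --email=your@email.com --compression=9 --quiet
0 4 * * * sh /backups/RemoveOldBackups.sh > /dev/null 2>&1
0 0 * * 0 /home/interworx/bin/backup.pex -b all -o /backups/weekly --domains all --email=your@email.com --compression=9 --quiet
0 0 1 * * /home/interworx/bin/backup.pex -b all -o /backups/monthly --domains all --email=your@email.com --compression=9 --quiet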

Step 9) Now change to the cron jobs for the root account (The dropdown box at the top) and add the following cron jobs…

Each day at 6am (6): SpiderOak --batchmode
Each day at 11am (11): SpiderOak --purge-deleted-items=14
Each day at noon (12): SpiderOak --purge-historical-versions
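
Likewise, the raw crontab equivalents for root would look something like this if you ever need to set them up by hand:

0 6 * * * SpiderOak --batchmode
0 11 * * * SpiderOak --purge-deleted-items=14
0 12 * * * SpiderOak --purge-historical-versions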

Note:
You can change 14 to whatever you want. SpiderOak retains deleted backups in the cloud: think of them as sitting in a trash can in the cloud, even after you delete them from the server, for the number of days specified in that cron. Since you’ve already got a full month of backups you don’t really need those deleted copies, and they take up a lot of space, so you could just set it to 1 really. Up to you.

Step 10) Profit!

You can tweak those settings to whatever you like, but this is what works for me. You can also add other directories you might want to back up, and there’s a directory in /root called SpiderOak Hive that will back up anything you put into it. I’ve been using SpiderOak to back up my servers and desktops for years and it’s never failed me. It’s also great that the SpiderOak company can’t see any of your data. I hope you found this useful. Don’t let a data disaster happen to you. I’ve been there. It sucks.

I have modified the shell script to behave as it should and to be a bit stricter in its file matching (not that it takes input, but still). At first I didn’t realize that -mtime +1 actually matches files older than 2 days, not 1 as I expected. Now it will delete all backups older than a day/week/month, and all of it is done without using “rm.” Of course you will still retain 14 days of daily backups with this setup; they just won’t be hanging around on your server, they’ll be in the cloud where they belong. The idea here is to keep a decent archive of backups in the cloud but only 1 backup for each time period on the server at any given time (save for the up to 4 hours until the removal script runs).
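
If you want to convince yourself of that mtime behavior, you can fake file ages with touch and see what find actually matches (the test filenames here are just throwaway examples):

touch -d "25 hours ago" /tmp/test-old.tgz
touch -d "23 hours ago" /tmp/test-new.tgz
find /tmp -name "test-*.tgz" -mtime +0 -print   # only test-old.tgz shows up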

If you ran the previous script it didn’t hurt anything. It would have just removed the files a day or so late.

Great tutorial! Will try it when I have the time.

Hi synthetisoft

Plus 1 for excellent tutorial. Kudos to you and thanks for sharing

I think you’re brave stating “the best” :slight_smile: it’s opinionated

It might be more useful if there were an interworx plugin so siteworx users could restore themselves

It’s just a thought though

Many thanks

John

That will be awesome!

[QUOTE=d2d4j;28952]Hi synthetisoft

Plus 1 for excellent tutorial. Kudos to you and thanks for sharing

I think your brave stating the best :slight_smile: it’s opinionated

It might be more useful if there were an interworx plugin so siteworx users could restore themselves

It’s just a thought though

Many thanks

John[/QUOTE]

Of course “best” is subjective, but I do think using a private-key-encrypted cloud backup solution is best practice. I’m not sure how many of those are out there, but I know most cloud backups don’t behave that way. As for restoration, it’s a simple SpiderOak command to restore the backups to the same or a new server and then you do the import; however, I have already started coding a SpiderOak plugin that will automate everything :slight_smile:
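
To give a rough idea of the restore side: you install and set up the SpiderOak client on the target server the same way as in steps 2–4, pull the backup directory back down, and then import the .tgz through Interworx as usual. I haven’t pasted an exact command here because the flag syntax can vary by client version, but it’s along these lines; double-check with SpiderOak --help first:

SpiderOak --restore=/backups/daily   # pull the daily backup folder back out of the archive (verify syntax on your version)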

edit: Oh you want me to integrate it with siteworx. I don’t see why not. It will be a paid plugin but super cheap. Maybe I’ll integrate other cloud backup services into it as well.

Looking forward to buying it! :wink:

[attached image]

Hi synthetisoft

Haha, are you a Simpsons fan?

Just a quick question if you don’t mind

Are these full backups?

Does the backup routine cause excess load on the server? (Some of our client sites are in excess of 180GB, so the IW backup is not very useful at those sizes.)

Can the backup/restore replace only a single file, and on MySQL, can a table or record be restored? Or is it a full restore, as in importing a backup?

Many thanks

John

[QUOTE=Synthetisoft;28959]…

[/QUOTE]

Don’t forget “It will be a paid plugin but super cheap”. Mr. Burns must keep his word :wink:

[QUOTE=d2d4j;28962]Hi synthetisoft

Haha, are you a Simpson fan

Just a quick question if you don’t mind

Are these full backups

Do the backup routine cause excess loading on server (some of our client sites are in excess of 180gb, so the IW backup is not very useful on those sizes)

Can the backup/restore replace only a single file and/or on MySQL, can a table or record be restored. Or is it a full restore as in import of a backup

Many thanks

John[/QUOTE]

It’s a full backup, and as you can see the compression level is set to its highest, which will eat up some CPU. The reason it doesn’t matter much for me (and I have some very large accounts) is that my dedi has a 16-core Xeon and 24 gigs of RAM; my backup completes in something like 20 minutes. If you lower the compression level, the backup will complete faster on slower machines. Of course there’s a range of different backup levels as well, but full backups are what I wanted. I’ll make all of these variables configurable in the plugin.
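
For example, dropping the compression in the daily cron job from 9 to something lower trades a bit of disk space for a much lighter CPU hit. Same command as in step 8, just a different --compression value (I’m assuming backup.pex takes the usual gzip-style 0–9 range; pick whatever level suits your hardware):

/home/interworx/bin/backup.pex -b all -o /backups/daily --domains all --email=your@email.com --compression=2 --quiet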