Technically, both of you are correct (Tim and Pascal).
The command I gave Tim was “./backup.pex --domain=domain.tld --siteworx”. This pex file does a few things, one of which is setting the PHP config file. What we had in mind was for people to run backup.pex from a cronjob, but if you wanted to see all the details, Pascal’s command would work too.
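For the cronjob route, an entry might look like this (the 2 a.m. schedule and the path to backup.pex are my assumptions, not anything stated in the thread; the flags are the ones from the command above):

```shell
# Run the SiteWorx backup nightly at 2:00 a.m. (schedule and the
# location of backup.pex are examples; adjust for your system)
0 2 * * * /path/to/backup.pex --domain=domain.tld --siteworx
```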
Be careful using Webmin on an InterWorx server; it’s easy to break something. Webmin’s backup may capture the basics of what the server needs to get running, but InterWorx will not recognize it if you have to restore from it. InterWorx stores info about each SiteWorx account in a database which Webmin would not know to back up.
After taking a look at Webmin, I agree Tim is correct. There are many settings in the InterWorx internal DBs that Webmin may not back up. Even if you did tell Webmin to back up the InterWorx DBs, it wouldn’t be as simple as restoring them. The current SiteWorx backups contain an ini file that stores account-related information to direct the restore process. Without this information it would be a very long and tedious manual restore, even if you had all the right data.
We’ve recently come up with a good solution to automating SiteWorx backups, and this feature will be added very soon.
I realize now that backup.pex needs to run as the iworx user, a detail I forgot to consider, and one that prevents SiteWorx users from adding it to their own crontabs. But, as I said in my previous post, we’ve come up with a solution and will be coding it soon.
This backs up everything: all the logs, all the configuration, all the email, and the site data. Because it’s rsync, it only copies the files which are new or changed, so I’m not doing huge copies every night, just “incremental” ones (okay, pseudo-incremental).
I am currently doing something similar. My rsync is a little different in that I include --delete, which removes files from the backup that are no longer on the disk, but I do keep 10 days’ worth of separate backups.
Here is one of the lines from my script:
rsync -ax --delete /home/ /backup/0/home
All of this is great if you need to recover a file here or database there, but if you have a dead hard drive you will have to recreate all the siteworx accounts again (emails, etc.) and then copy the data back.
I basically want to do both: have the incremental file backup and also have SiteWorx backups. I could do the SiteWorx backup once a week, and it would be used to restore the accounts, since once they are set up they shouldn’t change too much. Then I’ll have the daily file backup to restore files to the latest versions if needed.
Then I also want to create another cron to SSH-rsync the files to a remote linux box for offsite storage.
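That offsite job could be one more cron entry along these lines (the hostname, user, schedule, and remote path are placeholders of mine, not anything from the thread):

```shell
# Push the local backup tree to an offsite box over SSH at 4:30 a.m.
# -a preserves permissions/ownership, -z compresses over the wire
30 4 * * * rsync -az --delete -e ssh /backup/ backupuser@offsite.example.com:/srv/backups/server1/
```

For an unattended cron run like this, the local root (or backup) user needs passwordless key-based SSH auth to the remote box.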
My question still stands: using the command-line InterWorx backup feature, is it possible to set the backup destination directory?
I would try to figure it out myself, but backup.php (which is run by backup.pex) is protected (as it should be), so I don’t know what the other options are. Maybe you guys (Chris, Paul) could add a --help to some of the command-line PEX (PHP) files to help us out.
Now that I’ve thought about it more, the rsync may be enough.
If you get the /home directory, that includes the /home/interworx directory, which contains all the InterWorx setup info (database, config files, etc.). While this wouldn’t be very useful if you need to restore a single account, it should work in the “doomsday” situation.
I am really just guessing here, so I would like to hear from someone at InterWorx, but it does make sense (although I’m sure I have missed something).
Thanks everyone, this is exactly the type of info I need. I can wait on a scheduled backup. I manage everything on my sites from the top down and am not available to log into the system each and every day to back up via a mouse click, thus the need for a scheduled one. I ended up using Webmin to back up the data because Ensim’s backup program failed most of the time. In addition, it wrote out a file for each backup each day, which meant that if I did not clean them up very often, I had hundreds of files out there pretty quickly.
Thus any solution that avoids that would be nice. Of course any solution that makes work avoidance by me easier is always nice, but I digress.
I’m just beginning the migration process, sort of stalling to see if an Ensim script finds its way into my beta-testing hands, providing a means for that work-avoidance thingy.
If that’s the case, I could use Webmin to just back up everything at /home, and use Webmin to restore a single site if needed by restoring /home/sitename.
I’m guessing that a problem might arise if something changes about the site config that InterWorx keeps track of. In addition, what about SQL databases, etc.? Should I back up /home/interworx a few times a day for safekeeping?
I just tested that with Webmin and it only dumped 129 MB and took less than 15 seconds. Of course this is on an empty system with only one site on it right now, but taking a backup like that 4 times a day is certainly in the picture.
I’ve never used Webmin, so I don’t know how that works. But InterWorx keeps a lot of information in its own MySQL database (an instance of MySQL separate from where all the SiteWorx user databases are stored).
What I was thinking is that if you back up the entire /home directory (which is something every backup should do), you will get a copy of the InterWorx config and database, which may be able to restore the SiteWorx accounts to a new hard drive. This would NOT give you the ability to restore a single account, though; it would be an all-or-nothing kind of thing.
And even then, everything I’ve said above may not work. I have never tried it, so I cannot say whether it would work or not; I’m just trying to throw out ideas.
Webmin will back up the /home directory to a specified file name. You can specify the dump level, with 0 being a full backup, 1 being everything new or changed since the previous lower-level dump, and so on up to 9. This allows for incremental backups at practically any interval.
I take two backups to two different drives: a full backup once a week and a level-1 dump daily at midnight, all to the same file name, meaning there is always only one backup file, with a mirror on another drive. When you want to restore, you just specify the backup filename and the directory or file you want to restore, and it takes care of the rest.
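If I remember right, Webmin’s filesystem backup module is a front end to dump(8), so the weekly-full/daily-incremental scheme above corresponds roughly to these commands (the output filenames are examples of mine):

```shell
# Level 0 = full backup; -u records it in /etc/dumpdates so later
# incremental dumps know what has changed since
dump -0u -f /backup/home-full.dump /home

# Level 1 = everything new or changed since the last lower-level dump
dump -1u -f /backup/home-incr.dump /home

# Interactive restore of individual files from a dump file
restore -i -f /backup/home-full.dump
```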
Thus I was thinking that if I backed up the /home directory each night, this would capture everything for InterWorx and all of the sites, and if I backed up the /home/interworx directory, say, 4 times a day, I could keep up with any changes to InterWorx itself to within 6 hours.
I’m wondering if it’ll work too; one of the important considerations is the SQL data…
It will not work on its own: you will have your basic web and email files, but you will miss your database files, located in /var/lib/mysql, and your httpd configuration files, located in /etc/httpd and below (you need EVERYTHING in /etc/httpd/conf.d/).
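Putting that together with the earlier rsync line, a fuller script would cover those paths too. The /backup/0 destination layout is carried over from the example above, and dumping MySQL rather than copying the raw files is my own suggestion, since copying /var/lib/mysql while mysqld is running can leave the copied tables inconsistent:

```shell
#!/bin/sh
# Nightly backup covering the paths mentioned above
# (destination layout under /backup/0 is an example)
rsync -ax --delete /home/          /backup/0/home
rsync -ax --delete /etc/httpd/     /backup/0/etc-httpd
rsync -ax --delete /var/lib/mysql/ /backup/0/var-lib-mysql

# Safer for MySQL: a logical dump is consistent even while mysqld runs
# (credentials omitted; point this at the right MySQL instance,
# remembering InterWorx runs its own instance separate from the
# SiteWorx user databases)
mysqldump --all-databases > /backup/0/all-databases.sql
```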
In addition, you may have file-permission problems during the restore. InterWorx has very specific file permissions that it needs in order to work. Certain files need to be owned by the individual SiteWorx users, some by iworx, and some by root (and possibly httpd, I don’t recall exactly). I know I’ve screwed them up before.
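One mitigation worth noting here: rsync’s -a only preserves ownership when the receiving side runs as root, and --numeric-ids keeps the original UIDs/GIDs instead of remapping them through the (possibly rebuilt) passwd file on the new drive. So a restore is best done like this (paths carried over from the examples above):

```shell
# Run as root so -a can restore owners and groups; --numeric-ids keeps
# the stored UIDs/GIDs rather than matching user names on the new system
rsync -ax --numeric-ids /backup/0/home/ /home/
```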