Remote backup tool READY!

This:

Should be:

echo "1 -> 2"
rsync -a --delete /backup/1/* /backup/2
 
echo "0 -> 1"
rsync -a --delete /backup/0/* /backup/1
 
rsync -rLptgoD --delete source /backup/0

Right?

I read in this post: http://interworx.com/forums/showthread.php?t=1047&highlight=backup

That the structure backup does not back up databases.
You need to use the --database option to back up only the databases:
http://interworx.com/forums/showthread.php?t=1405&highlight=backup+database

So I have to write a script that first remotely rsyncs /home, then does the structure backup, and then the database backup. It’s going to be fun :wink:

Too bad that this structure backup is not available at the moment :frowning:

That’s right, thanx for pointing that out!

Right, it does not back up user databases, just SiteWorx account info (the Iworx database).
But what I do now (and will do with the struct backup) is to back up the /var/mysql directory.
So I can restore all the database info by copying the DB files back into /var/mysql.

Now the downside to that is if you have a crash and move to a new system with a different version of MySQL you may not be able to open the /var/mysql files in the new version.
I’m not sure how Iworx backs up the database, but if it’s more of an export of all the commands (a dump or whatever) needed to rebuild the database, that would be better for moving from one version to a newer one.
Iworx guys can you confirm how the Iworx user database backup works?

But worst-case scenario, using /var/mysql you just load the files on a testbed with the same MySQL version and dump them to text files, then use those to import the data into the new system. The most important thing is to have the data backed up in some form :slight_smile:

Yeah, but my guess is that it will be out in the next 4-6 weeks just based on my own speculation from things I’ve picked up around the forum.

Well, as far as the DB backup goes… I could use mysqldump to just dump all the DBs as SQL files.
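A rough sketch of what I mean (the output path, the dated directory name, and the information_schema exclusion are my own guesses, and it assumes mysql/mysqldump can log in via ~/.my.cnf — not anything Iworx does):

```shell
#!/bin/sh
# Dump every database to its own dated .sql file.
# OUT is a demo path so this is safe to try anywhere; in practice you would
# point it at the real backup mount.
OUT=${OUT:-/tmp/mysqldumps-$(date +%F)}
mkdir -p "$OUT"
# List databases, skipping MySQL's internal schema; fall back to an empty
# list if the server is unreachable.
dbs=$(mysql -N -B -e 'SHOW DATABASES' 2>/dev/null | grep -v '^information_schema$') || dbs=""
for db in $dbs; do
    mysqldump --opt "$db" > "$OUT/$db.sql"
done
```

The nice part is that each dump can be replayed with `mysql dbname < dbname.sql`, which also sidesteps the version-mismatch problem of copying /var/mysql raw.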

Or I can download sysbackup from R-F-X Networks and let that handle the database backup. (That’s what I do with our Ensim servers).

Maybe I should just try the database only backup from Iworx to figure out what kind of backup files we will get :rolleyes:

So I’ll be all set when that structure backup is out :cool:

One more question.

I’m getting these errors when I rsync:

symlink has no referent: /home/siteuser/something

And things like this

rsync: readlink "/home/interworx/interworx/interworx/interworx/interworx/interworx/interworx/interworx/interworx/interworx/interworx/interworx/interworx/interworx/interworx/interworx/interworx/interworx/interworx/interworx/interworx/interworx/interworx/interworx/interworx/interworx/interworx/interworx/interworx/interworx/interworx/interworx/interworx/interworx/interworx/interworx/interworx/interworx/interworx/interworx/cron/rrd.pex" failed: Too many levels of symbolic links (40)

This is the command I use:
rsync -rLptgoDv -e "ssh -i /root/.ssh/backup-rsync-key" /home/* backup@10.10.10.1:/backup/rsync/1/cpserver/home

Don’t know if that’s a problem.

What do you do when you have to restore an account?

  1. Create the account?
  2. Create DB users?
  3. Import DBs?
  4. Create mailboxes?
  5. Rsync the data back?

And then for 5, do you Rsync it back like this:

rsync -rLptgoDv -e "ssh -i /root/.ssh/backup-rsync-key" backup@10.10.10.1:/backup/rsync/1/cpserver/home/siteuser/* /home/siteuser/

?

About my backup, I want to inform you that I never added an option for rsync… this is a full backup system.
After years of working with hosting, I find the best way to make backups is just to make a 100% full backup.

I have powerful servers, and it usually takes just 2 hours for 250 sites, with a load of about 1.8.
That is a low load.
I run backups in the middle of the night; the little extra load created by the backup is compensated by the low activity on the server.

Why should I use rsync when I have:

FASTER CPU SERVERS
BIG HDs ON SERVERS
NO METERED NETWORK

These 3 points make me work with full backups.

WebXtrA, please explain what feature you need, and I will try to add it.

  • Unnecessary wear and tear on hardware
  • Extremely inefficient

Having a big hard drive makes no difference; actually, since you are TGZ’ing, you would need a smaller drive. The network only matters if you are using a remote backup, and whether or not it is unmetered, 100 Mbit/sec is pretty slow for backups, so you would want to transfer as few bits as possible.

Of course, every person does what they feel is best for them, so what’s good for me might not be good for you.

This happens if you have a symlink that points nowhere. SSH in and look at the “something” link; that is most likely the problem. It’s basically like having a Windows shortcut to nowhere.
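You can hunt those down before rsync runs. Here is a self-contained sketch using GNU find’s `-xtype l` test; the scratch directory is just so the example runs anywhere — on the server you’d point it at /home:

```shell
# Set up a scratch dir with one good and one dangling symlink for the demo.
DEMO=$(mktemp -d)
ln -s /tmp "$DEMO/ok"                    # resolves fine
ln -s /nonexistent/target "$DEMO/broken" # points nowhere
# -xtype l matches only symlinks whose target does not resolve:
find "$DEMO" -xtype l                    # prints only .../broken
```

On the real box, `find /home -xtype l` lists every dangling link, and adding `-delete` removes them if you decide they are junk.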

Never seen this one before, but maybe you have a symlink pointing back to itself (or into its own parent directory)? Since -L makes rsync follow links, it would get caught in a loop.

I never had to do a complete restore of an account using an rsync backup, but your list looks right. You would basically have to restore all the SiteWorx parts of the account before restoring the data.

The only thing I’m not 100% sure how I would handle is symlinks. I use the big L because I want to make sure I have the actual data if the symlink is pointing to a directory on the hard drive I’m not backing up. I’m not exactly sure how rsync handles this, but my main concern is having the data; everything else can be reconfigured/rebuilt (at the cost of time). So when you restore, you will be restoring the actual file, and you may have to manually move that file to its original location and re-symlink it. I’ll have to do some tests and see how rsync handles that.

Justec, sorry, but in my case 100% full backups are the best.
Backup and restore are 100% reliable and automatic.
My web hosting company is 100% automated, including orders, activations, payments, suspending accounts, deleting accounts, enabling accounts, upgrading accounts, etc.

Some time ago I used rsync backups, but I had to check all the time which sites had been automatically deleted from the servers (non-paying customers) in order to delete those sites from the backups!! And believe me, when you have a high number of servers with 250 sites per server, it is a really big nightmare to check every day which sites should be deleted from the rsync backup.

I have multiple servers; I don’t host 100 or 200 sites on one server… I have lots of servers with 250 sites per server.

Sorry, but in my case the best backup is a 100% full backup.

When running the backup, on rare occasions the CPU load goes to 1.8; usually it is 0.9 to 1.2 while the backup runs, and it causes no trouble. I have used this backup scheme for the last 6 months on InterWorx with zero trouble, and on Ensim servers for the last 4 years.

In some situations I think rsync can be used, especially when the volume of data to back up is very, very high…
For me, full backups are perfect : )

For people without fast servers, with restricted bandwidth, little storage space, or a need to keep 7 days of backups or more… yes, full backups are not useful and rsync is better.

Best regards.

Hi Dj-Grobe,

Thanks for your reply.

Don’t get me wrong, I prefer a full native Iworx backup over the solution I’m using now.

We’ve got an InterWorx cluster with only AMD Opteron 165 or dual Xeon 2.8 GHz (or better) systems, each with at least 2 GB of RAM.

We have to create daily external backups (on a different server than the ones the sites are hosted on) due to a certification requirement of our company.

With full backups it takes about 10-12 hours to back up 500 sites (50 GB or more) to the external server. In my opinion it takes so long because the backups have to be tgz’ed.

So what I would like to do is to use all the servers within the cluster to make backups (like 200 sites per server), so that they all do a bit.

I tried to do this with the native InterWorx backup.pex solution, but ran into the problem that it doesn’t make the backups: it cannot connect to the database (it tries to connect to localhost when it should connect to the Cluster Manager or the dedicated database server), and therefore it doesn’t back up anything and gives an error.

I would be grateful if you could implement a solution so that I can create backups on the nodes.

Thanks,

Ramon

I understand… and like I said, in your case the solution is ONLY RSYNC.
That is what I said in my post: rsync is the only solution when you have a big backup size.

Full backups are OK just for normal hosting services, where you usually have no more than 5 GB of backups for the complete server.

Did you try opening a ticket with Iworx for this?

The trouble is a fault in the native backup feature.
I think the Iworx guys should fix this.

Best regards.

I opened a ticket for this, but I’ll have to wait for the structure-only backup.

What I do now is:

  1. Rsync /home to an external backup server
  2. Create database backups with the native Iworx backup
  3. Create Mail backups with the native Iworx backup
  4. Run an rsync backup rotation script on the backup server, so I can go back multiple days.
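For what it’s worth, a rotation like that can be sketched with plain mv on the backup server, which is cheaper than rsync-ing between generations when they all sit on one filesystem. ROOT and KEEP are demo values, not anything from this thread:

```shell
#!/bin/sh
# Shift generation n-1 to n, oldest first, then leave an empty 0
# as the target for tonight's rsync pull.
ROOT=${ROOT:-/tmp/rotate-demo}   # demo path; something like /backup/rsync in practice
KEEP=7                           # number of daily generations to retain
mkdir -p "$ROOT/0"
rm -rf "$ROOT/$KEEP"             # drop the oldest generation entirely
n=$KEEP
while [ "$n" -gt 0 ]; do
    prev=$((n - 1))
    if [ -d "$ROOT/$prev" ]; then
        mv "$ROOT/$prev" "$ROOT/$n"
    fi
    n=$prev
done
mkdir -p "$ROOT/0"               # fresh target for the nightly rsync
```

Each mv target has already been vacated by the previous one, so nothing is clobbered, and the whole rotation is near-instant regardless of data size.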

When I need to restore an account I do it like this:

  1. Create the account.
  2. Import the database and mail backup.
  3. rsync everything in the html dir.
  4. chown everything in “html” to the siteworx user.
  5. Verify with the customer which subdomain he/she had and recreate them.
  6. Verify with the customer what kind of crons he/she had and recreate them. (don’t know if structure-only makes a backup of this)
  7. Change permissions for dir’s and files if needed.
  8. Recreate forwarders, mail groups and aliases (don’t know if structure-only makes a backup of this)

Steps 1 and 5 (and 6 and 8) will be skipped when the structure-only backup is there.
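Step 4 above is just a recursive chown. A tiny sketch, with everything made up for illustration (the username would be the siteworx account’s system user, i.e. the first 8 characters of the domain; the demo chowns to the current user so it runs unprivileged):

```shell
# Scratch docroot standing in for /home/<siteuser>/<domain>/html
DOCROOT=$(mktemp -d)/html
mkdir -p "$DOCROOT"
touch "$DOCROOT/index.html"
# On the real box this would be: chown -R siteuser:siteuser "$DOCROOT"
chown -R "$(id -un):$(id -gn)" "$DOCROOT"
ls -ld "$DOCROOT"
```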

EDIT: Forgot some things.

The iworx user does not own the contents of the html directory, the siteworx account’s unique system user does (first 8 characters of domain name)

That’s what I meant :rolleyes: , changed it in my post now. Thanks for correcting.:slight_smile:

By the way, regarding:

  1. Verify with the customer what kind of crons he/she had and recreate them. (don’t know if structure-only makes a backup of this)
  2. Recreate forwarders, mail groups and aliases (don’t know if structure-only makes a backup of this)

Does the structure-only backup make a backup of these?

This cronjob doesn’t seem to work:

root /root/.bk/remotebackup > /dev/null/ 2>&1

What is this portion for:
> /dev/null/ 2>&1

and why root before the path? Doesn’t cron normally run as root?

Works just doing /root/.bk/remotebackup
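The difference is the crontab format, not /dev/null. The system table /etc/crontab takes a user column between the schedule and the command; a personal table installed with `crontab -e` does not, which is why the `root` prefix breaks there. The trailing slash in `/dev/null/` is a second bug. The schedule fields below are made-up examples:

```shell
# In /etc/crontab (system-wide table) a user column is required:
#   0 3 * * * root /root/.bk/remotebackup > /dev/null 2>&1
# In root's own crontab (crontab -e) there is NO user column:
#   0 3 * * * /root/.bk/remotebackup > /dev/null 2>&1
# "> /dev/null 2>&1" throws away stdout and stderr so cron doesn't mail the output.

# The trailing slash in "/dev/null/" names a directory, so the redirection fails:
sh -c 'echo hello > /dev/null 2>&1' && echo "no slash: silenced OK"
sh -c 'echo hello > /dev/null/ 2>&1' 2>/dev/null || echo "trailing slash: redirection fails"
```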

Thanks,

If it works using /root/.bk/remotebackup, you can use it that way.
Sorry for the late reply.

That doesn’t seem to work either.

Why doesn’t the cron you suggest work? Is it because /dev/null isn’t set up?

I’ve been having to run these backups manually.

I would love to use this script; unfortunately I don’t use the default SSH port on my servers. Any chance you will be adding the ability to set a port anytime soon?

Hello ctalavera,

You can already open and close ports via the firewall link in NodeWorx.

Actually, I meant setting a port in the script to connect to. Right now you can only set an IP address, so it assumes you are connecting to port 22 on the server you are backing up to. I use different ports for SSH on my servers.
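Until the script grows a port option, the usual workaround is to put the port inside the -e value, since rsync hands that whole string to ssh. This sketch only prints the command it would run (the port number and paths are made-up values from earlier in the thread):

```shell
#!/bin/sh
PORT=2222                                        # hypothetical nonstandard SSH port
RSYNC_SSH="ssh -p $PORT -i /root/.ssh/backup-rsync-key"
# Echoed rather than executed so the sketch is safe to run anywhere;
# drop the echo to perform the real transfer.
echo "rsync -rLptgoDv -e '$RSYNC_SSH' /home/ backup@10.10.10.1:/backup/rsync/1/cpserver/home"
```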

If you are asking whether the new version of Iworx will be able to change the SSH port to something other than 22, the answer is no. HOWEVER, there are no problems with changing the port manually as long as you open the new port in the firewall; just note that the ports listed on the main NodeWorx screen are hard-coded, so they won’t be updated.

We may at some point make it possible to change default ports on services through the software itself but would need to write it so that the firewall was automatically updated as well to keep people from being locked out of their boxes.

EDIT: nevermind, I guess you weren’t asking this at all.