Batch job to back up SiteWorx accounts to a remote server

Hello,

I'd like to share what I've set up to back up all SiteWorx accounts.

I have a NAS account with SAGO to store all my backups.

So the idea is:

  • store all SiteWorx backup files in this space (the remote server);
  • create a cron job that uses the iworx backup command;
  • keep only the latest backup files, not all the previous ones.

The problem is that you can't access this NAS space over SSH, so you can't run a script there to delete the previous backup files.

The solution I found is to use SFTP to run commands such as "rm".

So we first connect to the remote server with SFTP to delete the previous backup files, then use the iworx backup command with its scp option to transfer the new backup files to the remote server.

There is one more problem: SFTP can read its commands from a file in batch mode, but you can't give it a clear-text password. You have to use a public/private key pair to log in to the remote server.

So, we will have to:
1- Create a DSA key
2- Copy the public key to the remote server
3- Create a file with the SFTP commands to run in batch mode
4- Create the transfer.ini file for the iworx backup command
5- Create the final script that calls the SFTP and iworx backup commands
6- Create the cron job

1- Create the DSA key

Log in as root over SSH on your system.

mkdir /root/key
chmod 700 /root/key
ssh-keygen -t dsa -b 2048 -f /root/key/host

Replace "host" with the actual hostname of your box.
When it asks you for the passphrase, leave it empty and press Enter twice.

2- Copy the public key to the remote server

On your local server, do:

chmod 600 /root/key/*
cd /root/key
scp -P 22 host.pub user@hostname.tld:~

Replace "host" with the hostname of your box; it has to be the same as in step 1.

Now we have to create a directory on the remote server.

Login using your favourite FTP client.
Make sure you set it such that you can view hidden files.

  • Create a directory named .ssh
  • Move host.pub into the newly created directory
  • Rename "host.pub" to "authorized_keys"
  • Set it to mode 600.

If you have multiple systems, you need to merge all your public keys into one file before uploading it to your remote box. You can do so on another machine with the following command:

cat hostname.pub >> merged.pub

Then rename merged.pub to "authorized_keys".
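The whole merge can be done in one go. A small self-contained demo (the key material and filenames below are fake placeholders; in real use you would cat the actual host*.pub files generated in step 1):

```shell
#!/bin/sh
# Demo: merge several per-host public keys into one authorized_keys file.
# box1.pub / box2.pub and their contents are fake examples.
mkdir -p /tmp/keymerge && cd /tmp/keymerge
printf 'ssh-dss AAAAB3...fake1 root@box1\n' > box1.pub
printf 'ssh-dss AAAAB3...fake2 root@box2\n' > box2.pub

cat *.pub > authorized_keys      # one public key per line
chmod 600 authorized_keys        # the mode the remote sshd expects
wc -l < authorized_keys          # sanity-check: should equal the key count
```

The resulting authorized_keys is what you upload to the .ssh directory on the remote box.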

Now your system can connect over SFTP/SCP to the remote server without being asked for a passphrase or a password.

Test it by running:

sftp -o IdentityFile=/root/key/host user@hostname.domain.tld

You should be connected without entering a password or passphrase.

3- Create a file with the SFTP commands to run in batch mode

On your local server, do:

touch /root/sftpcmd
vi /root/sftpcmd

Enter these two lines (you can narrow the wildcard to domain* or domain.tld if you only want to remove specific backups):

rm ~/yourbackupdir/*
exit

4- Create the transfer.ini file for the iworx backup command

This file is used by the iworx backup command to store your login information for the remote server.

On your local server, do:

touch /root/transfer.ini
chown root:iworx /root/transfer.ini
chmod 640 /root/transfer.ini
vi /root/transfer.ini

Enter these lines in your transfer.ini file (plain straight quotes, not curly ones):

[siteworx.backup_transfer_data]
username="user"
password="your password"
hostname="hostname.domain.tld"
remotefile="~/yourbackupdir/"

Save and exit.

5- Create the final script that calls the SFTP and iworx backup commands

On your local server, do:

touch /root/iworxbackup
chmod +x /root/iworxbackup
vi /root/iworxbackup

Enter these lines in your iworxbackup script:

#!/bin/bash
# delete all previous backup files on the remote server
sftp -b /root/sftpcmd -o IdentityFile=/root/key/host user@hostname.domain.tld
# create the new backups and transfer them
/home/interworx/bin/backup.pex --domains=all --scp --transfer-ini=/root/transfer.ini

Save and exit.

The backup script is ready. It will delete all previous backup files, back up all SiteWorx accounts, and send the new backup files to the remote server.

You can test it with one or two domains before creating the cron job.

Edit the iworxbackup script and change the iworx backup command to:

/home/interworx/bin/backup.pex --domains=domain1.tld,domain2.tld --scp --transfer-ini=/root/transfer.ini

Also, remember to adjust the sftpcmd file if needed; to delete only those two domains, replace the catch-all rm ~/yourbackupdir/* with patterns like rm ~/yourbackupdir/domain1.tld* and rm ~/yourbackupdir/domain2.tld*.

Everything should work fine.

6- Create the cron job

Now, if everything went fine, you can create a cron job that calls this script every day, for example. Do:

crontab -e

and add this line

30 5 * * * /root/iworxbackup >> /dev/null 2>&1

This example runs the backup job every day at 5:30 AM.

<<<<<<<<< END >>>>>>>

Hope this may help some of you.

Of course, if you have SSH access to your remote box, it is easier to write a script that connects to the remote server over SSH and runs an rm command before running the iworx backup command.
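That simpler variant could look like this. It's only a sketch: the key path, user@hostname.domain.tld and yourbackupdir are the same placeholders as in the how-to, and without --run it just prints the commands so you can check them before letting cron loose on it:

```shell
#!/bin/bash
# iworxbackup-ssh -- variant for a remote box that DOES allow SSH.
# Placeholders: adjust KEY, REMOTE and the backup directory.
KEY=/root/key/host
REMOTE=user@hostname.domain.tld
DELETE='rm -f ~/yourbackupdir/*'
BACKUP='/home/interworx/bin/backup.pex --domains=all --scp --transfer-ini=/root/transfer.ini'

if [ "$1" = "--run" ]; then
    ssh -i "$KEY" "$REMOTE" "$DELETE"   # wipe the previous backups remotely
    $BACKUP                             # then create and push the new ones
else
    # dry run: show what would be executed
    echo "would run: ssh -i $KEY $REMOTE \"$DELETE\""
    echo "would run: $BACKUP"
fi
```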

Pascal

Great tip! Bravo! :wink:

Thanks Socheat.

I'm not sure how many InterWorx users it will help, but for me it was important to find a solution to back up all the SiteWorx accounts to a NAS account (or Discsync, or whatever).

It works great, but sometimes it doesn't back up ALL SiteWorx accounts. About 1 time in 5 or 6 it skips one or two SiteWorx accounts (it stops without any kind of error).

Do you store a PID file somewhere per domain?

If yes, could it be that this PID file isn't being deleted?

Pascal

Check that the individual SiteWorx accounts don't have a backup scheduled at roughly the same time. Only one backup per domain can be running at any given time. If a backup is running on abc.com and a second backup tries to run, the second backup will abort. This might explain why sometimes a backup isn't created. If you include --email=your@email.com, you will get an email after each backup completes with any status/error messages that occurred (one being "a backup is currently running, please try again later").

There is a PID file created in /home/uniqname/backups/ that is used to determine whether a backup is currently running. If the PID file doesn't exist, it is created when the backup runs. If it does exist, backup checks whether it's stale (i.e., it wasn't deleted even though no backup for this domain is currently running). If it is stale, the file is removed, a new one is created, and the backup continues. If it isn't stale, then obviously a backup on that domain is running, and the backup aborts. Once the backup is complete, it should delete the PID file. Even if it doesn't, the script should clean it up the next time backup is run. If this isn't happening, it is definitely a bug; submit a ticket and we'll check it out.
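A quick way to look for leftover PID files from the shell is sketched below. The /home/&lt;uniqname&gt;/backups/ layout is as described above; that the file contains a bare numeric PID is my assumption:

```shell
#!/bin/sh
# find_stale_pids -- list backup PID files whose owning process is gone.
# Layout assumed: <base>/<uniqname>/backups/<domain>.pid, with the file
# holding just a numeric PID. The base dir is an argument so it can be
# pointed anywhere.
find_stale_pids() {
    for f in "$1"/*/backups/*.pid; do
        [ -e "$f" ] || continue            # glob matched nothing
        pid=$(cat "$f")
        # kill -0 only checks whether that PID is alive
        if ! kill -0 "$pid" 2>/dev/null; then
            echo "stale: $f (pid $pid)"
        fi
    done
}

find_stale_pids /home
```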

Pascal - that is awesome! I was looking for exactly that, to use with Sago.

I've run into one problem… do you find the backups are REALLY slow? A 500 KB backup takes, say, 2 minutes. Forget my 500 MB or 1 GB accounts…

Is it possible I did something wrong? Or is this normal for it to take this long using this method?

Int,

That is normal, because it backs up everything and then tars/gzips it. The compression can take some time, and then you have to transfer the whole big file. I'm waiting for the new structure-only option from InterWorx to implement my own script-based backup.

Right now I just have one full backup of the SiteWorx account from a month ago, and then do daily data backups via rsync. With the structure-only backup I will be able to keep a real SiteWorx settings backup nightly or weekly, since it won't take much time compared to a full backup.

I also updated this post here:
http://interworx.com/forums/showthread.php?t=997

Thanks for the reply Justec

I didn't mean that the whole process takes a while (though it does). My concern is that the actual upload of the backup (.tar.gz) file goes at less than 1 KB/sec. This is likely something I should discuss with Sagonet, but I just wanted to know whether rsync somehow slows the transfer down to a crawl (perhaps because it sends over an encrypted connection?). I was under the impression that using storage on a server in the same datacenter would give LAN transfer speeds, so these backups should finish within seconds, even for huge files.

Is there a port that I should maybe make sure isn’t being blocked by my (APF) firewall?

Rsync is encrypted via SSH, but that shouldn't slow the connection down that much, and I think if it were the firewall it would just block the transfer entirely.

The first thing, if you haven't already done so, would be to contact Sago, because as you said you should be able to push at your server's full speed, since the backup servers are probably elsewhere in the same facility.

You are rsyncing the SiteWorx TGZs, correct, and not just rsyncing directories manually (not that it should make a difference)?
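To rule the network in or out independently of rsync or backup.pex, you can time a raw transfer yourself. A rough sketch (user@hostname.tld is a placeholder and 10 MB of zeros is arbitrary; run with no argument for a local baseline first):

```shell
#!/bin/sh
# link-speed.sh -- rough throughput check: push 10 MB through a transport
# command and time it. With no argument it uses a local pipe as baseline;
# pass an ssh pipeline to measure the encrypted link, e.g.:
#   ./link-speed.sh "ssh user@hostname.tld 'cat > /dev/null'"
transport="${1:-cat > /dev/null}"
start=$(date +%s)
dd if=/dev/zero bs=1024 count=10240 2>/dev/null | eval "$transport"
end=$(date +%s)
echo "10 MB in $((end - start))s"
```

If the local baseline is fast but the ssh run crawls, the problem is the link (or the NIC), not the backup tooling.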

Just an update, so that people don’t think your method is flawed =)

Contacted Sago and was told that my box was not properly "linked" to their network. They ran a mii-tool command and fixed it. MUCH better now.

Love the HOW-TO. Thanks again =)

Pleased you like it :slight_smile:

Indeed, a lot of Sago clients had problems with their network card being set to half duplex instead of full duplex.

For me, with the InterWorx backup, backing up 120 accounts took 5 hours and CPU usage grew to 60%. Now I have a dual Xeon and it is much better, but it still takes 5-6 hours.

I'm also waiting for the new InterWorx backup structure, and even more for the restore command to automatically link a SiteWorx account to the right reseller :-p

About this script: it would be great to add rotation. Also be careful: if there is a problem during a backup, the next backup may not run. Check for a file named xx.pid in each SiteWorx account's /backups directory.
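Since the NAS only speaks SFTP, a simple rotation can be done in the batch file itself by renaming instead of deleting, keeping one previous generation. A sketch of such an sftpcmd file (directory names are the placeholders from the how-to; the leading "-" tells sftp's batch mode to carry on if that command fails, e.g. on the very first run when the .old directory doesn't exist yet; whether "rename" works on a directory depends on the SFTP server):

```
-rm yourbackupdir.old/*
-rmdir yourbackupdir.old
rename yourbackupdir yourbackupdir.old
mkdir yourbackupdir
exit
```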

Pascal

I used the method above to perform remote (SCP) backups.

Today, just for fun, I decided to check the remote location to see if the backups were being done successfully. What I see on the backup site now is that just 2 SiteWorx accounts were successfully backed up. The others (about 7-8) were not backed up at all (file size is 0 KB).

I checked the cron email/log that I have sent to me every day. Apart from the 2 successful ones, all the others show the following error:

Transferring via scp/sftp to REMOTE.FTP-LOCATION.com (~/backups//SITEinQUESTION.tld_mar.29.2006-04.20.04.tar.gz)
/home/SITEinQU/backups/SITEinQUESTION.tld_mar.29.2006-04.20.04.tar.gz is 578576.71KB in size. Transfer may take a while.
Error - last line returned was: scp: /backup/int/backups//SITEinQUESTION.tld_mar.29.2006-04.20.04.tar.gz: No space left on device at /home/interworx/lib/scp-expect/Expect.pm line 733 scp transfer failed. Please check your transfer settings

The first 2 did not display this error, though. Both the remote location and my server have plenty of space available (WAY more than the 500 MB this transfer requires). Any ideas?

EDIT: BTW, there are no *.pid files anywhere in the /home/SITEinQU/backups folder.

I keep getting the same random failed transfers:

No space left on device at /home/interworx/lib/scp-expect/Expect.pm line 733 scp transfer failed. Please check your transfer settings

Any ideas guys?

Hi,

What about trying to transfer them manually, just to check?

I'd back them up locally into a tmp folder, then use scp to try to transfer them:

scp -i /your/key/path /your/backupsfiles/files.tar.gz user@host:~/path/

Just to test that your key file is OK and that scp works.
I'm not sure backup.pex uses this exact scp syntax, but it may be a first step to test it like this.
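You could also check free space and free inodes on both boxes; running out of inodes gives the same "No space left on device" message even when df -h looks fine:

```shell
# check free space AND free inodes on every mounted filesystem;
# inode exhaustion also reports "No space left on device" and is
# easy to miss when only looking at df -h
df -h
df -i
```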

Pascal

[quote=pascal;3139]Hello,

I share with you what I've set up for the backups of all SiteWorx accounts.

[…]

Pascal[/quote]

Does this still work if you change the switches to the proper (new) switches in iworx 3.0.1?

/home/interworx/bin/backup.pex --domains=all --xfer-ini=/root/transfer.ini --xfer-method=scp

I haven’t gotten automatic SCP backups to work since the upgrade.