Backups fail for accounts larger than 2 GB

I have InterWorx running on a CentOS VPS, and I have been attempting to run nightly backups from SiteWorx to a remote FTP backup space. The other accounts on the same VPS back up fine every time, yet there is one 2 GB account which always gets “stuck”: the backup keeps running but never completes.

I looked at the running processes about six hours after manually starting the backup (using the backup-now feature in SiteWorx). I could see the compression commands for the account still running and eating up lots of CPU, so I left it alone, but the backup never completed.

Is there something I'm missing about fairly large backups?

Thanks

Andrew

Hey Andrew,

You might check out this thread:

http://www.interworx.com/forums/showthread.php?t=1168

It may be a similar issue with PHP default limits.

JB

I have increased the time limit and the backup still fails. I can see the .data.tar.gz containing the 2 GB worth of files, and I believe the backup process then adds the interworx.ini file to produce a final tar.gz; however, this step never finishes. I can see the process is running but it never completes. Am I missing something?
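For reference, my understanding of that final step (the file names below are stand-ins I made up to illustrate it, not InterWorx's actual paths or commands) is roughly:

```shell
# Hypothetical sketch of the final packaging step: the per-account data
# archive plus the account's metadata file get rolled into one tar.gz.
# All file names here are placeholders, not the real InterWorx files.
workdir=$(mktemp -d)
cd "$workdir"
echo "placeholder data" > .data.tar.gz   # stands in for the 2 GB archive
echo "placeholder ini"  > interworx.ini  # stands in for the metadata file
tar -czf final-backup.tar.gz .data.tar.gz interworx.ini
tar -tzf final-backup.tar.gz             # list the final archive's contents
```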

Is it possible that the /tmp directory is maxing out in terms of quota?

If your /tmp directory is its own partition and is not large enough, this could be your problem. It needs to be at least twice the size of your largest SiteWorx account.
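You can check this quickly with df; if /tmp reports the same filesystem as /, it is not a separate partition:

```shell
# Show the filesystem, size, and free space backing /tmp and /.
# If both lines report the same filesystem, /tmp is not its own partition.
df -hP /tmp /
```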

If this is the problem, there is a workaround.

Create a new directory on your largest partition (most likely /home or /chroot):

mkdir /chroot/home/tmp

chmod 776 /chroot/home/tmp

Then open /chroot/home/interworx/iworx.ini.

Under [tmp] and [iworx home dir] (I think) there are two entries that look like this:

tmp="/tmp"

Edit them to look like this:

tmp="/home/tmp"

Save the file and restart the iworx service:

service iworx restart

This should solve your problem.

As far as I can tell, /tmp is not on its own partition, and the partition it is on has plenty of space. I ran the backup again and found that the process that gzips the data.tar.gz and the .ini file together to make the final backup went zombie and never finished (as far as I can tell). Does the backup process keep logs?
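To check whether the compression step has actually gone zombie (process state Z) rather than just running slowly, you can look at its state and elapsed time, e.g.:

```shell
# Show PID, state, elapsed run time, and command for tar/gzip processes.
# STAT "Z" means zombie; "D" means stuck in uninterruptible I/O.
# (The [b]racket trick keeps grep from matching itself.)
ps -eo pid,stat,etime,cmd | grep -E '[t]ar|[g]zip' || true

# List every zombie process on the box:
ps -eo pid,stat,cmd | awk '$2 ~ /^Z/'
```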

If you’d like to open a ticket so we can take a peek, please feel free to do so. :)

Thanks,
Socheat

[quote=awalsh;9197]I have InterWorx running on a CentOS VPS, and I have been attempting to run nightly backups from SiteWorx to a remote FTP backup space. The other accounts on the same VPS back up fine every time, yet there is one 2 GB account which always gets “stuck”: the backup keeps running but never completes.

I looked at the running processes about six hours after manually starting the backup (using the backup-now feature in SiteWorx). I could see the compression commands for the account still running and eating up lots of CPU, so I left it alone, but the backup never completed.

Is there something I'm missing about fairly large backups?

Thanks

Andrew[/quote]

Those backups use tar, and a tar build without large-file support will not work with files larger than 2 GB. We have a site that’s 30 GB, and we had the same problem.
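Whether that limit bites depends on the tar build: binaries compiled without large-file (LFS) support top out at 2 GB, while modern GNU tar handles far larger files. A quick way to probe your own build (the sparse file occupies no real disk space):

```shell
# Check the installed tar version first.
tar --version | head -n 1

# Probe large-file support: create a sparse 3 GB file (allocates no real
# disk blocks) and count the bytes tar emits for it. A build limited to
# 2 GB will error out or stop well short of ~3 GB.
truncate -s 3G /tmp/lfs-probe
tar -cf - -C /tmp lfs-probe | wc -c
rm -f /tmp/lfs-probe
```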