Backup script

Not sure if this is the right area, but I have read through the threads about backup solutions. There are quite a few and they all do different things, so what's one more going to hurt? This little shell script does two simple things: it creates a full backup of all sites to a specified BACKUP_DIR, then deletes any backups more than N days old.

#!/bin/bash
####################
BACKUP_DIR="/path/to/backup/dir"
DAYSOLD="7"
####################
/home/interworx/bin/backup.pex --domains=all --file-path="$BACKUP_DIR"

if [ -d "$BACKUP_DIR" ]
then
  find "$BACKUP_DIR" -maxdepth 0 -mtime +"$DAYSOLD" -exec rm {} \;
else
  echo "$BACKUP_DIR does not exist or is not a directory."
fi
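For anyone who wants this to run unattended, a nightly cron entry along these lines should do it. The script path and log path here are just placeholders; save the script wherever suits you and make it executable first:

# Run the backup script every night at 02:30 (paths are hypothetical)
30 2 * * * /root/scripts/iworx-backup.sh >> /var/log/iworx-backup.log 2>&1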

Thanks for the contribution, Peterz. I actually think this will be useful to a lot of people :slight_smile:

Thanks! :slight_smile: What about a script that does the same, but in two parts:

One that transfers the backups to a remote location (handled by cron)

And a second part that deletes files older than 7 days on the remote host. Not being knowledgeable in writing such scripts, I'd find it most welcome!

Cheers!
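Something like this might cover both parts, assuming you have SSH key access to the remote host (plain FTP can't run find on the other end). REMOTE_HOST and REMOTE_DIR are placeholders for your own setup, so treat this as an untested sketch:

#!/bin/bash
####################
BACKUP_DIR="/path/to/backup/dir"
REMOTE_HOST="user@remote.example.com"  # placeholder, assumes SSH key auth is set up
REMOTE_DIR="/path/to/remote/dir"       # placeholder
DAYSOLD="7"
####################
# Part 1: copy the local backups to the remote host (run this script from cron)
rsync -a "$BACKUP_DIR"/ "$REMOTE_HOST":"$REMOTE_DIR"/
# Part 2: delete remote backups older than N days
ssh "$REMOTE_HOST" "find '$REMOTE_DIR' -maxdepth 1 -type f -mtime +$DAYSOLD -exec rm -f {} \;"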

Obviously, the first part is already covered by the SiteWorx FTP backup (which includes a scheduling option).

If the second part could be done with the FTP backup as well, then IWorx should support it with a nice end-user option (something like 'number of backups to keep').

I ran across a small bug in my example.

find "$BACKUP_DIR" -maxdepth 0 -mtime +"$DAYSOLD" -exec rm {} \;

With -maxdepth 0, find never descends into the directory at all; it only examines $BACKUP_DIR itself, so no files were ever matched.

Change the line to this for it to actually delete the files:

find "$BACKUP_DIR" -maxdepth 1 -mtime +"$DAYSOLD" -exec rm {} \;
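One small caveat worth noting: -maxdepth 1 still matches $BACKUP_DIR itself (at depth 0), so rm will print an error for the directory on every run. Adding -mindepth 1 -type f restricts the match to the backup files only; this variant is just a suggestion:

# Only delete regular files inside the backup dir, never the dir itself
find "$BACKUP_DIR" -mindepth 1 -maxdepth 1 -type f -mtime +"$DAYSOLD" -exec rm -f {} \;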