InterWorx 1.9.2 Released

We’re happy to announce the release of InterWorx-CP version 1.9.2.

This update will be applied automatically within 24 hours if your server has auto-updates enabled (the default). If you prefer to perform the update manually, we recommend logging into your server as root and running the command:

yum update

If you have any problems with this update, please open a Support Ticket.

While this is a minor version update, there are quite a few changes worth mentioning.

SpamAssassin Integration Improvements

Both the Horde and Squirrelmail webmail systems now have configuration options for select SpamAssassin preferences that each e-mail user can set.

SpamAssassin’s auto-whitelist and Bayes databases are now automatically set up for each e-mail address.

E-mail users can train their Bayes database by placing spam and ham (non-spam) messages in two special IMAP folders, “Learn Spam” and “Learn Ham”.
Once per day these messages will be used to train the Bayes database, and then deleted.
Each time messages are used to train the Bayes database, the e-mail user will receive an e-mail detailing the results of the training session.

If an e-mail account has an IMAP folder named “Spam”, SpamAssassin tagged spam will be delivered to that IMAP folder rather than the Inbox.

There are two new e-mail settings in NodeWorx that the server manager can use to control Bayes learning: an option to turn the daily Bayes training cron on or off (on by default), and a limit on the number of e-mail messages trained per folder each day (default 250), since Bayes training can be resource-intensive when there are many messages and many mailboxes to train.

Backup System Improvements

SiteWorx users can now schedule a backup to run at one of three intervals: daily, weekly, or monthly.
If the server manager doesn’t want SiteWorx users to be able to schedule backups of their site, this can be disabled by editing the /home/interworx/iworx.ini file and changing the relevant setting.

SiteWorx users can now specify an alternate location for the backup instead of the default.
The alternate location can be a different directory within their account storage space (limited to the SiteWorx user’s home directory), or a remote location, with the backup transferred via either scp or ftp. The user is prompted for a username, password, hostname, and a path on the remote host to transfer the file to.

Server managers with root access to the server can use the command line backup script to accomplish a number of tasks. Here is a listing of the new parameters to this script.

--file-path=/new/path/to/backup/ allows you to specify an alternate path to store backup files

--domains has replaced --domain, and you can use it in the following ways (the domain names shown are placeholders):
--domains=example.com (back up one account)
--domains=example.com,example.net,example.org (back up multiple accounts)
--domains=all (back up all SiteWorx accounts)

--scp, scp the backup file to the location specified in either the SiteWorx transfer.ini file or the global transfer.ini file (specified using --transfer-ini=)

--ftp, ftp the backup file to the location specified in either the SiteWorx transfer.ini file or the global transfer.ini file (specified using --transfer-ini=)

--transfer-ini=, used when backing up multiple/all domains, sets a global transfer.ini file to be used. When backing up multiple domains, the individual SiteWorx transfer.ini files are ignored, and the backups will only be transferred if this option is set.

Format of the transfer.ini file:
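(The sample file contents were lost from the original post. Based on the fields the backup prompts for — a username, password, hostname, and remote path — a plausible sketch might look like the following; the key names here are assumptions, not the documented format:)

```ini
; Hypothetical transfer.ini sketch -- key names are assumptions based on
; the fields described above (transfer type, host, user, password, path).
type     = scp                  ; or "ftp"
hostname = backup.example.com
username = backupuser
password = secret
path     = /backups/siteworx/
```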


Other miscellaneous features and bug fixes

FTP Bandwidth now counts towards each SiteWorx account’s total bandwidth.
Fixed bug with IP address changing (affected multiple domains on the same shared IP instead of just one domain)
Fixed the public_html symlink so that it works in FTP clients.
Fixed bug in Fileman when un-tarring files with very long path or file names.

So right now the only way to schedule backups is through SiteWorx.
Or could we set up a cron job with the command line version?

Is there a future update (whenever it’s ready) planned to add scheduling settings to NodeWorx?

You crammed a lot of updates into this .2 update!

Keep up the good work :smiley:

You can set up a cron job as root with the command line version to schedule backups as often as you like.
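For instance, a root crontab entry along these lines would run a nightly full backup. The backup script’s path below is an assumption (check your installation for the actual location), and the transfer.ini path is just an example:

```
# Root crontab sketch: back up all SiteWorx accounts at 3:30 AM daily,
# transferring the results via scp using a global transfer.ini.
# The script path and file locations are assumptions -- adjust to your install.
30 3 * * * /home/interworx/bin/backup.pex --domains=all --transfer-ini=/root/transfer.ini --scp
```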

Is there a future update (whenever it’s ready) planned to add scheduling settings to NodeWorx?
Eventually, yes. In the meantime the command line version is pretty flexible; the only downside is that you have to configure it manually (not via the web interface).


Cool, they are good improvements :slight_smile:

Any news on the bandwidth follow-up?


For some reason auto-updates do not seem to be working on my server. I have installed updates in the past by manually clicking the updates button, but I don’t see any updates listed as ever having been installed.

I checked this morning and nothing was listed. I clicked the updates button and 15 of them showed up. I clicked the install button and it seemed to begin installing them, but then it just ended; I closed the window, refreshed the screen, and all of the updates were still sitting in pending mode.

Is it acting as it should?

Bluesin, my guess is there may be an RPM dependency issue preventing the update from going through. I’d try logging into your server and running

yum update

That’ll either get things updated, or it’ll show you where the problem is. If you’d like us to check it out, open a support ticket.


Quick question…

With the new SpamAssassin setup, what happens if someone uses both POP3 and webmail (which is basically IMAP)?

Say they are using Outlook Express at their office (which is like 80% of their use), but log into webmail every once in a while. I have told clients (that are interested) how to set up filters in Outlook / Outlook Express to move incoming mail marked as spam to a local spam folder when using POP3.

Would they no longer be able to see their spam messages in Outlook Express because they are taken out of the inbox and put into the special IMAP spam folder? :confused:



As long as they don’t create an IMAP folder named “Spam” in webmail (or any other IMAP client), messages tagged as spam will still be delivered to their normal Inbox, and will therefore still be subject to their local filters.

If they DO create a Spam IMAP folder, the spam tagged messages will be delivered to the Spam IMAP folder instead of their Inbox, in which case they wouldn’t see spam messages in Outlook Express.

The Spam IMAP folder must be created by the user, it isn’t there by default.


Thanks for the quick reply.

I logged into SquirrelMail earlier today and I didn’t see a Spam folder under the Inbox, but it was listed under “Folders” when I clicked that in the top menu.

Now I just logged in again and I see the 3 folders: Spam, Learn Spam, and Learn Ham, and I can tell you I did not create any of these myself. I didn’t even click subscribe under the folders; I just clicked on the Folders link and then left that menu.

It makes sense that the user would have to create these 3 folders, but I know I didn’t, and I’m not sure how they got there.

Thanks for the help,


The “Learn Spam” and “Learn Ham” folders are created by InterWorx.

The “Spam” folder must have been created by Squirrelmail, I know InterWorx doesn’t create this folder.


Indeed, Squirrelmail does create the Spam folder by default when you log in. You can disable this by editing the relevant Squirrelmail configuration file and changing the line:

$special_spam_folder = 1;

to:

$special_spam_folder = 0;



About your VERY GOOD improvements to the backup options:

Is it possible to use a public/private key rather than a password when doing a backup using scp?


gd-devel was it. I downloaded it with up2date and ran yum again…

No, not currently, but it should make it into the next major release, at the very latest.

Thanks socheat

Another question:
Is it possible to change the default backup file name?

In fact, I’d like to keep only the most recent files for the full backups.
I store my full backups on a NAS and don’t have much space, so I’d only like to keep the last backup.
If the name is always the same, does it overwrite the existing ones?
I think you’ll tell me it isn’t possible, since for a full backup of all domains I don’t see how I could give a unique name to each of them on the command line :-p


No, it’s not possible. :slight_smile: At least not on a multiple domain basis, unless you wanted to schedule each backup individually.

Perhaps you can write a cron script that deletes the previous day’s backup, and have it run sometime after the backup and transfer are complete.
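A minimal sketch of such a cleanup script, assuming all backups land in a single directory and only the newest file should survive (the directory path in the example is hypothetical):

```shell
#!/bin/sh
# prune_backups: delete everything in a backup directory except the
# single newest file. Intended to run from cron after the nightly
# backup (and any scp/ftp transfer) has finished.
prune_backups() {
    dir=$1
    # List files newest-first, skip the first line (the newest), remove the rest.
    ls -t "$dir" | tail -n +2 | while read -r f; do
        rm -f "$dir/$f"
    done
}

# Example (hypothetical NAS mount point):
# prune_backups /mnt/nas/backups
```

Note this keeps exactly one file; if you want to keep the last N days instead, change `tail -n +2` to `tail -n +$((N + 1))`.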