InterWorx 1.9.0 Release Candidate Available

We’re looking for those early adopters out there who want to get the first crack at trying out InterWorx 1.9.0. If this is you, please fill out this form:
https://secure.interworx.info/iworx-cp/support/rootdrop.php

Also, please let us know which Linux distro your server is running in the “Special Instructions” section of the form.

We will upgrade your InterWorx server to 1.9.0 for you and confirm via e-mail once the upgrade is complete.

Major new features include:
Clam Anti-Virus & SpamAssassin Integration
Full and Partial Account Backup & Restore / cPanel Import
Language Support
AWStats & Analog Support
Many user interface improvements

Paul

Alright, looking forward to the upgrade! :slight_smile:

My upgrade is complete and everything seems to be working great. I love the new stats and SpamAssassin integration. I haven’t looked at the backup/restore options yet, but I’m sure they’ll be very useful.

Just like to say I’ve been testing the Beta for a couple days and it is perfectly stable. Lots of nice improvements.

This morning around 8am I had a big spike in CPU use (up to 96%). I thought it had something to do with the updates, but checking the update section in NodeWorx, it doesn’t show anything new.

Everything is working fine, but I just wanted to post this (even though it could be totally unrelated).

Also, something I noticed: my CPU use dropped from an average of about 10% to 5% after the upgrade (before the spike).

(Attachment: CPU.png)

SpamAssassin

Not sure if this should be its own thread, but here goes:

  1. If SpamAssassin is turned on in NodeWorx, do I also need to go to each SiteWorx account and turn it on individually? I just checked one and it was off.

[EDIT]
Also, I have a few domains that I am just relaying through (two of them have SiteWorx accounts, one doesn’t). Mail is sent there, then forwarded on to my email server on port 26 (my ISP blocks port 25 :rolleyes: ). Will these be scanned too?
I am relaying using smtproutes:


domain.com:myemailserverIP 26

[END EDIT]

  2. The threshold value. I read that 5 is about the default, and the higher the number, the more spam gets through. I also saw something about rewriting the subject (in the InterWorx help), but if spam is dropped at the SMTP level, how would it ever make it to the user’s inbox? …or are there two different spam methods: one drops it at the SMTP level based on the score, and the other marks it as spam but lets it through?

I’m sure I could think of a million more questions, but I need to read up on SpamAssassin before I do that. These were two big things that jumped out at me.

Thanks for the help :smiley:

I asked Chris this same question yesterday. The NodeWorx setting is system-wide and overrides per-user settings. So if you turn it on in NodeWorx, it is on no matter what for all domains. However, if you leave it off in NodeWorx, then users can turn it off and on independently in SiteWorx.

The spike in CPU is almost certainly the stats programs running - there are now three running by default instead of just one.

The overall CPU usage drop probably has to do with the fact that we changed the storage-calculation cron script to run 4 times per day instead of every 5 minutes. We found that it could be a bit of a resource hog depending on how many accounts there are and the amount of disk storage used.
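Roughly, the change is the difference between these two crontab schedules (the script path here is just illustrative, not the exact name/location we ship):

# before: storage calculation every 5 minutes
*/5 * * * * /path/to/iworx/storage-calc.pex
# after: storage calculation 4 times per day (every 6 hours)
0 */6 * * * /path/to/iworx/storage-calc.pex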

Re: your question about smtproutes - I’d have to double-check your setup. I assume you have some entries in the /etc/tcprules.d/tcp.smtp file for this relaying? If so, you can edit that file and add the QMAILQUEUE variable to those lines, like it is on the bottom line of that file - that will cause that mail to be scanned as well.
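For example, a line in /etc/tcprules.d/tcp.smtp might end up looking something like this once QMAILQUEUE is added (the IP and the scanner path here are only examples - copy whatever the bottom line of your file already uses):

# relay entry with scanning enabled
1.2.3.4:allow,RELAYCLIENT="",QMAILQUEUE="/path/to/scanner"
# existing catch-all line at the bottom of the file
:allow,QMAILQUEUE="/path/to/scanner"

If your setup doesn’t rebuild the .cdb automatically after you edit the file, you’d regenerate it with something like tcprules (file locations vary by install):

tcprules tcp.smtp.cdb tcp.smtp.tmp < tcp.smtp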

Re: the SpamAssassin NodeWorx vs. SiteWorx setup -
The setting in NodeWorx causes e-mail to be scanned at the SMTP level, before local mail delivery begins - this allows you to drop messages if they’re above a certain SPAM score at the SMTP level if you choose to do that.
The bad thing about SpamAssassin scanning at the SMTP level is that if the e-mail that comes in has multiple recipients, there is no way to determine which recipient’s Spam Preferences should be used when scanning.
The scanning that is set up in SiteWorx performs the SpamAssassin scanning during the local delivery stage - when there is always one and only one recipient, and we can guarantee the Preferences will be set for that domain.
Therefore, a server where SpamAssassin is “on” in NodeWorx and also “on” at the SiteWorx level will result in an e-mail being scanned by SpamAssassin twice as it is delivered (once at the SMTP level, and again during local delivery), which may be a concern for folks running extremely busy mail servers - in which case you may decide not to run SpamAssassin at the SMTP level.

Is this spike going to happen like this every day, or was it because this was the first run?

By the way, AWStats is pretty cool :smiley:

The only thing I did was add the following lines to my /var/qmail/control/smtproutes file:

domain1.com:myemailserver 26
domain2.com:myemailserver 26
domain3.com:myemailserver 26

Moved here –> http://interworx.info/forums/showthread.php?p=2609#post2609

We have had the following problems so far with the latest RC:

Oops. Nothing there :slight_smile:

Regarding spam thresholds, I read that same thing and wouldn’t even bother considering it. I run SpamAssassin at home, and all decent mail, with the exception of 4-5 messages over the past couple of months, has been under 2.0. I’m currently considering anything 1.4 or over as spam, but this is after a few months of training.

Of course, my spam is being sent to a spam box so I can review it. If spam is getting dumped, a setting of 4+ is probably a good idea.

Hello,

1/ About spam
About SpamAssassin, I ran it at both the SMTP level and the local delivery level, and I never noticed an increased CPU load / load average.

About loading SpamAssassin user preferences from an SQL database, note that this will NOT look for test rules, only local scores, whitelist_from(s), required_score, and auto_report_threshold (see http://svn.apache.org/repos/asf/spamassassin/branches/3.0/sql/README).
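For example, per-user settings live in the userpref table described in that README, so a per-user score or whitelist entry is just a row like this (table layout per the SpamAssassin 3.0 SQL README; adjust names to your own database):

INSERT INTO userpref (username, preference, value) VALUES ('user@domain.com', 'required_score', '6');
INSERT INTO userpref (username, preference, value) VALUES ('user@domain.com', 'whitelist_from', 'friend@example.com');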

I would also like to have:
The ability to load users’ auto-whitelists from a SQL database. The most common use for a system like this would be for users to be able to have per-user auto-whitelists on systems where users may not have a home directory to store the whitelist DB files.

and

The ability to store users’ bayesian filter data in a SQL database. The most common use for a system like this would be for users to be able to have per user bayesian filter data on systems where users may not have a home directory to store the data.

so that sa-learn, etc. could be used on a per-user basis …

Maybe I’ll do a hack for these.
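If I do, it would basically be the standard SpamAssassin 3.0 SQL directives in the global config, something like this (the DSN, username, and password are placeholders):

# per-user Bayes data stored in SQL
bayes_store_module Mail::SpamAssassin::BayesStore::SQL
bayes_sql_dsn DBI:mysql:spamassassin:localhost
bayes_sql_username sa_user
bayes_sql_password sa_pass

# per-user auto-whitelist stored in SQL
auto_whitelist_factory Mail::SpamAssassin::SQLBasedAddrList
user_awl_dsn DBI:mysql:spamassassin:localhost
user_awl_sql_username sa_user
user_awl_sql_password sa_pass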

As an InterWorx client, you may also think about installing Razor, DCC, etc. to improve spam detection :slight_smile:

Does the API for creating SiteWorx accounts include the new spam/bounce message options?

2/ About multi-language
I hope some InterWorx clients will create other language files, Spanish for example.
Does the API for creating SiteWorx accounts include the new language option?

3/ About backup / restore
Great tools.
I know how to create a cron job to automatically do a full SiteWorx account backup, but I’d like to do this for all SiteWorx accounts (see the sketch further down).
It would be great to give SiteWorx users the ability to create cron jobs that automatically do full or partial backups of their accounts.

Just be careful not to back up all your SiteWorx accounts at once if you have a lot of them (more than 20). It increases the CPU load / load average a lot and may make your server unavailable (I have a P4 3.0 with 2 GB of RAM, and my CPU went up to 80% and the load average up to 9.5-12 when I did a backup of all 40 SiteWorx accounts).
Maybe using hard links would be more efficient than rsync and gzip?
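The kind of cron job I mean is just a loop with a pause between accounts so the load doesn’t pile up - something like this sketch (siteworx-backup is a hypothetical stand-in for whatever per-account backup command is actually available, not a real InterWorx CLI name):

#!/bin/sh
# Hypothetical sketch only: back up SiteWorx accounts one at a time.
for account in account1 account2 account3; do
    siteworx-backup --full "$account"   # hypothetical per-account backup command
    sleep 600                           # wait 10 minutes between accounts to keep the load average down
done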

4/ About stats
Great, even if it increases the CPU load a little (for me it is not really noticeable).


Just to be sure, I haven’t had any system/software updates since Feb 19th, is that OK?


Well, great job. I’m very happy, as now my clients have SiteWorx in French :slight_smile:

InterWorx-CP is on the right track :slight_smile:

Pascal

Do the stats shut down the web server?

My web server shut down around 8:30 this morning. It showed as off in NodeWorx and I just had to click start, but I’m not sure why it would be off.

My email being relayed via the qmail smtproutes is being scanned, so that’s pretty cool.

The stats routine (and other normal ops) will cycle the web server, but it should never be “down” because of them. There is a failsafe restart mechanism when stats runs. I’d check the Apache error log to see if there’s anything weird there. I’d also check when stats run on your server (the iworx user’s cron job will tell you when this occurs) to see if the times correlate.
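In practice those checks are just the following (the error log path varies by distro, so adjust as needed):

# see when the iworx user's cron jobs (including stats) run
crontab -u iworx -l

# look for anything odd in the Apache error log around the time of the outage
tail -n 100 /var/log/httpd/error_log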

Chris

I’ll let Paul chime in on your SA points Pascal but I wanted to address this question:

Just to be sure, I haven’t had any system/software updates since Feb 19th, is that OK?

Yes, once 1.9 is released in full you’ll receive the later 1.9 updates.

Chris

Thanks Chris :slight_smile:

Well, it is not so important; maybe it is better that Bayes auto-learns from all domains rather than one by one (unity is strength).

About SpamAssassin, I have been using it for a long time, and here are my parameters:

required_score 6

# Allow SpamAssassin to rewrite the subject line of any messages it classifies as spam.
# This is the value that will be prepended to the subject line of messages classified as spam.
rewrite_header Subject SPAM

# Put spam analysis reports into the headers of the message (rather than the body).
report_safe 0

# SpamAssassin by default will try to run the following spam-detection utilities
# for every mail message. (You can read about them at http://www.spamassassin.org/dist/INSTALL)
# We don't want to waste any CPU cycles trying to run utilities that we don't have installed,
# so disable any that you haven't installed.
use_pyzor 0

# Enable Razor2 checking (if you installed it).
use_razor2 1

# Enable DCC checking (if you installed it).
use_dcc 1
dcc_home /var/dcc

# Enable SpamAssassin's RBL checking features:
# although there is already some RBL filtering in qmail's rblsmtpd program,
# it is still recommended to turn on RBL checking in SpamAssassin, as it will run
# checks against a variety of different RBL sources, and the results will help
# tag spam more accurately.
skip_rbl_checks 0

# If we haven't received a response from the RBL server in X seconds, then skip that test.
rbl_timeout 3

# Now we want to alter some of the default scores for RBL hits.
# By default the bl.spamcop.net RBL score is 0 (disabled).
# We will override this and give any hits a score of 3.
# Info about this RBL is available from http://spamcop.net/fom-serve/cache/290.html
score RCVD_IN_BL_SPAMCOP_NET 3

# You can nominate any netblocks that you control and that contain mail servers
# you trust, i.e. you control the mail servers in these netblocks so there is no
# need to be running RBL checks against these particular servers.
# In the example below we are allowing the class C 123.123.123.0 to skip
# SpamAssassin RBL checking.
trusted_networks 123.123.123.

Pascal

Well, I’m about 99% sure (after Paul’s response earlier in this thread) that this is the stats running at around 8am, because my CPU load goes close to 100% for a few minutes.

This is the third day it has run, and the web server service was only down on the second day, so maybe it was just a fluke with the stats, or maybe it had nothing to do with them, but I’ll let you know if it happens again.

Thanks