BFD Custom Rule to check Apache logs for WordPress failed logins

I’ve set up plugins on WordPress to ban IPs, move wp-login.php / wp-admin to another location (the default location gives a 404), etc. This really helps protect the site itself, but all of the continuous requests, even if they only get a 404 page, add unneeded load on the server.

This is the most common line I could search for to find the IPs to block:
216.73.117.42 - - [16/Oct/2014:09:36:24 -0400] "POST /wp-login.php HTTP/1.0" 404 25029 "-" "-"
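As a rough way to gauge the volume before building a rule, something like this counts the wp-login.php POSTs per IP (the log path is just an example of where InterWorx keeps a site's transfer log; adjust it for your site):

# count wp-login.php POSTs per IP, busiest first (example log path)
grep 'POST /wp-login.php' /home/site/var/site.org/logs/transfer.log \
 | awk '{print $1}' | sort | uniq -c | sort -rn | head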

I know that because InterWorx stores each site's logs separately, I would have to set up a custom rule for each site I wanted to check, but the rule should just be repeated for each site with only the path changed… so it shouldn't be too big of a deal.
Maybe in the future I can figure out something better, but this would be great to get set up on a few sites that are always causing issues.

I'm just having a hard time finding a tutorial that really explains how the rules match a pattern and pull an IP out. If someone could point me in the right direction there, that would be most helpful!

Thanks,

Justin

Hi justec

Have you seen the posts by cleverwise? If not, you may want to, as I believe he has written some security guidance for WP.

I believe the rules just grep for matching patterns, so you may be lucky and be able to simply swap in your 404 not-found line.

I suppose you would be best advised to change the service startup check to httpd etc., or you could leave it as is, as this just checks that the service exists before turning on the rule.

I could be wrong though, sorry, and I'm thinking you could email the developer of BFD; you'll find the email address in the licence text.

Lastly, I think it might be a bad idea, as it would block anything matching, even if it were genuine, i.e. a page not found due to a wrong link or a page having been deleted etc…

I hope that helps

Many thanks

John

"Have you seen the posts by cleverwise. If not, you may want to as he has written some security I believe for WP. "
No. Are you talking about in these forums? I can just search by author?

"I believe the rule just grep for matching variables, and you may be lucky and replace with not found 404. "
Yeah, it does a grep to find matches I think and then some prints to dump the IP out. I can’t find much in the way of tutorials online, so I think I’m just going to play with the rules that are already there and try to figure out what’s going on.

"I suppose you would be best adviced to change service start up to httpd etc, or you could leave as is, as this just checks service before turning on rule. "
Not sure what you mean here?

"Lastly, I think it might be a bad idea, as it would block anything matching, even if it were genuine, i.e. a page not found due to a wrong link or a page having been deleted etc…"
Well, I could set the BFD threshold (the count before it blocks) to something like 50 or 100. When these bots hit the page and keep getting 404'd, there are a ton of them, usually more than one per second.
I could also maybe find a way to only count IPs over a certain period of time, so if it's 100 hits in an hour, block, but if it's 100 over a week, don't block.

Hi justec

Sorry, here's the link to cleverwise's WP post:

http://forums.interworx.com/showpost.php?p=25365

The service startup check looks for the service startup file, i.e. does it exist; if it does not exist, the rule will not run. httpd exists, so it would start the rule, but since it only looks for that startup file, you could leave it as it's set in the rule file you're going to copy and change, just so the rule starts.
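Roughly, the pattern in the stock rules looks like this (the httpd path is just an example and may differ on your system), so the rule body only runs when that service file actually exists:

# only run the rule if the service's startup/binary file exists
REQ="/usr/sbin/httpd"

if [ -f "$REQ" ]; then
 # ...the rest of the rule goes here...
 :
fi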

I'd appreciate it if you could post your rule if you get it going, as it does help other users, but I understand if you prefer not to.

There's also a post re honeypot HTTP checks which you could use; it's my post but I'd have to look it up later. You could also have a look at Licencecart's post re blocking bots from .htaccess.

Hope that helps

Many thanks

John

# failed logins from a single address before ban
# uncomment to override conf.bfd trig value
TRIG="100"

# file must exist for rule to be active
REQ="/home/site/var/site.org/logs/transfer.log"

if [ -f "$REQ" ]; then
 LP="/home/site/var/site.org/logs/transfer.log"
 TLOG_TF="site-httpd"

 ## HTTPD
 ARG_VAL=`$TLOG_PATH $LP $TLOG_TF | sed -e 's/::ffff://' | grep -E 'POST /wp-login.php HTTP/1.0" 404' | awk '{print$1}'`

fi


This is what I have put in for a single site as a test. For REQ I just made it the same as the logfile, so as long as the log exists, it will scan it.

So basically the script looks for the wp-login.php 404 hits, and when it finds them it prints (returns) the first column, which is the IP address.

Hi justec

Sorry, where it states the file must exist for the rule to be active, that should be the httpd run file, not the log file.

You'll also need to restart BFD so the new rule is included: bfd -s

I hope that makes sense

Many thanks

John

John,

This is only a check to see if a file exists, so it can be any file.
I used the logfile because that made the most sense, since you can't scan a file that doesn't exist, but otherwise I don't even need this IF statement since it's a custom rule for my server.

I think the point of that REQ is that the default rules that come with the BFD install package shouldn't run if those services don't exist on a particular server.

I did a test by putting a fake IP address in the log with the wp-login stuff 10 times (temporarily changing TRIG to 10), then ran bfd -s and it caught it! So it looks like it's up and running.
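For the record, the test was roughly this (the IP comes from the reserved documentation range and the log path is just my example site, so substitute your own):

# append ten fake failed-login lines, then rescan so BFD picks them up
for i in $(seq 1 10); do
 echo '203.0.113.50 - - [16/Oct/2014:09:36:24 -0400] "POST /wp-login.php HTTP/1.0" 404 25029 "-" "-"' \
  >> /home/site/var/site.org/logs/transfer.log
done
bfd -s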

Hi justec

Ah yes, sorry, I am slow sometimes.

Glad it’s working as expected, and thanks for sharing, which hopefully may help others.

Many thanks

John

Haha, yeah, it seemed more structured when I first looked at it, but then I saw some rules that came with the install package that did use the logfile, and then it clicked in my head :)

It's just funny how confused I was at the beginning of the day and how simple the solution was at the end. I guess in this case the logfile itself and the searched-for string were pretty simple, which made my code easier than I thought.

More info for anyone reading this (and for myself when I google this 2 years from now and find my own post):

$TLOG_PATH
Path to the script that controls the scanning of the log files.

$LP
The actual log you are scanning

$TLOG_TF
This is like the name for the rule. I believe it's used to track how much of the log file has already been searched; it stores some data in a BFD tmp folder under that $TLOG_TF name, so I'm sure the name needs to be unique across all the rules.

sed -e 's/::ffff://'
I don't know a whole lot about the sed command, but I grabbed this from other rules. As far as I know it strips the ::ffff: prefix that shows up when an IPv4 client is logged as an IPv4-mapped IPv6 address. I don't think I need it for this log, but I put it there just in case, since I do have a few sites that are running on IPv6 already.

grep -E 'POST /wp-login.php HTTP/1.0" 404'
This is the basic grep that searches for all the lines that have the wp-login.php POST and the 404 response I wanted to find.

awk '{print$1}'
Basic Linux command that prints out the first "column" of each matching line, which is the IP address in this log file (the whole chain is demonstrated below).
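Putting those last three pieces together, here's a quick manual check on a made-up log line (an IPv4 address logged in the IPv4-mapped IPv6 form, so the sed has something to strip):

echo '::ffff:216.73.117.42 - - [16/Oct/2014:09:36:24 -0400] "POST /wp-login.php HTTP/1.0" 404 25029 "-" "-"' \
 | sed -e 's/::ffff://' | grep -E 'POST /wp-login.php HTTP/1.0" 404' | awk '{print $1}'
# prints: 216.73.117.42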

It seems that this is no longer working. I had a new site that was getting its wp-login attacked, created a rule, and nothing happened. The log file looks the same, so I'm not sure what's going on. Any thoughts?

Hi Justin

I hope you're keeping well

Do you mind me asking: if you run the grep manually, does it find it alright?

Many thanks

John

John, I just gave it a try and the grep works correctly. So I'm not sure which part is breaking the search from within BFD.

I got it figured out. At some point InterWorx changed how the logs work. They now create a log with the current day's date in the filename and then a symbolic link to it:
transfer.log -> transfer-2018-07-18.log

For some reason the BFD script doesn't like that, so I found a workaround that looks for the actual file by inserting the date into the name.
I created a date variable which spits out YYYY-MM-DD and is then inserted into the filename.


# failed logins from a single address before ban
# uncomment to override conf.bfd trig value
TRIG="30"

THEDATE=$( date +"%F" )

# file must exist for rule to be active
REQ="/home/site/var/site.com/logs/transfer-$THEDATE.log"

if [ -f "$REQ" ]; then
 LP="/home/site/var/site.com/logs/transfer-$THEDATE.log"
 TLOG_TF="site-httpd"

 ## HTTPD
 ARG_VAL=`$TLOG_PATH $LP $TLOG_TF  | sed -e 's/::ffff://' | grep -E 'POST /wp-login.php' | awk '{print$1}'`
fi

Hi Justin

Kudos to you. Well spotted, and a big thank you for posting the resolution.

Many thanks

John

Thanks for this

Hi Justin

Did you ever find a way to apply this rule to all sites in the home directory?
I've got a few issues with some WordPress sites.
Thank you

Hi Bear

I hope you're well and keeping safe

I am not sure if Justec did, but we use the Wordfence plugin for WP, which bans offending IPs.

Many thanks and stay safe

John

I will give it a try.
Thank you, and stay safe.

# wordpress admin login ban
# failed logins from a single address before ban
# uncomment to override conf.bfd trig value
TRIG="3"

THEDATE=$( date +"%F" )

# file must exist for rule to be active
REQ="/home/sites/var/my domain/logs/transfer-ssl-$THEDATE.log"

if [ -f "$REQ" ]; then
LP="/home/sites/var/my domain/logs/transfer-ssl-$THEDATE.log"
TLOG_TF="site-httpd"

## HTTPD
ARG_VAL=`$TLOG_PATH $LP $TLOG_TF | sed -e 's/::ffff://' | grep -E 'POST /wp-login.php HTTP/1.1" 200' | awk '{print$1}'`
fi

I used Justin's rule on some individual sites and obviously updated the log file path (my site uses SSL). I also added the 200 to the 'POST /wp-login.php HTTP/1.1" 200' pattern, since incorrect login attempts come back with a 200 status.
It works really well.

Nice, glad it’s working for you.

Thoughts on a master file to do all sites…
Since this is something read in by BFD specifically, I'm not sure if it could do loops and such to grab all the home directories on the fly and build the paths, etc.
Maybe we could build a bash script that runs weekly and creates all the rule files by scanning the home directory (see the sketch below).
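Something like this might do it; it's only a sketch and assumes the InterWorx log layout (/home/*/var/<domain>/logs) plus the default BFD rules directory, so both paths would need checking first:

#!/bin/bash
# Sketch: generate one BFD rule per site by scanning the InterWorx home layout.
# Assumes logs live under /home/*/var/<domain>/logs and that custom rules can
# be dropped into /usr/local/bfd/rules -- verify both before relying on this.
RULES_DIR="/usr/local/bfd/rules"

for logdir in /home/*/var/*/logs; do
 [ -d "$logdir" ] || continue
 domain=$(basename "$(dirname "$logdir")")

 cat > "$RULES_DIR/wp-login-$domain" <<EOF
# wp-login.php POST scan for $domain (auto-generated)
TRIG="30"

THEDATE=\$( date +"%F" )

# file must exist for rule to be active
REQ="$logdir/transfer-\$THEDATE.log"

if [ -f "\$REQ" ]; then
 LP="$logdir/transfer-\$THEDATE.log"
 TLOG_TF="$domain-httpd"

 ## HTTPD
 ARG_VAL=\`\$TLOG_PATH \$LP \$TLOG_TF | sed -e 's/::ffff://' | grep -E 'POST /wp-login.php' | awk '{print\$1}'\`
fi
EOF
done

# reload BFD so the new rules are picked up
bfd -s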

But so far I've just been creating the rule as part of setting up a new site for a client.
Maybe there is a way to hook into the IW SiteWorx account creation to make this part of that process, so it happens automatically when you add an account?