Tetra-WebBBS Support Forum
Though there is little discussion on this forum, I know from the logfiles that there are a couple of quiet regulars reading here. So I feel I should post some information for them.
The short story:
If you suddenly have trouble accessing pages on this site and are taken to a "weird" looking page, please don't worry. Don't retry instantly; instead, come back after 12+ hours. Then you should be able to read again. If the same thing happens, just retry later.
The site won't go down, but I'm doing some testing here which I can't do on a site that isn't live. I'm clearing the blocks several times a day, but overnight European time it may take a while. So don't worry; just ignore it and come back later.
The full story:
Many of you know that I've always played with filters and such. A month back I found one of my servers overloaded. As always, the reason was automated calls to the server. I had some leisure time and thought about the problem.
Actually, I don't have much of a problem with unwanted posts anywhere; almost all are caught by the existing filters. However, I dislike paying for resources wasted by all sorts of blackhats. The drain caused by harvesters, unwanted blackhat bots and the like is a pain: it generates a lot of traffic and thus requires a lot of CPU, RAM and bandwidth. I want to make things harder for those folks and save my own bucks.
Think: "why rent a better server or book a more expensive plan just to serve those idiots?"
So the goal is not only to avoid unwanted posts but rather not to serve blackhat requests at all.
Currently I'm testing some sort of dynamic filtering on IPs.
I don't want to explain the details (and never will). The reason is simple: though those blackhat folks are idiots, they are not stupid, and, from their perspective, they are just doing their job. I simply don't want them to do that job easily. So I'll keep the details secret and won't publish them or release any code.
However, it's worth letting the regulars know what game I'm playing here.
I've put together some code to set up a network service on one of my servers. This forum site is the first to act as a client to that new network service: it queries the service dynamically for each call made to the forum instances on this site. The service performs some calculations and reports back whether the calling IP is blacklisted, a known search engine, or for any other reason likely to be an automated, non-human request. The forums here can then act in various ways and reconfigure on the fly: a total block, a fallback to the search engine view, or whatever else.
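Since the details are deliberately private, here is only a hypothetical sketch of the client-side flow just described. The function names, verdict strings, and actions are all invented for illustration, and the real lookup happens over the network against the private service, not against local sets as stubbed here.

```python
# Hypothetical sketch of the per-request check. The real classification
# runs on a private remote service; this stub uses local sets instead.

def classify_ip(ip, blacklist, search_engines):
    """Stand-in for the remote service's verdict on one calling IP."""
    if ip in blacklist:
        return "blacklisted"
    if ip in search_engines:
        return "search_engine"
    return "presumed_human"

def handle_request(ip, blacklist, search_engines):
    """Reconfigure the forum's response on the fly, per verdict."""
    verdict = classify_ip(ip, blacklist, search_engines)
    if verdict == "blacklisted":
        return "block"          # total block
    if verdict == "search_engine":
        return "static_view"    # fall back to the search engine view
    return "full_forum"        # serve the normal forum page
```

The point of the sketch is only the dispatch: one verdict per request, and the forum picks its behavior from that verdict.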
The network service is fed by data from various sources and later will be able to serve all related forum sites.
The goal is dynamic banning of non-human clients calling "any" forum on "any" site registered as a network user of the service.
This thing seems to work fine. There are still some quirks leading to the blacklisting of innocent users, but none of the "good bots" like search engines is blocked. Accuracy is already above 95%, with no false negatives.
So, if you happen to be a false positive (i.e. mistaken for a non-human bot), just ignore it. The quirks will be sorted out as time permits.
As you can see, it's impossible to test such a thing in any other way. It must be tested on a live site.
As soon as the thing is running smoothly here on this site, I'll connect all my other sites and servers to the new service.
As a result, I've already cut down non-human traffic on tetrabb.com significantly. "Good bots" are served, and the vast majority of bad bots are blocked.
I'm excited to see this dynamic thing serving multiple servers. Imagine a blackhat popping up here on this site and then being blocked on my other sites on their very first call there. We can block such IPs across our entire (private) network of sites .... and keep the code secret, because it lives only on the network service and won't be published anywhere.
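That network-wide blocking idea can be sketched as a toy in-process stand-in (the real thing is a separate network service with its own private logic; the class and method names here are made up):

```python
class NetworkBlocklist:
    """Toy stand-in for a central blocklist shared by all member sites.
    One site reporting a bad IP makes it blocked network-wide."""

    def __init__(self):
        self._blocked = set()

    def report(self, ip):
        # A member site flags an IP it caught misbehaving.
        self._blocked.add(ip)

    def is_blocked(self, ip):
        # Every member site checks here before serving a request.
        return ip in self._blocked

# A blackhat caught on one site is blocked on another site's first check:
service = NetworkBlocklist()
service.report("203.0.113.9")                # caught on site A
blocked = service.is_blocked("203.0.113.9")  # site B's very first check
```

Because every site asks the same shared service, a block recorded anywhere takes effect everywhere immediately.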
Stay tuned folks
Ohhh.... last but not least:
If you have any comments or questions, don't hesitate to post.