In our ongoing commitment to protect your websites from unwanted traffic, we’ve recently introduced a new ruleset in the Web Application Firewall (WAF) configuration named “Bad Crawler Protection.” This update replaces the legacy “RBL Protection” ruleset, which we phased out after client complaints about its tendency to produce false positives and the difficulty of debugging the issues it caused.
Why We Replaced RBL Protection
While effective in some scenarios, the “RBL Protection” rules proved problematic for many of our clients. False positives were a common issue, which led to legitimate users being blocked from accessing sites. Furthermore, when issues arose, the process of understanding and debugging these problems was cumbersome and time-consuming.
Introducing Bad Crawler Protection
The new “Bad Crawler Protection” ruleset aims to address these issues by focusing on blocking hits from automated bots and bad crawlers. The list is dynamically updated based on evidence and access logs that we receive. You also have the flexibility to add exceptions by whitelisting specific rule IDs within the WAF settings.
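To make the idea of rule-ID exceptions concrete, here is a minimal, purely illustrative sketch of how a whitelist of rule IDs behaves. The names and IDs below (EXCLUDED_RULE_IDS, should_block, 100123, 100456) are hypothetical and are not part of the cPGuard interface; in practice you simply add the exceptions from the WAF settings page.

```python
# Purely illustrative sketch of rule-ID based exceptions; not the cPGuard API.
# In cPGuard, exceptions are added from the WAF settings page instead.

# Rule IDs the site owner has chosen to whitelist (hypothetical values).
EXCLUDED_RULE_IDS = {100123, 100456}

def should_block(matched_rule_ids):
    """Block only if at least one matched rule is not on the exception list."""
    return bool(set(matched_rule_ids) - EXCLUDED_RULE_IDS)

print(should_block({100123}))           # False: the only matched rule is whitelisted
print(should_block({100123, 200789}))   # True: a non-excluded rule also matched
```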
Blocking AI Bots
A significant enhancement in the “Bad Crawler Protection” ruleset is its ability to block AI bots. This reduces unwanted traffic and preserves your website’s resources. Currently, we block the following AI bots (a small illustrative sketch follows the list):
- mj12bot
- blexbot
- claudebot
- bytespider
- gptbot
- imagesiftbot
- ccbot
- chatgpt
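For readers curious about the mechanics, the sketch below shows one common way such a list can be matched against a request’s User-Agent header. This is an illustration under stated assumptions, not cPGuard’s actual implementation; the function name is_bad_crawler and the pattern-building approach are hypothetical, and the real list is maintained and updated on our side.

```python
import re

# Hypothetical illustration of user-agent matching; the real "Bad Crawler
# Protection" list is maintained and dynamically updated by cPGuard.
BLOCKED_AI_BOTS = [
    "mj12bot", "blexbot", "claudebot", "bytespider",
    "gptbot", "imagesiftbot", "ccbot", "chatgpt",
]

# Case-insensitive pattern that matches any of the listed bot tokens.
BOT_PATTERN = re.compile("|".join(map(re.escape, BLOCKED_AI_BOTS)), re.IGNORECASE)

def is_bad_crawler(user_agent):
    """Return True if the User-Agent string contains a blocked bot token."""
    return bool(BOT_PATTERN.search(user_agent or ""))

print(is_bad_crawler("Mozilla/5.0 AppleWebKit/537.36; GPTBot/1.0"))  # True
print(is_bad_crawler("Mozilla/5.0 (Windows NT 10.0; Win64; x64)"))   # False
```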
Future Updates and Enhancements
We will continue to analyze access logs and add more bots to this list in future updates, so that your website remains protected from new and evolving threats with just a single click to enable the ruleset.
Commitment to Website Security
Our dedication to enhancing website security and stopping bad traffic is unwavering. You can expect more related features and enhancements to be added to cPGuard in the future. Stay tuned for updates and ensure your websites are protected from malicious bots and crawlers.