As described in Traffic Concepts, Reblaze includes an Active Challenge process out of the box. This is very useful for distinguishing humans from bots.
Active Challenges work well, but an even better option is Passive Challenges.
Active Challenges will not occur when site content is served from a CDN. Passive Challenges can still detect bots in this situation.
Most importantly, Passive Challenges allow Reblaze to use Biometric Bot Detection—an advanced and sophisticated means of distinguishing humans from automated traffic sources.
With Biometric Bot Detection, Reblaze continually gathers and analyzes client-side statistics such as I/O events triggered by the user's keyboard, mouse, scrolling, touches, zooming, device orientation and movement, and more. Based on these metrics, the platform uses machine learning to construct and maintain behavioral profiles of legitimate human visitors. Reblaze learns and understands how actual humans interact with the web apps it is protecting. Continuous multivariate analysis verifies that each user is indeed conforming to expected behavioral patterns, and is thus a human user with legitimate intentions.
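To make the idea concrete, here is a minimal, illustrative sketch of the kind of raw signal collection involved. All names here are hypothetical; this is not Reblaze's actual script, only an example of accumulating client-side I/O event counts that a behavioral-analysis backend could consume.

```javascript
// Hypothetical sketch: count client-side I/O events per type.
// A real biometric system would gather far richer data (timings,
// trajectories, pressure, orientation) and analyze it server-side.
function createProfile() {
  return { keydown: 0, mousemove: 0, scroll: 0, touchstart: 0 };
}

function recordEvent(profile, type) {
  // Ignore event types the profile does not track.
  if (type in profile) profile[type] += 1;
  return profile;
}

// In a browser, the profile would be wired to real DOM events, e.g.:
//   const profile = createProfile();
//   for (const t of Object.keys(profile)) {
//     window.addEventListener(t, () => recordEvent(profile, t), { passive: true });
//   }
// and periodically reported to an analysis endpoint.
```

A genuine human produces a steady, irregular stream of such events; most bots produce few or none, or produce them in patterns that fail the multivariate analysis described above.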
If desired, the script tags can include the async and/or defer attributes. These are usually not necessary, and their effect will depend on the placement of the script within the page. Their use is left to your discretion.
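For example, the script tag could be included in the page head as follows. The script URL below is a placeholder; use the one provided for your deployment.

```html
<head>
  <!-- Placeholder URL: substitute the script source for your deployment. -->
  <!-- async and/or defer are optional; their effect depends on placement. -->
  <script src="https://example.com/challenge.js" async defer></script>
</head>
```

With async, the script downloads without blocking parsing and runs as soon as it is ready; with defer, it runs after the document has been parsed. A plain script tag near the end of the body behaves similarly to defer, which is why these attributes are often unnecessary.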
There are two primary situations where customers sometimes want to disable Active Challenges:
* When a customer needs site analytics to correctly reflect all referrers. (Active Challenges can interfere with this.)
* For API endpoints. Active Challenges are designed to verify the client's browser environment; for most API calls, there is no browser environment to verify. (For users of our Mobile SDK, this is not a problem; they can still use Active Challenges for these endpoints.)
Other than those situations, Active Challenges can be very beneficial.
If you wish to turn off Active Challenges, do the following.
* For an entire site/application: remove the "Deny Bot" ACL Policy from all Profiles within the site.
* For specific traffic sources: add an ACL Policy that will "Allow" those specific requestors. The requestors should be defined as narrowly as possible.
If you merely remove the Deny Bot ACL Policy from the relevant Profiles, bots will still be excluded by the other active ACL Policies, Dynamic Rules, content filtering, and so on. If instead you add an "Allow" ACL Policy for specific requestors, the other ACL Policies will not block those requestors; they will be exempted from ACL Policy filtering entirely.