Reblaze Product Manual

Enabling Passive Challenges

As described in Traffic Concepts, Reblaze includes an Active Challenge process out of the box. This is very useful for distinguishing humans from bots.
Active Challenges work well, but an even better option is Passive Challenges.
  • Active Challenges temporarily redirect the user's browser. This can affect site metrics gathered by products such as Google Analytics. (Specifically, the initial referrer information is lost.) Passive Challenges are simple pieces of JavaScript. They do not redirect the user's browser; they merely ask it to solve a challenge and then insert the Reblaze cookies.
  • Active Challenges will not occur when site content is served from a CDN. Passive Challenges can still detect bots in this situation.
Most importantly, Passive Challenges allow Reblaze to use Biometric Bot Detection: an advanced means of distinguishing humans from automated traffic sources.

Biometric Bot Detection

With Biometric Bot Detection, Reblaze continually gathers and analyzes client-side metrics such as I/O events triggered by the user's keyboard, mouse, scrolling, touch, zoom, device orientation, movements, and more. Based on these metrics, the platform uses machine learning to construct and maintain behavioral profiles of legitimate human visitors; Reblaze learns and understands how actual humans interact with the web apps it is protecting. Continuous multivariate analysis verifies that each user conforms to expected behavioral patterns, and is thus a human user with legitimate intentions.
We recommend that all customers enable Passive Challenges if it is possible to do so. Biometric Bot Detection provides much more robust detection of automated traffic than is possible without it.


Implementing Passive Challenges

Implementation is simple. Place this JavaScript snippet within the pages of your web applications:
<script src="/c3650cdf-216a-4ba2-80b0-9d6c540b105e58d2670b-ea0f-484e-b88c-0e2c1499ec9bd71e4b42-8570-44e3-89b6-845326fa43b6" type="text/javascript"></script>
The code snippet can go into the header or at the end of the page. The best practice is to place it within the header. This ensures that subsequent calls contain authenticating cookies.
(This matters for the first page served to a visitor. Subsequent pages will already have the authenticating cookies to include with their requests.)
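For example, a minimal page with the snippet placed in the header might look like the following. (The page markup is illustrative; the script URL is the one shown above.)

```html
<!DOCTYPE html>
<html>
  <head>
    <meta charset="utf-8">
    <title>Example Page</title>
    <!-- Reblaze Passive Challenge snippet, placed in the header so that
         the authenticating cookies are set before subsequent requests -->
    <script src="/c3650cdf-216a-4ba2-80b0-9d6c540b105e58d2670b-ea0f-484e-b88c-0e2c1499ec9bd71e4b42-8570-44e3-89b6-845326fa43b6" type="text/javascript"></script>
  </head>
  <body>
    <!-- page content -->
  </body>
</html>
```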
Most customers set up the code snippet as a tag within their tag manager. This makes it simple to install it instantly across their entire site/application.
If desired, the script code can include async and/or defer attributes:
<script async src="/c3650cdf-216a-4ba2-80b0-9d6c540b105e58d2670b-ea0f-484e-b88c-0e2c1499ec9bd71e4b42-8570-44e3-89b6-845326fa43b6" type="text/javascript"></script>
<script defer src="/c3650cdf-216a-4ba2-80b0-9d6c540b105e58d2670b-ea0f-484e-b88c-0e2c1499ec9bd71e4b42-8570-44e3-89b6-845326fa43b6" type="text/javascript"></script>
These usually are not necessary, and their effect will depend on the placement of the script within the page. Their use is left to your discretion.


To test the implementation, use a browser to visit a page containing the JavaScript snippet. Once it runs, the browser should have a cookie named rbzid.
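As a quick check, a small helper like the following can be pasted into the browser's developer console. (This is a sketch; the helper name is ours, and only the rbzid cookie name comes from Reblaze.)

```javascript
// Returns true if a cookie named "rbzid" is present in a
// document.cookie-style string ("name=value; name2=value2").
function hasRbzidCookie(cookieString) {
  return cookieString.split(';').some(function (part) {
    return part.trim().indexOf('rbzid=') === 0;
  });
}

// In a browser, check the live cookies; guarded so the snippet
// can also load outside a browser environment.
if (typeof document !== 'undefined') {
  console.log(hasRbzidCookie(document.cookie)
    ? 'Passive challenge verified: rbzid cookie found'
    : 'rbzid cookie not found');
}
```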

Disabling Active Challenges (Optional)

There are two primary situations where customers sometimes want to disable Active Challenges:
  • When a customer needs site analytics to correctly reflect all referrers. (Active Challenges can interfere with this.)
  • For API endpoints. Active Challenges are designed to verify the client's browser environment; for most API calls, there is no browser environment to verify. (For users of our Mobile SDK, this is not a problem. They can still use active challenges for these endpoints.)
Other than those situations, Active Challenges can be very beneficial.
We recommend that you keep Active Challenges enabled if possible. They automatically eliminate almost all DDoS traffic, scanning tools, and other hostile bot traffic.
If you wish to turn off Active Challenges, do the following.
  • For specific URLs/locations: remove the default "Deny Bot" ACL Policy from all Profiles that are assigned to those locations.
  • For an entire site/application: remove the "Deny Bot" ACL Policy from all Profiles within the site.
  • For specific traffic sources: add an ACL Policy that will 'Allow' those specific requestors. The requestors should be defined as narrowly as possible.
Turning off Active Challenges will disable the direct blocking of bots (where a requestor is blocked merely for being identified as a bot). However, automated traffic will still be excluded via all other relevant means.
If you merely remove the Deny Bot ACL Policy from the relevant Profiles, then bots will still be excluded by the other active ACL Policies, Dynamic Rules, content filtering, and so on. If instead you add an "Allow" ACL Policy for specific requestors, then other ACL Policies will not block those requestors; they will be exempted from ACL Policy filtering.
Disabling Active Challenges is not recommended unless you have enabled Passive Challenges and successfully tested them.