A fundamental part of traffic evaluation
Version 2.14 introduced a significantly changed workflow for Reblaze's traffic processing. Some of the features in this section of the UI are included mainly for backward compatibility; specific information is noted in the discussion of each feature.
As mentioned earlier in Security Section Concepts, Reblaze's decision-making can vary depending on the context. In a typical Reblaze deployment, much of the traffic evaluation is done using Profiles. When Reblaze receives a request for a web resource, it first determines the Profile that is in effect for that resource.
Reblaze's Profiles are hierarchical structures, so you can set up your security framework in a modular fashion. Rules and collections of rules can be defined once and re-used throughout your planet as needed.
The hierarchy has several levels:
Profile
Policy
Rule
Condition and Operation
A Profile contains one or more Policies. A Policy contains one or more Rules. A Rule is a combination of a condition and an operation. Let's illustrate these with examples, from the bottom of the hierarchy upward.
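Before working through the examples, the containment relationships can be visualized with a brief sketch. The Python below is purely illustrative; the class names, the request representation, and the Operation values are hypothetical and do not reflect Reblaze's actual configuration schema or API:

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import Callable, List

# Hypothetical model of the hierarchy, for illustration only.

class Operation(Enum):
    ALLOW = "allow"    # accept the request
    DENY = "deny"      # block the request
    BYPASS = "bypass"  # accept, and skip further evaluation or filtering

@dataclass
class Rule:
    # A condition is any test applied to the incoming request
    # (country, ASN, IP range, URI, method, cookies, and so on).
    condition: Callable[[dict], bool]
    operation: Operation

@dataclass
class Policy:
    name: str
    rules: List[Rule] = field(default_factory=list)      # one or more Rules

@dataclass
class Profile:
    name: str
    policies: List[Policy] = field(default_factory=list)  # one or more Policies
```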
Example conditions:
Is the request coming from a specific country? Or ASN?
Is the request coming from a specific IP address?
Is the request coming from a proxy? Or a VPN? Or a Tor network?
Is the request for a specific URI?
Is the request originating from an allowed (or a disallowed) HTTP referer?
Does the request contain (or not contain) specific arguments?
Is the request using (or not using) a specific method or protocol?
Does the request contain (or not contain) a query string in a specific format?
Does the requesting client have (or not have) specific cookies, or a specific cookie structure?
Possible operations:
Deny
Allow
Bypass (similar to "Allow", but the request will not be subject to further evaluation or filtering by other rules)
Example Rules:
If the requestor IP is within 131.253.24.0/22 [a known range of Bing crawlers], Allow.
If the requestor IP is within 157.55.39.0/24 [a known range of Bing crawlers], Allow.
If the requestor IP is within 1.10.16.0/20 [a range within the Spamhaus DROP list], Deny.
If the requestor IP is within 1.32.128.0/18 [a range within the Spamhaus DROP list], Deny.
If the requestor is a bot, Deny.
If the requestor is using a proxy anonymizer, Deny.
If the requestor's Company is [our company], Bypass.
If the requestor submitted an HTTP/S request, Deny.
Example Policies:
Allow Bing Crawlers [contains example Rules 1-2 above]
Deny requestors on Spamhaus DROP list [contains example Rules 3-4]
Deny bots [contains example Rule 5]
Deny proxy users [contains example Rule 6]
Allow users from our company [contains example Rule 7]
Deny all requests [contains example Rule 8]
Example Profiles:
Default:
Allow Bing Crawlers
Deny requestors on Spamhaus DROP list
Deny bots
Deny proxy users
Private area of our website, for internal use only:
Allow users from our company
Deny all requests**
** "Allow" Policies are evaluated before "Deny" Policies. When a match is found, no further evaluation is performed. In this example, company users will be Allowed, which exempts them from the Policy which Denies all requests.
Note that while Profiles are defined in this section of the Reblaze UI, they are assigned to specific resources/locations in the Security Profiles section of the Web Proxy page.