A fundamental part of traffic evaluation
Reblaze's Profiles are hierarchical structures, allowing you to set up your security framework in a modular fashion. Rules and collections of rules can be defined once, then reused throughout your planet as needed.
The hierarchy has several levels:
Profile
Policy
Rule
Condition and Operation
A Profile contains one or more Policies. A Policy contains one or more Rules. A Rule is a combination of a condition and an operation. Let's illustrate these with examples, from the bottom of the hierarchy upward.
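To make the hierarchy concrete, here is a minimal sketch of how these objects could be modeled in Python. The class and field names below are illustrative assumptions for this page only, not Reblaze's actual configuration schema.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Callable, List


@dataclass
class Request:
    """Minimal stand-in for an incoming HTTP request (illustrative only)."""
    ip: str
    uri: str = "/"
    method: str = "GET"


class Operation(Enum):
    DENY = "deny"
    ALLOW = "allow"
    BYPASS = "bypass"  # like ALLOW, but exempt from any further evaluation or filtering


# A condition is a predicate evaluated against the incoming request.
Condition = Callable[[Request], bool]


@dataclass
class Rule:
    condition: Condition   # e.g. "is the source IP inside this CIDR range?"
    operation: Operation   # what to do when the condition matches


@dataclass
class Policy:
    name: str
    rules: List[Rule]      # a Policy contains one or more Rules


@dataclass
class Profile:
    name: str
    policies: List[Policy] # a Profile contains one or more Policies
```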
Example conditions (a few of these are sketched in code after the list):
Is the request coming from a specific country? Or ASN?
Is the request coming from a specific IP address?
Is the request coming from a proxy? Or a VPN? Or a Tor network?
Is the request for a specific URI?
Is the request originating from an allowed (or a disallowed) HTTP referer?
Does the request contain (or not contain) specific arguments?
Is the request using (or not using) a specific method or protocol?
Does the request contain (or not contain) a query string in a specific format?
Does the requesting client have (or not have) specific cookies, or a specific cookie structure?
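A few of the conditions above can be expressed as simple predicates over the Request object from the sketch. The helper names (ip_in_range, uri_equals, method_is) are assumptions for illustration, not part of Reblaze's API.

```python
import ipaddress


def ip_in_range(cidr: str) -> Condition:
    """Condition: is the request's source IP inside the given CIDR range?"""
    network = ipaddress.ip_network(cidr)
    return lambda request: ipaddress.ip_address(request.ip) in network


def uri_equals(path: str) -> Condition:
    """Condition: is the request for a specific URI?"""
    return lambda request: request.uri == path


def method_is(method: str) -> Condition:
    """Condition: is the request using a specific HTTP method?"""
    return lambda request: request.method.upper() == method.upper()
```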
Possible operations:
Deny
Allow
Bypass (similar to "Allow", but the request will not be subject to further evaluation or filtering by other rules)
Example Rules (a few of these are sketched in code after the list):
If the requestor IP is within 131.253.24.0/22 [a known range of Bing crawlers], Allow.
If the requestor IP is within 157.55.39.0/24 [a known range of Bing crawlers], Allow.
If the requestor IP is within 1.10.16.0/20 [a range within the Spamhaus DROP list], Deny.
If the requestor IP is within 1.32.128.0/18 [a range within the Spamhaus DROP list], Deny.
If the requestor is a bot, Deny.
If the requestor is using a proxy anonymizer, Deny.
If the requestor's Company is [our company], Bypass.
If the requestor submitted an HTTP/S request, Deny.
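Using the Rule class and the ip_in_range helper from the sketches above, some of these example Rules could be written as follows. This is illustrative only, not Reblaze's configuration format.

```python
rule_1 = Rule(ip_in_range("131.253.24.0/22"), Operation.ALLOW)  # known Bing crawler range
rule_2 = Rule(ip_in_range("157.55.39.0/24"), Operation.ALLOW)   # known Bing crawler range
rule_3 = Rule(ip_in_range("1.10.16.0/20"), Operation.DENY)      # Spamhaus DROP range
rule_4 = Rule(ip_in_range("1.32.128.0/18"), Operation.DENY)     # Spamhaus DROP range

# Rule 8 ("If the requestor submitted an HTTP/S request, Deny") matches every request:
rule_8 = Rule(lambda request: True, Operation.DENY)
```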
Example Policies (sketched in code after the list):
Allow Bing Crawlers [contains example Rules 1-2 above]
Deny requestors on Spamhaus DROP list [contains example Rules 3-4]
Deny bots [contains example Rule 5]
Deny proxy users [contains example Rule 6]
Allow users from our company [contains example Rule 7]
Deny all requests [contains example Rule 8]
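Continuing the sketch, these Rules are grouped into Policies:

```python
allow_bing = Policy("Allow Bing Crawlers", [rule_1, rule_2])
deny_spamhaus_drop = Policy("Deny requestors on Spamhaus DROP list", [rule_3, rule_4])
deny_all = Policy("Deny all requests", [rule_8])
```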
Example Profiles:
Default:
Allow Bing Crawlers
Deny requestors on Spamhaus DROP list
Deny bots
Deny proxy users
Private area of our website, for internal use only:
Allow users from our company
Deny all requests**
** "Allow" Policies are evaluated before "Deny" Policies. When a match is found, no further evaluation is performed. In this example, company users will be Allowed, which exempts them from the Policy which Denies all requests.