A fundamental part of traffic evaluation
As mentioned earlier in Security Section Concepts, Reblaze's decision-making can vary depending on the context. In a typical Reblaze deployment, much of the traffic evaluation is done using Profiles. When Reblaze receives a request for web resources, it first determines the Profile that is in effect for the resource that was requested.
Reblaze's Profiles are hierarchical structures, so that you can set up your security framework in a modular fashion. Rules and collections of rules can be set up once, and re-used throughout your planet as needed.
The hierarchy has several levels:

- Profile
- Policy
- Rule
- Condition and Operation

A Profile contains one or more Policies. A Policy contains one or more Rules. A Rule is a combination of a condition and an operation. Let's illustrate these with examples, from the bottom of the hierarchy upward.
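The containment relationships above can be sketched as a simple data model. This is an illustrative sketch only — the class and field names below are assumptions for clarity, not Reblaze's actual internal representation:

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List

# Illustrative names only -- not Reblaze's internal data model.

@dataclass
class Rule:
    condition: Callable[[Dict], bool]  # predicate evaluated against a request
    operation: str                     # e.g. "Allow", "Deny", or "Bypass"

@dataclass
class Policy:
    name: str
    rules: List[Rule] = field(default_factory=list)

@dataclass
class Profile:
    name: str
    policies: List[Policy] = field(default_factory=list)

# A Profile contains Policies, and a Policy contains Rules:
deny_bots = Policy("Deny bots", [Rule(lambda req: req.get("is_bot", False), "Deny")])
profile = Profile("Example profile", [deny_bots])
```

Because Rules and Policies are defined independently of any Profile, the same Policy object can be reused across many Profiles — the modularity described above.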
A condition can be based on many characteristics of the incoming request. For example:

- Is the request coming from a specific country? Or ASN?
- Is the request coming from a specific IP address?
- Is the request coming from a proxy? Or a VPN? Or a Tor network?
- Is the request for a specific URI?
- Is the request originating from an allowed (or a disallowed) HTTP referer?
- Does the request contain (or not contain) specific arguments?
- Is the request using (or not using) a specific method or protocol?
- Does the request contain (or not contain) a query string in a specific format?
- Does the requesting client have (or not have) specific cookies, or a specific cookie structure?
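As a concrete illustration of one of these condition types (matching the requestor's IP against a known range), a minimal check can be written with Python's standard `ipaddress` module. The CIDR block below is the reserved documentation range, used here as a placeholder:

```python
import ipaddress

def ip_in_range(client_ip: str, cidr: str) -> bool:
    """Return True if the client IP falls inside the given CIDR block."""
    return ipaddress.ip_address(client_ip) in ipaddress.ip_network(cidr, strict=False)

# A condition like "is the requestor IP within 192.0.2.0/24?"
print(ip_in_range("192.0.2.17", "192.0.2.0/24"))    # True
print(ip_in_range("198.51.100.1", "192.0.2.0/24"))  # False
```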
An operation is the action taken when the condition is met:

- Allow
- Deny
- Bypass (similar to "Allow", but the request will not be subject to further evaluation or filtering by other rules)
Combining a condition with an operation produces a Rule. Some example Rules:

1. If the requestor IP is within 188.8.131.52/22 [a known range of Bing crawlers], Allow.
2. If the requestor IP is within 184.108.40.206/24 [a known range of Bing crawlers], Allow.
3. If the requestor IP is within 220.127.116.11/20 [a range within the Spamhaus DROP list], Deny.
4. If the requestor IP is within 18.104.22.168/18 [a range within the Spamhaus DROP list], Deny.
5. If the requestor is a bot, Deny.
6. If the requestor is using a proxy anonymizer, Deny.
7. If the requestor's company is [our company], Bypass.
8. If the requestor submitted an HTTP/S request, Deny.
Rules are grouped into Policies. Some example Policies:

- Allow Bing Crawlers [contains example Rules 1-2 above]
- Deny requestors on Spamhaus DROP list [contains example Rules 3-4]
- Deny bots [contains example Rule 5]
- Deny proxy users [contains example Rule 6]
- Allow users from our company [contains example Rule 7]
- Deny all requests [contains example Rule 8]
Policies are combined into Profiles. Some example Profiles:

- Public areas of our website:
  - Allow Bing Crawlers
  - Deny requestors on Spamhaus DROP list
  - Deny bots
  - Deny proxy users
- Private area of our website, for internal use only:
  - Allow users from our company
  - Deny all requests**
** "Allow" Policies are evaluated before "Deny" Policies. When a match is found, no further evaluation is performed. In this example, company users will be Allowed, which exempts them from the Policy that Denies all requests.
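The evaluation order described in this footnote can be sketched as follows. This is a hypothetical helper illustrating "Allow before Deny, first match wins" — it is not Reblaze's actual code, and each policy is simplified to a list of (condition, operation) pairs:

```python
def evaluate(allow_policies, deny_policies, request):
    """Return the operation of the first matching rule, or None.

    "Allow" policies are visited before "Deny" policies, and evaluation
    stops at the first rule whose condition matches the request.
    """
    for policy in allow_policies + deny_policies:
        for condition, operation in policy:
            if condition(request):
                return operation  # no further evaluation after a match
    return None  # no rule matched

# The company-users example above: an employee matches the Allow policy
# first, so the catch-all Deny policy is never reached.
allow_company = [(lambda req: req.get("company") == "our-company", "Allow")]
deny_all = [(lambda req: True, "Deny")]

print(evaluate([allow_company], [deny_all], {"company": "our-company"}))  # Allow
print(evaluate([allow_company], [deny_all], {"company": "other"}))        # Deny
```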