Blocking Unwanted Traffic
Even with good caching in place, you might still get caught out. POST requests and query-string requests that are processed server side can be a problem, as can requests to legitimate endpoints that consume server-side resources.
Here are some battle-tested firewall rule concepts that have worked well in real-world scenarios:
Unwanted Requests
Block requests that you know your site will never handle. Attackers and automated probes often request files like .php, .asp, or known CMS paths looking for vulnerabilities. If your site doesn’t use these technologies, simply block such requests at the firewall, instead of letting them hit your web app and trigger 404s.
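These rules belong at the CDN or WAF edge, but the same concept can be sketched as application middleware if you want a second line of defence behind the firewall. Here's a minimal ASP.NET Core sketch; the extension and path lists are purely illustrative and should be replaced with whatever your own site genuinely never serves.

```csharp
// Application-level fallback for the "unwanted requests" rule. The real rule
// should live at the WAF/CDN so these requests never reach the origin at all.
var app = WebApplication.CreateBuilder(args).Build();

// Illustrative lists: extensions and paths this (hypothetical) site never serves.
string[] blockedExtensions = { ".php", ".asp", ".aspx", ".cgi" };
string[] blockedPrefixes = { "/wp-admin", "/wp-login.php", "/xmlrpc.php" };

app.Use(async (context, next) =>
{
    var path = context.Request.Path.Value ?? string.Empty;

    bool unwanted =
        blockedExtensions.Any(ext => path.EndsWith(ext, StringComparison.OrdinalIgnoreCase)) ||
        blockedPrefixes.Any(prefix => path.StartsWith(prefix, StringComparison.OrdinalIgnoreCase));

    if (unwanted)
    {
        // Short-circuit before the request ever reaches routing, controllers or the CMS.
        context.Response.StatusCode = StatusCodes.Status403Forbidden;
        return;
    }

    await next();
});

app.Run();
```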
Unknown POST Requests
POST requests are a common attack vector because they bypass CDN caching, so it's important to handle them deliberately. The following two rules have, on their own, taken care of several DDoS attacks in our setups:
- Allow only known POST endpoints by explicitly whitelisting them.
- Add a catch-all rule that blocks all other POST traffic.
This simple two-rule setup filters out a large volume of illegitimate requests while keeping your genuine form submissions working normally. If attackers sniff out legitimate POST endpoints and use those in an attack, you can at least block the attack by temporarily disabling the allow rule; if you can live with those POST endpoints being unavailable for a while, that may well be better than the entire site being unavailable.
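For illustration, the same allow-then-block logic could look like this as middleware added to the minimal pipeline sketched earlier. The endpoint paths are hypothetical placeholders for your own known POST routes; in practice you would express these two rules in your WAF instead.

```csharp
// Hypothetical allow-list of the POST endpoints the site actually uses.
string[] allowedPostPaths =
{
    "/umbraco/surface/contactform/submit",   // placeholder form endpoint
    "/api/newsletter/subscribe"              // placeholder API endpoint
};

app.Use(async (context, next) =>
{
    if (HttpMethods.IsPost(context.Request.Method))
    {
        var path = context.Request.Path.Value ?? string.Empty;

        // Rule 1: explicitly allow known POST endpoints.
        bool allowed = allowedPostPaths.Any(p =>
            path.StartsWith(p, StringComparison.OrdinalIgnoreCase));

        // Rule 2: catch-all block for every other POST request.
        if (!allowed)
        {
            context.Response.StatusCode = StatusCodes.Status403Forbidden;
            return;
        }
    }

    await next();
});
```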
Non‑Conformant Image Requests
If you’re using ImageSharp in Umbraco for image transformations, make sure HMAC protection is enabled. This ensures each image URL carries a secure signature.
Then, set up a rule that blocks any image request under /media that:
- Includes a query string but lacks a valid HMAC parameter, or
- Uses query parameters outside your known, allowed ImageSharp configuration.
This helps stop attackers from generating endless transformation requests and consuming your server’s resources.
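As a rough sketch of the same rule in middleware form, the check could look like the following. It assumes ImageSharp.Web-style query commands and an "hmac" signature parameter; verify the exact parameter names against your ImageSharp.Web/Umbraco version, and trim the allowed command list to what your site actually uses.

```csharp
// Assumed ImageSharp command names plus the HMAC signature parameter; these are
// examples, not a definitive list for every ImageSharp.Web version.
string[] allowedImageCommands = { "width", "height", "rmode", "format", "quality", "hmac" };

app.Use(async (context, next) =>
{
    var request = context.Request;

    if (request.Path.StartsWithSegments("/media") && request.Query.Count > 0)
    {
        bool hasHmac = request.Query.ContainsKey("hmac");
        bool onlyKnownCommands = request.Query.Keys.All(key =>
            allowedImageCommands.Contains(key, StringComparer.OrdinalIgnoreCase));

        if (!hasHmac || !onlyKnownCommands)
        {
            // Reject transformation requests that were not generated by the site itself.
            context.Response.StatusCode = StatusCodes.Status403Forbidden;
            return;
        }
    }

    await next();
});
```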
Umbraco Backoffice
The Umbraco backoffice should never be publicly accessible if you can avoid it.
Best practices:
- Restrict access by IP allowlisting (e.g., limit access to your company’s VPN or static external IPs).
- Use Umbraco’s load balancing capabilities to completely disable the backoffice on Internet-facing environments.
- Let editors connect through a private management environment instead.
This not only strengthens security but also keeps admin endpoints out of harm’s way during spikes or attacks.
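If you cannot allow-list at the network or WAF layer, a crude application-level fallback is to restrict the backoffice path to known addresses. The IPs below are placeholders, and this does not replace disabling the backoffice on front-end servers through Umbraco's load balancing setup.

```csharp
// Placeholder IPs: replace with your office/VPN egress addresses. Behind a CDN
// or proxy you must configure forwarded headers for RemoteIpAddress to reflect
// the real client IP.
var allowedBackofficeIps = new[]
{
    System.Net.IPAddress.Parse("203.0.113.10"),   // example office egress IP
    System.Net.IPAddress.Parse("198.51.100.25")   // example VPN gateway
};

app.Use(async (context, next) =>
{
    if (context.Request.Path.StartsWithSegments("/umbraco"))
    {
        var remoteIp = context.Connection.RemoteIpAddress;

        if (remoteIp is null || !allowedBackofficeIps.Any(ip => ip.Equals(remoteIp)))
        {
            // Return 404 rather than 403 so the backoffice isn't advertised.
            context.Response.StatusCode = StatusCodes.Status404NotFound;
            return;
        }
    }

    await next();
});
```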
Throttle the Rest
Even with all these protections, some unwanted traffic might still get through.
A throttling rule acts as a safety net, limiting the number of requests a single IP can make per minute.
But be careful: this can be tricky to tune. For example, one IP might represent an entire company (e.g., a corporate NAT gateway), so many legitimate users can share it. If each page load triggers ~25 requests (HTML, JS, CSS, images, etc.), just 10 users behind the same IP, each loading a page within the same minute, produce 250 requests in that minute and could hit your throttle limit.
To calibrate safely:
- Start with the rule in logging mode (many WAFs support this).
- Collect data for a few days or weeks.
- Adjust thresholds and move to enforcement once you’re confident it won’t block normal users.
If your logs show throttling across static assets (JS, CSS, fonts), that's usually a sign your limits are too strict, because DDoS attacks typically do not request static assets.
Also note that these rules may block AI scraper bots. Over the past six months or more, we've frequently seen such scrapers requesting large numbers of pages at such high frequency that throttling rules kick in. Whether that's a problem or a feature may depend on your site's visibility and your stance on data harvesting!
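The throttle ideally lives at the WAF or CDN, but if you want a comparable safety net at the application level for traffic that reaches the origin anyway, .NET 7+ ships a built-in rate-limiting middleware that can be partitioned per client IP. A minimal sketch follows; the numbers are placeholder starting points to calibrate as described above, and behind a CDN the remote IP will be the edge's address unless forwarded headers are configured.

```csharp
using System.Threading.RateLimiting;

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddRateLimiter(options =>
{
    options.RejectionStatusCode = StatusCodes.Status429TooManyRequests;

    // Partition requests by client IP and allow a fixed number per minute.
    options.GlobalLimiter = PartitionedRateLimiter.Create<HttpContext, string>(context =>
        RateLimitPartition.GetFixedWindowLimiter(
            partitionKey: context.Connection.RemoteIpAddress?.ToString() ?? "unknown",
            factory: _ => new FixedWindowRateLimiterOptions
            {
                PermitLimit = 300,                 // placeholder: requests per window, per IP
                Window = TimeSpan.FromMinutes(1),
                QueueLimit = 0                     // reject rather than queue excess requests
            }));
});

var app = builder.Build();

app.UseRateLimiter();
app.MapGet("/", () => "Hello");
app.Run();
```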
In Summary
Blocking rules complement caching by dealing with traffic that can’t be offloaded. The key steps are to:
- Know your legitimate endpoints, and block as many illegitimate requests as possible.
- For POST requests and endpoints that process query strings, explicitly allow only known endpoints.
- Throttle everything else.
And remember: many traditional POST submissions can be replaced with client-side API calls that keep your backend endpoints lean and cache-friendly. The fewer endpoints with heavy processing your server has, the harder it becomes for attackers to take it down using sheer traffic volume.