Thwarting DDoS attacks is good for tier 1 content providers - and not for the reason you're probably thinking. I was recently talking with someone who had just left the content provider world, and he told me an interesting story about how peering relationships are actually affected by DDoS (distributed denial of service) attacks. Say for a moment that you're providing hosting for some of the largest companies in the world. Chances are you are pushing a lot more packets out to users on the Internet than you are receiving.
It turns out that tier 1 peering agreements are structured so that both parties want the ratio of ingress to egress traffic to stay as close to 1 to 1 as possible. If you as a content provider drift too far from that ratio, you can get re-negotiated down to tier 2 or tier 3 status, with far worse rates, weaker service level agreements, and lower quality of service. The problem is that a content provider typically receives a roughly 1 KB request inbound and pushes out dozens or hundreds of packets in response. Hardly 1 to 1! Kiss your tier 1 status goodbye!
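To make that imbalance concrete, here's a back-of-the-envelope sketch. All the numbers (request size, response size, MTU) are illustrative assumptions, not figures from any real peering agreement:

```python
# Back-of-the-envelope egress/ingress packet ratio for a hypothetical
# content provider. All numbers here are illustrative assumptions.

MTU = 1500  # typical Ethernet payload size in bytes


def packets_needed(num_bytes, mtu=MTU):
    """Packets required to carry num_bytes, assuming full-size frames."""
    return -(-num_bytes // mtu)  # ceiling division


request_bytes = 1_000      # ~1 KB inbound HTTP request
response_bytes = 200_000   # ~200 KB page plus assets outbound (assumed)

inbound_pkts = packets_needed(request_bytes)    # 1 packet
outbound_pkts = packets_needed(response_bytes)  # 134 packets

print(f"outbound:inbound packet ratio = {outbound_pkts}:{inbound_pkts}")
```

Even with these modest assumptions the ratio lands around 134:1 on the egress side - nowhere near the 1:1 the peering agreement wants.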
So in comes a DDoS attack. Generally speaking, DDoS attacks try to consume bandwidth or system resources. Either way, they tend to generate a great deal of outbound packets as your systems try to respond to the attack traffic. Left unchecked, this makes your ratio of inbound to outbound packets even worse - to say nothing of the more obvious harm of your customers not being able to reach your sites!
So off you go to find an anti-DDoS vendor that fits your needs, and after deploying it you suddenly realize you are no longer answering the attack with packets of your own. Instead the ratio moves closer to equilibrium: roughly as many inbound packets as outbound. The inbound packets are a mix of good and bad traffic, of course, but maintaining that tier 1 relationship is key for many large content providers - so the same DDoS attack (and thwarting it) suddenly becomes a huge business asset.
Now, that leads us to the next obvious leap of logic: any inbound automated traffic that can be blocked might be worth blocking, up to a point, so that your inbound and outbound traffic levels stay balanced. Web application firewalls fit that mold, since they can cut outbound traffic significantly by blocking requests that aren't destined for a real person. Blocking the spiders and robots that won't affect your search engine marketing seems like an easy way to solve some of the massive outbound packet problems that cause content providers so much pain with their peering agreements.
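A toy sketch of that egress-saving idea: refuse to serve full responses to traffic that is clearly automated, while letting the crawlers that matter for search through. Real WAFs use far richer signals (IP reputation, behavior analysis, JavaScript challenges); the user-agent substrings below are purely illustrative:

```python
# Toy illustration of blocking automated inbound traffic to save egress.
# A production WAF uses many more signals; these lists are assumptions.

BLOCKED_AGENT_SUBSTRINGS = ("scrapy", "curl", "python-requests")
ALLOWED_CRAWLERS = ("googlebot", "bingbot")  # don't hurt search rankings


def should_serve(user_agent: str) -> bool:
    """Decide whether to send a full response for this User-Agent."""
    ua = user_agent.lower()
    if any(crawler in ua for crawler in ALLOWED_CRAWLERS):
        return True   # search engine spiders stay welcome
    if any(bad in ua for bad in BLOCKED_AGENT_SUBSTRINGS):
        return False  # dropped: saves dozens of outbound packets per request
    return True       # default: assume a real person


print(should_serve("Mozilla/5.0 (compatible; Googlebot/2.1)"))  # True
print(should_serve("python-requests/2.31"))                     # False
```

Every request the filter drops avoids the dozens-to-hundreds of outbound response packets discussed above, nudging the ingress/egress ratio back toward balance.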
Clearly the benefits are visible to anyone who penny-pinches over their bandwidth bill, but this was an interesting new angle on justifying the cost of your security devices - if you needed another bullet point for your boss.