Bad Bots, Bad Bots: What You Need to Know About Almost 30% of Internet Traffic
When you have a website of any kind, you tend to think the more traffic the better. Which is true. There’s no twist ending here; more traffic is definitely better. However, not all traffic is good traffic. If you thought your uncle forwarding you emails full of every popular meme from 2011 was about as bad as internet behavior gets, wait until you get a load of malicious bots.
Bots Reign Supreme
After a three-year period in which humans made up the majority of internet traffic, bots snatched the crown back in 2016. According to DDoS mitigation provider Incapsula’s 2016 Bot Traffic Report, humans accounted for 48.2% of internet traffic with bots making up the remaining 51.8%.
There’s good news and bad news attached to that statistic. The good news is that it was an uptick in good bot activity that put bots ahead of people once again. This increase in benevolent bot activity can be at least partially attributed to the busy schedules of feed fetchers, bots that retrieve content for web and mobile applications so it can be displayed to users. The Facebook mobile app is a big-time example of a feed fetcher.
Other types of good bots include search engine bots, which crawl websites in order to help determine search engine rankings, commercial crawlers, which extract data on behalf of digital marketing tools, and monitoring bots, which keep track of website availability as well as the status of online features.
The bad news attached to the bot supremacy statistic? That bad bots still outnumber good bots by six percentage points – 28.9% to 22.9%.
Image source: Incapsula
What’cha Gonna Do?
For the past five years bad bot numbers have held pretty steady, with 2016’s 28.9% a small decrease from 2012’s high of 31%. Of those bad bots, impersonator bots have consistently been the most active of all the malicious non-human traffic, accounting for 24.3% of the traffic observed on Incapsula’s network as well as a staggering 84% of all malicious bot attacks on domains protected by Incapsula. In comparison, the second most common form of bad bot is the hacker tool, a bot that scans for site vulnerabilities; these accounted for 2.6% of the traffic on Incapsula’s network. Scrapers, which steal a site’s content, and spammers, which spam links into comment sections and discussion forums, make up 1.7% and 0.3% of traffic, respectively.
There are a couple of reasons impersonator bots are so prevalent. The first is that these baddies get the job done in a wide variety of ways. Since impersonator bots are built to disguise themselves as legitimate website visitors, they’re able to slip past many website security solutions, which makes them a natural choice for many automated attacks.
The second reason impersonator bots are more active than their malicious brethren is that it’s impersonator bots that are put to work in the now almost ubiquitous DDoS attacks. For a distributed denial of service attack, an attacker requires tens of thousands or even hundreds of thousands of bots organized into a botnet – a network of internet-connected devices infected with malware so they can be controlled remotely. Many DDoS attacks are perpetrated by impersonator bots making seemingly legitimate requests, but in such numbers that the target website cannot withstand the influx of traffic, either slowing the site down so much that it can’t be used or taking it completely offline.
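To see why sheer volume is the weapon here, consider a token-bucket rate limiter, one common first line of defense that sheds excess requests from a single overactive client. This is a toy sketch for illustration only (class and parameter names are invented, and real DDoS mitigation works across many clients and network layers, not one bucket):

```python
import time

class TokenBucket:
    """Toy per-client rate limiter: allows `rate` requests per second,
    with bursts up to `capacity`. Illustrative, not production code."""

    def __init__(self, rate: float, capacity: int):
        self.rate = rate
        self.capacity = capacity
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill tokens in proportion to elapsed time, capped at capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

# A rapid burst of 200 seemingly legitimate requests from one client:
bucket = TokenBucket(rate=5, capacity=10)
served = sum(bucket.allow() for _ in range(200))
# Only roughly the first `capacity` requests in the burst get through;
# the rest are shed.
```

The catch, and the reason botnets exist, is that a limiter like this keys on individual clients: spread the same flood across a hundred thousand infected devices and each one stays comfortably under any per-client threshold.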
As you can see in the infographic above, in the section titled “What’s Your Bot Situation?”, bot traffic is completely indiscriminate. A website owner may think his or her website is too small to attract bot traffic, or too large to be affected by it, but neither is true. Bots descend no matter how much human traffic a website pulls in. This is a major consideration when it comes to securing your website, as is the fact that because so many bots are good and provide necessary services for your website, you can’t just block all bots in a bid to keep the bad ones away.
In order to protect your site from malicious bots while letting good bots and human visitors through unimpeded, you need security that takes a multilateral approach to identifying and analyzing all website traffic. This often includes a static analysis component that digs into header information and the structure of web requests to determine whether a bot is what it claims to be, a challenge-based component that issues CAPTCHAs and other tests, and a behavioral component that analyzes a bot’s activity to make sure it meshes with its claimed parent program, flagging any anomalies that may point to malicious intentions.
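A stripped-down sketch of how those three signals might combine is below. This is not Incapsula’s actual pipeline; the field names and thresholds are invented stand-ins (for instance, `verified_source` stands in for the reverse-DNS check real services use to confirm a crawler like Googlebot really comes from Google’s network):

```python
def classify(request: dict) -> str:
    """Toy three-signal bot classifier. All field names are hypothetical."""
    ua = request.get("user_agent", "")

    # 1. Static check: real browsers send headers like Accept-Language;
    #    crude impersonators often spoof the User-Agent but omit them.
    if "Mozilla" in ua and "accept_language" not in request["headers"]:
        return "suspected impersonator"

    # 2. Claimed good bots must come from the expected network
    #    (stand-in for reverse-DNS verification of the source IP).
    if "Googlebot" in ua:
        return "good bot" if request["verified_source"] else "suspected impersonator"

    # 3. Behavioral check: humans rarely sustain many requests per second.
    if request["requests_per_sec"] > 10:
        return "suspected bot"

    return "human or unknown"

# A browser User-Agent with no browser headers looks like an impersonator:
print(classify({"user_agent": "Mozilla/5.0", "headers": {},
                "verified_source": False, "requests_per_sec": 1}))
```

Real systems weigh dozens of such signals probabilistically rather than firing on any single rule, but the layering idea – static, challenge, behavioral – is the same one described above.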
This level of bot protection can generally be found in advanced DDoS mitigation services, which use granular traffic inspection to keep malicious traffic from ever touching a protected website’s network. Just think: by investing in quality DDoS protection, you could bump your uncle and his First World Problem forwards back to number one on your internet threat list.