
Scraper and Bot Series - When Good Bots Go Bad

By Bill Brenner, Akamai SIRT Senior Tech Writer

Akamai this week launches the first installment in a series about bots and scrapers, based on continuing research by Akamai's Security Intelligence Research Team (SIRT). In the first installment, we discuss the various types of bots and scrapers we have encountered and how you may want to react to each. This paper focuses mainly on the known "good bots" -- traffic that is encouraged because it can be helpful to a business.

Some background: Bots are scripts that perform a specific task on the web. Common tasks include launching denial-of-service attacks, checking whether a service is up, and transferring files, among many other purposes, both malicious and legitimate.
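To make the legitimate side concrete, here is a minimal sketch of one of the benign bot tasks mentioned above: a script that periodically checks whether a service is up. The endpoint and interval are hypothetical, chosen purely for illustration.

```python
# Minimal sketch of a benign "is the service up?" bot.
# The URL and check interval are illustrative assumptions.
import time
import urllib.error
import urllib.request

TARGET = "https://example.com/health"  # hypothetical health endpoint
INTERVAL_SECONDS = 60

def check_once(url: str) -> bool:
    """Return True if the service answers with an HTTP 2xx status."""
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return 200 <= resp.status < 300
    except (urllib.error.URLError, TimeoutError):
        return False

if __name__ == "__main__":
    while True:
        status = "UP" if check_once(TARGET) else "DOWN"
        print(f"{time.strftime('%Y-%m-%d %H:%M:%S')} {TARGET} is {status}")
        time.sleep(INTERVAL_SECONDS)
```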

Web scrapers are one type of bot that collects web content for a specific purpose. A scraper makes requests to web pages and stores the data it collects. This data could be hotel or airline prices, store locations, current sales or any other valuable data. The stored information is then analyzed and used for competitive intelligence, or sold to other parties.
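A simple scraper of this kind might look like the following sketch: fetch a page, pull out the values of interest, and persist them for later analysis. The target URL and the markup pattern are invented for illustration; a real scraper would be tailored to the site it targets.

```python
# Hypothetical sketch of a price scraper: fetch a page, extract prices,
# and store them for later analysis. URL and markup pattern are invented.
import csv
import re
import urllib.request

URL = "https://example.com/hotels"                    # hypothetical target
PRICE_RE = re.compile(r'data-price="(\d+\.\d{2})"')   # assumed page markup

def scrape_prices(url: str) -> list[str]:
    """Fetch the page and return every price string matching PRICE_RE."""
    req = urllib.request.Request(url, headers={"User-Agent": "demo-scraper/1.0"})
    with urllib.request.urlopen(req, timeout=10) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    return PRICE_RE.findall(html)

if __name__ == "__main__":
    prices = scrape_prices(URL)
    # Persist the collected data, e.g. for competitive analysis.
    with open("prices.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["price"])
        writer.writerows([p] for p in prices)
```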

IP blocking is rarely an effective defense on its own, but the majority of malicious bots can be stopped by rate accounting -- tracking request volume per client and throttling or blocking clients that exceed a threshold. Targeted bots, however, require careful research and additional planning to defeat.
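The following is a minimal sketch of the rate-accounting idea, assuming a per-IP budget over a sliding time window; the window size and request limit are illustrative values, and a production system would account for proxies, shared NATs, and distributed clients.

```python
# Minimal sketch of rate accounting: count requests per client IP in a
# sliding window and refuse clients that exceed a budget. The window
# size and per-IP limit are illustrative assumptions.
import time
from collections import defaultdict, deque

WINDOW_SECONDS = 60
MAX_REQUESTS = 100  # assumed per-IP budget within the window

_requests: dict[str, deque] = defaultdict(deque)

def allow_request(client_ip: str) -> bool:
    """Return False once client_ip exceeds MAX_REQUESTS per WINDOW_SECONDS."""
    now = time.monotonic()
    window = _requests[client_ip]
    # Drop timestamps that have aged out of the sliding window.
    while window and now - window[0] > WINDOW_SECONDS:
        window.popleft()
    if len(window) >= MAX_REQUESTS:
        return False  # over budget: block or challenge this client
    window.append(now)
    return True
```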

Akamai observes bots of all types every day, targeting virtually every industry vertical. The type of bot often depends on the industry, but at a minimum, every site will see search engine bots, also known as search spiders, making requests to it. Some studies have shown that more than half of all Internet traffic comes from automated sources, and while some of this traffic is welcomed by site owners, much of it is not.
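Because malicious bots sometimes impersonate search spiders to slip past defenses, a common first check for a claimed "good bot" is DNS verification: reverse-resolve the client IP, confirm the hostname belongs to the search engine, then forward-resolve it back to the same IP. Here is a minimal sketch; the domain list is illustrative, so consult each search engine's own documentation for its published hostnames.

```python
# Sketch of verifying a claimed search spider via reverse-then-forward
# DNS resolution. SPIDER_DOMAINS is an illustrative, incomplete list.
import socket

SPIDER_DOMAINS = (".googlebot.com", ".google.com", ".search.msn.com")

def is_verified_spider(ip: str) -> bool:
    """Return True only if the IP reverse-resolves to a known spider
    domain and that hostname forward-resolves back to the same IP."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)  # reverse DNS lookup
    except socket.herror:
        return False
    if not hostname.endswith(SPIDER_DOMAINS):
        return False
    try:
        return socket.gethostbyname(hostname) == ip  # forward confirmation
    except socket.gaierror:
        return False
```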

Akamai will release one paper per month describing the various types of bots. In April, we will talk about malicious bots, the type that exists only to harm a site, for example through DDoS attacks.

In May, we will talk about highly aggressive scraping bots, which try to gather information as quickly as possible. These bots may be perceived as a DDoS attack, or may actually cause a denial of service.

Finally, in June we will wrap up the series with a paper on stealthy bots that use a "low and slow" methodology to avoid detection. These bots are often account checkers or card validators looking for valuable information to sell on underground markets.

This series is based on years of research by Akamai and will show how to handle each of these types of adversaries, as we defend against millions of these attacks on a daily basis.

The first paper can be downloaded here.

To access the paper as well as other white papers, threat bulletins and attack reports, please visit our Security Research and Intelligence section on Akamai Community.

 
