Jason Miller, our Chief Strategist of Commerce, was recently published in Retail TouchPoints magazine. In his article, "How to Better Understand the Bot Ecosystem," Jason discusses ways to distinguish between good bots and bad bots, and how those distinctions change across applications and environments.
Nearly all online businesses can be affected by various types of bot traffic. This traffic may include scrapers that grab content or pricing information, automated "clicks" that fraudulently inflate ad revenues, and transactional bots that purchase limited-availability goods and services, making them unavailable to legitimate customers.
Further, there are situations where the impact of bot activity on the business may be beneficial, while its impact on site performance is not. Based on analysis of traffic across the Akamai Intelligent Platform, upward of 60% of an organization's web traffic may be generated by bots: programs that operate as agents for a user or another program, or that simulate human activity.
Many bots play a legitimate role in online business strategies. Others harm businesses by reducing competitive advantage, getting between an organization and its customers, or committing fraud. Organizations therefore need a flexible framework to manage and better understand the wide array of bots accessing their websites every day.
Jason also addresses the following questions in his article:
- What is a bot?
- Why are web bots a problem?
- What is the best approach to reduce the negative impact of web bots?
To read the full article, please visit this link: How to Better Understand the Bot Ecosystem.
Akamai will be exhibiting at the 2016 Internet Retailer Conference + Exhibition (IRCE) in Chicago this week, June 7-10. If you're interested in hearing more about Bot Management or our Web Performance Solutions and Cloud Security expertise, we invite you to stop by Akamai booth #415.