In 2014, several successful malicious attacks against large financial services, government, and private-sector firms gave a clear indication of the changes occurring in the network security industry. The recent Ponemon Institute Cost of a Data Breach study found the average cost of a data breach to be $3.5 million, with an average cost of more than $145 per compromised record. (1)
Akamai's State of the Internet Security Report for Q4 2014 also indicates a rise in attacks, with a 90 percent increase in DDoS attacks and a 121 percent increase in infrastructure-layer attacks over the previous quarter.
Despite having significant security measures in place, organizations have fallen victim to cyber attacks. All of these organizations had traditional, on-premises network security safeguards in place but still lost sensitive intellectual property.
Unfortunately, these attacks proved that reliance on traditional methodologies is not enough to stop modern threats. Reactive mechanisms do provide a layer of security; however, knowing what threats lurk on the internet and proactively protecting critical web infrastructure from them can be invaluable.
Challenges in detecting threats flying under the radar
Protecting against attacks armed with advanced malicious threat technologies requires much stronger threat-prevention techniques than legacy systems, which do not scale and can degrade performance. It requires an intelligence-based structure that aggregates and correlates information from a variety of unified threat management sources. It requires a unified platform that can analyze user behavior against internal data and external sources to determine whether users on a network are doing their jobs or something more nefarious. This presents organizations with a set of challenges:
- Limited data sources: Companies simply do not have data sources that capture traffic from across the globe. An IP address, for example, can be the source of malicious traffic on the other side of the world and go unnoticed simply because organizations lack the wherewithal to capture and flag it.
- Constraints in analyzing large datasets in near real time: While Big Data and analytics platforms for large datasets have been around for a while, organizations by and large have yet to apply them to web protection, predominantly because of the large investment such a pursuit requires.
- Lack of heuristics engines: The application of heuristics has been prevalent in endpoint systems, but their use in proactive web defense mechanisms is relatively limited.
- Scarce expertise: Qualified security expertise is hard to come by and expensive to employ, a critical gap in security postures today. Once a threat is identified, the ability to create and push rules that plug vulnerabilities is essential, but very expensive.
Client Reputation and Proactive Defense Strategies
Client reputation technologies better protect applications and web infrastructure against DDoS and application-layer attacks. They do this by identifying, and sharing with organizations, the likelihood that particular IP addresses fall into one of the following malicious categories: web attackers, Denial of Service (DoS) attackers, and scanning tools. Client reputation technologies leverage advanced algorithms to compute a risk score based on prior behavior observed on a massively distributed network. The algorithms use both legitimate and attack traffic to profile the behavior of attacks, clients, and applications. Based on this information, a risk score can be assigned to each IP address, and organizations can choose which actions their traditional defenses should perform on an IP address with a given risk score.
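To make the scoring model concrete, here is a minimal sketch of how a defense layer might act on reputation scores. The score range (1-10), the thresholds, the category names, and the sample feed are all illustrative assumptions, not Akamai's actual implementation:

```python
# Hypothetical sketch: mapping client-reputation risk scores to actions.
# Score range (1-10), thresholds, and categories are assumptions for
# illustration only.

def choose_action(risk_score, monitor_threshold=3, deny_threshold=7):
    """Return the action a defense layer might take for a client IP.

    risk_score: integer from 1 (likely benign) to 10 (almost certainly
    malicious), as a reputation service might compute it from prior
    behavior observed across a distributed network.
    """
    if risk_score >= deny_threshold:
        return "deny"    # block the request outright
    if risk_score >= monitor_threshold:
        return "alert"   # allow, but log and flag for review
    return "allow"       # treat as legitimate traffic

# Example: per-category scores for one client IP, as a hypothetical
# reputation feed might report them.
client = {
    "ip": "203.0.113.7",
    "scores": {"web_attacker": 8, "dos_attacker": 2, "scanner": 5},
}

# Act on the worst category the client falls into.
worst = max(client["scores"].values())
print(client["ip"], "->", choose_action(worst))  # 203.0.113.7 -> deny
```

Because the thresholds are parameters, each organization can tune how aggressively its existing defenses respond to a given score, which matches the idea of reputation data complementing, rather than replacing, traditional safeguards.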
Should organizations pay heed?
The answer lies in understanding that multilayered defense is key, and such technologies add another layer of protection that complements existing defenses. They also provide better input to critical security decisions and fill an important gap in defense postures: forecasting intent before exploitation.
In conclusion, there is a plethora of technologies available, each filling a niche and a specific need. Client reputation services give organizations the ability to forecast a threat before it is exploited, and that capability is well worth investing in to maintain business continuity and minimize the impact of cyber threats.
As an aside, Akamai Technologies is participating at RSA Conference. Feel free to stop by and learn about our vision for cloud security in the months to come.
(1) 2014 Cost of Data Breach Study: Global Analysis - Ponemon Institute, May 2014
- - - - -
Sudeep Charles is an Akamai Product Marketing Manager in APJ