
Recently in Web Security Category

Your January 2014 Patch Tuesday Update

Patch Tuesday is an important calendar item for Akamai customers, given how dominant Windows machines are in many companies. What follows is Microsoft's January 2014 Security Update. 

A New Resource for Training Kids in Internet Safety

I got a message this morning from an Akamai colleague who read yesterday's blog post on the HacKid security conference for children. He wanted me to know that he is doing something similar. Stefano Buttiglione, one of our senior solutions architects, says a school in his hometown in Italy asked him to run a training course for kids and their parents on the risks of social media. It started as a one-day Danny Lewin Community Care event and blossomed from there.


HacKid Conference: Security Training for Kids

As I've written before, we in Akamai InfoSec take our security training very seriously. We also know that our success as a security operation depends on the skills and talents of the future. So when I see great examples of training for younger generations, I'm compelled to mention it here. For this post, the subject is the HacKid Conference scheduled for April 19 and 20 at the San Jose Tech Museum of Innovation.

Like Skipfish, Vega is Used to Target Financial Sites

Yesterday, we told you about how attackers were abusing the Skipfish Web application vulnerability scanner to target financial sites. Since then, Akamai's CSIRT team has discovered that another scanner, Vega, is being abused in the same manner.

Skipfish and Vega are automated web application vulnerability scanners available as free downloads. Skipfish is available at Google's code website, and Vega is available from Subgraph. Both are intended for security professionals to evaluate the security profile of their own web sites. Skipfish was built and is maintained by independent developers, not Google, although the code is hosted on Google's downloads site and Google's information security engineering team is mentioned in the Skipfish project's acknowledgements. Vega is a Java application that runs on Linux, OS X and Windows. The most recent release of Skipfish was in December 2012; Vega's was in August 2013.


Overview

According to Wikipedia, WordPress is a free and open source blogging tool and a content management system (CMS) based on PHP and MySQL, which runs on a web hosting service. Features include a plug-in architecture and a template system. WordPress is used by more than 18.9% of the top 10 million websites as of August 2013. WordPress is the most popular blogging system in use on the Web, at more than 60 million websites.

Attackers Use Skipfish to Target Financial Sites

Akamai's CSIRT team has discovered a series of attacks against the financial services industry. In this instance, the bad guys are abusing the Skipfish Web application vulnerability scanner to probe company defenses.

Skipfish is available for free download at Google's code website. Security practitioners use it to scan their own sites for vulnerabilities. The tool was built and is maintained by independent developers and not Google, though Google's information security engineering team is mentioned in the project's acknowledgements.

In recent weeks, our CSIRT researchers have watched attackers using Skipfish for sinister purposes. CSIRT's Patrick Laverty explains it this way in an advisory available to customers through their services contacts:

Specifically, we have seen an increase in the number of attempts at Remote File Inclusion (RFI). An RFI vulnerability is created when a site accepts a URL from another domain and loads its contents within the site. This can happen when a site owner wants content from one site to be displayed in their own site, but doesn't validate which URLs are allowed to load. If a malicious URL can be loaded into a site, an attacker can trick a user into believing they are using a valid and trusted site. The site visitor may then inadvertently give sensitive and personal information to the attacker. For more information on RFI, please see the Web Application Security Consortium and OWASP websites.
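To make the advisory's point concrete, here is a minimal sketch of the kind of endpoint that creates an RFI hole. This is our illustration, not code from the advisory; the Flask route and the "page" parameter are hypothetical.

```python
# Illustrative sketch of an RFI-prone endpoint. The route and the
# "page" parameter are hypothetical, invented for this example.
import urllib.request

from flask import Flask, request

app = Flask(__name__)

@app.route("/render")
def render_remote():
    # VULNERABLE: the URL comes straight from the query string with no
    # allow-list check, so any domain's content can be pulled into the
    # page, e.g. /render?page=http://www.google.com/humans.txt
    url = request.args.get("page", "")
    # A safer version would first check urlparse(url).netloc against an
    # explicit allow-list of permitted hosts and reject everything else.
    with urllib.request.urlopen(url) as resp:
        return resp.read()
```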

Akamai has seen Skipfish probes primarily targeting the financial industry. Requests appear to be coming from multiple, seemingly unrelated IP addresses. All of these IP addresses appear to be open proxies, used to mask the attacker's true IP address. 

Skipfish will test for an RFI injection point by sending the string www.google.com/humans.txt or www.google.com/humans.txt%00 to the site's pages. It is a normal practice for sites to contain a humans.txt file, telling visitors about the people who created the site.

If an RFI attempt is successful, the content of the included page (in this instance, the contents of Google's humans.txt file referenced above) will be displayed in the targeted website. The included string and the user-agent are both configurable by the attacker running Skipfish.

While the default user-agent for Skipfish version 2.10b is "Mozilla/5.0 SF/2.10b", we cannot depend on that value being set. It is easily editable to any value the Skipfish operator chooses.

Companies can see if they're being targeted by using Kona Site Defender's Security Monitor: sort the stats by ARL and look for the aforementioned humans.txt file appearing in ARLs for the site. Additionally, log entries will show the included string in the URL.

"We have seen three behaviors by Skipfish that can trigger WAF rule alerts," Laverty wrote. "The documentation for Skipfish claims it can submit up to 2,000 requests per second to a site."

Laverty said companies can blunt the threat by adjusting Summary and Burst rate control settings to detect this level of traffic and deny further requests. A WAF rule can also be created that triggers when a request contains the string "google.com/humans.txt".

There is no situation (other than on google.com) where this would be a valid request for a site, he said. 
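For teams that want a rough first pass outside of Kona, the same indicators can be pulled out of ordinary access logs. The sketch below is ours, with a hypothetical log path and format; it flags the humans.txt probe string and the default Skipfish user-agent, bearing in mind that the user-agent is trivially changed.

```python
# Hedged sketch: flag Skipfish RFI probe indicators in an access log.
# The "access.log" path and its line format are assumptions.
import re

PROBE = re.compile(r"google\.com/humans\.txt(%00)?", re.IGNORECASE)
DEFAULT_UA = "Mozilla/5.0 SF/2.10b"  # default only; easily changed by the operator

with open("access.log", encoding="utf-8", errors="replace") as log:
    for line in log:
        if PROBE.search(line):
            print("RFI probe string:", line.rstrip())
        elif DEFAULT_UA in line:
            print("Default Skipfish user-agent:", line.rstrip())
```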


Why I'm Attending ShmooCon 2014

Here at Akamai, we're busy preparing for RSA Conference 2014. It's the biggest security conference of the year, and we send a platoon of employees every time. Given our role in securing the Internet, it's a no-brainer.

But there are many other conferences we attend each year, because:

  1. We have a lot of information to share about attacks against Akamai customers and how the security team continues to successfully defend against them.
  2. We have to stay on top of all the latest threats and attack techniques so we can continue to be successful. Conferences are an important place to do that.

Next week, I'm attending one of the lesser-known conferences: ShmooCon 2014 in Washington DC. In recent years, I've found some of the best content at this event, and I've learned a lot. It's also an excellent place to meet other security practitioners who can become important allies. Some of the most important contacts I've made were at ShmooCon.

The unfamiliar usually chuckle or cock their heads in puzzlement when I tell them about ShmooCon. The name throws them off, and it's not a traditional business conference. ShmooCon is organized by the Shmoo Group, a security think tank started by Bruce Potter in the late 1990s. Attendees represent the full cross section of the security industry. There are hackers, CSOs, government security types and everything in between. More than a few people have compared it to the Black Hat conferences of old or a smaller version of Defcon.

The event has inspired a lot of thinking outside the box -- not just in terms of the talks, but in how attendees travel and network. In recent years people have carpooled to ShmooCon. For three years in a row I traveled to and from the event in what we called the Shmoobus -- an RV crammed with hackers making the journey from Boston to Washington DC. Those 12-hour drives made for a lot of bonding. With such a long trek, there's time to delve into deep discussions about the challenges of our jobs.

The Shmoobus is no more, unfortunately. But what I learned about security on those journeys will last a lifetime.

For more information about ShmooCon, visit the website. The full agenda is posted, including one of my favorite parts of the event, Friday-night "fire talks" -- 15-minute presentations where speakers are challenged to dive right into the core of their content.

I'll write about the talks and other ShmooCon happenings on this blog.

Two of the most prominent evolutions in the web application attack landscape are scale and volume. Nowadays, attackers use tremendous amounts of computing resources, such as those provided by cloud computing and botnets, to mount distributed, large-scale attack campaigns over the Internet while keeping their identities hidden. From a security defense point of view, such attacks are a nightmare: they are much harder to detect and mitigate, as their origins are scattered and change rapidly. Attempts to analyze only a limited slice of the malicious traffic usually result in an incomplete understanding of the campaign, its nature and its scale.

In this article we will show how the analysis of large-scale, global multi-site traffic may reveal interesting trends and malicious behavior patterns, and as a result can help improve protections against the next round of attacks.

Prior to initiating such distributed, massive-scale attacks, attackers try to compile a long list of vulnerable targets. In most cases they will target exploits in commonly used web application platforms such as Joomla, WordPress or Drupal.

In recent research conducted by Akamai's threat research team using Akamai's security big data platform (Cloud Security Intelligence), the team came across a malicious campaign focused on web applications running outdated modules of Joomla, one of the most commonly deployed content management systems. In this specific campaign, the attackers were trying to inject backdoors into the vulnerable web applications.

Deeper analysis of the attack campaign's traffic revealed that the attackers were trying to exploit Joomla's content editor, which allowed web users to upload files. This capability made Joomla susceptible to malicious file upload, and in turn to remote code execution. What the deep "single-event" analysis of the exploit did not reveal was the sheer volume and distribution of the attack. When the threat research team zoomed out and looked for similar attack patterns across Akamai's customer base, they uncovered an entire botnet working exclusively on attacks of this kind, slowly mining the Internet for more and more vulnerable applications. Here are some of the key findings from analyzing one month of security events with Akamai's security big data platform:

Increase in Malicious Transactions Over Time

Looking at 43,000 malicious HTTP transactions over a one-month period, we saw a constant increase in the amount of malicious traffic:

[Figure: Number of attacks per day]
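To illustrate the kind of aggregation behind a chart like this, here is a minimal sketch that computes per-day totals from a flat export of security events. The CSV file and its column names are assumptions for the example; this is not the Cloud Security Intelligence schema.

```python
# Hedged sketch: daily trend aggregation over exported security events.
# "events.csv" and its columns (timestamp, target_host, source_ip) are
# assumptions for this example.
import pandas as pd

events = pd.read_csv("events.csv", parse_dates=["timestamp"])
daily = events.groupby(events["timestamp"].dt.date).agg(
    transactions=("timestamp", "size"),    # malicious HTTP transactions per day
    targets=("target_host", "nunique"),    # distinct applications hit per day
    bots=("source_ip", "nunique"),         # distinct bot IPs seen per day
)
print(daily)
```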

Botnet Distribution by Country

Looking at the distribution of botnet machines by country, United States-based bots were the most prominent:

[Figure: Botnet distribution by country]


Botnet Distribution by Continent

Looking at the distribution of botnet machines by continent, Europe was the top continent from which bots were used:
[Figure: Botnet distribution by continent]


Increase in Number of Targets Per Day

Over a one-month period, 2,008 different web applications were targeted. The chart shows a clear upward trend in the number of web applications targeted each day:
[Figure: Number of targeted web applications per day]


Analysis of Bot Machines 

When looking at the malicious bot machines being used to send attacks, we noticed that most of them were actually Internet-facing web applications, and it was obvious that the attackers had taken control of ("owned") these web applications.

[Figure: Breakdown of bot machines by type]

Further analysis of the botnet machines running web servers showed that the most prevalent server software was Apache.

[Figure: Web server software running on bot machines]

Further Evidence on the Attacker's Identity 

While analyzing the botnet machines that participated in the attack, we found that some of them had been completely compromised, with backdoor and remote-control software installed. Other machines had also been defaced and left with hacktivist-type messages:

[Figure: Hacktivist defacement message on a compromised bot machine]
The following image shows a backdoor on one of the compromised web servers giving attackers full control over the machine: 

[Figure: Backdoor interface on a compromised web server]
Summary

  • When looking at the behavior of web attack botnets over time, we can see a clear upward trend in the number of attacked applications per day and HTTP transactions per day. These botnets are not static in size; they tend to grow over time as hackers add more and more machines to the malicious network
  • Most of the bot machines in this attack came from the US and Europe, making geography-based protections ineffective. However, the fact that the majority of malicious bots were Internet-facing web servers means that their IP addresses are static, which in turn makes it easier to block them specifically (see the sketch after this list)
  • The ability to identify globally distributed malicious botnets based on behavioral analysis of multi-site security data can become a game changer in the battle for web application security
  • Correlating cross-domain attack information can help predict the next targets, reducing risk to applications that are not yet under attack
  • Akamai's threat research team sees an increase in botnet-based attacks that use application-layer vulnerabilities as their primary weapon (as opposed to DDoS botnets)
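Since most of the bots in this campaign were web servers with static IP addresses, one practical follow-up is to correlate events across sites and block sources seen attacking several distinct applications. The sketch below is illustrative only; the sample records and the threshold of three distinct targets are assumptions.

```python
# Hedged sketch: derive a cross-site blocklist from correlated events.
# The sample records and the threshold of 3 targets are assumptions.
from collections import defaultdict

events = [
    # (source_ip, targeted_application), e.g. parsed from WAF logs
    ("198.51.100.7", "shop.example.com"),
    ("198.51.100.7", "blog.example.net"),
    ("198.51.100.7", "api.example.org"),
    ("203.0.113.9", "shop.example.com"),
]

targets_by_ip = defaultdict(set)
for ip, target in events:
    targets_by_ip[ip].add(target)

# A source probing many unrelated sites is far more likely a bot than a visitor.
blocklist = {ip for ip, targets in targets_by_ip.items() if len(targets) >= 3}
print(sorted(blocklist))  # ['198.51.100.7']
```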

This post was written by Or Katz, Principal Security Researcher & Ory Segal, Principal Product Architect


Security Predictions? Here Are Some Facts About 2014

I've said it before and will repeat it here: I absolutely loathe security predictions.

I have nothing against those who make them. It's just that most predictions are so much duh. The rest are marketing creations with no attachment to reality.

Examples of the self-evident:

  • Mobile malware is gonna be a big deal.
  • Social networking will continue to be riddled with security holes and phishing attacks.
  • Microsoft will release a lot of security patches.
  • Data security breaches will continue to get more expensive.

Examples of predictions that never had a hope of becoming true:

I'm going to offer you something different: Some facts for 2014. That's right, things that are really going to happen -- things that are not obvious to those outside of Akamai. Let's begin:

  1. In February, we will officially launch the first-ever Akamai.com security section, and it'll be packed with everything you need to understand the threats your organization faces and how Akamai keeps its own security shop in order.
  2. Several of us from Akamai InfoSec will travel the globe, visiting customers and speaking at many a security conference. Those who attend will walk away enlightened and inspired.
  3. Akamai will continue to protect customers from DDoS and other attacks.
  4. You will see many new security videos and hear many new podcasts from us.
  5. If you visit the soon-to-be-launched Akamai security section, you will walk away with a better understanding of our compliance efforts than ever before.

Happy New Year! May you have a healthy, prosperous and secure 2014.


A round-up of the first nine episodes of the Akamai Security Podcast:

Episode 1: CEO Tom Leighton discusses the legacy of Co-Founder Danny Lewin, Akamai's role on 9-11-01, and his vision of Akamai as a major player in the security industry.

Episode 2:  I talk to Meg Grady-Troia about her role in Akamai InfoSec, particularly the security training she does for new hires. 

Episode 3: I talk to Larry Cashdollar, a senior security response engineer on our CSIRT team. Larry discusses the mechanics of his job and the particular threats he and the team have been tracking and defending against.

Episode 4: A few months ago, Akamai Senior Enterprise Architect David Senecal wrote a post about ways to identify and mitigate unwanted bot traffic. In this episode, I go into more detail on the subject with Matt Ringel, an enterprise architect on Akamai's Professional Services team. Check out the related post, "Bots, Crawlers Not Created Equally."

Episode 5: I interview CSIRT Director Michael Smith. We discuss the role of CSIRT in researching threats and vulnerabilities, as well as keeping customers and the wider public informed of defensive measures they can take.

Episode 6: I continue my discussion with CSIRT Director Michael Smith. In this installment, Mike describes the process by which CSIRT delivers daily threat intelligence to our customers, along with the defensive measures needed to block attacks.

Episode 7: In this episode of the Akamai Security Podcast, I talk to colleague, friend and Security Advocate Dave Lewis (@gattaca on Twitter). We talk about the past, present and future of his Liquidmatrix site, life in his new role and the big issues he's helping customers address. We also talk about all the blogging he's doing over at CSOonline.com.

Episode 8: This week's episode is with Akamai Senior Security Advocate Martin McKeay. He's an old friend with more than a decade of experience in information security. At Akamai, he spreads awareness about security and privacy, helping customers understand our approach to both. 
