Note: This is the second blog post in our "Crush the Rush" holiday readiness webinar series.
We all know eCommerce is evolving. It used to be pretty simple: a shopper would visit your eCommerce Web application from her laptop or PC, and you probably had to support one, maybe two browsers. But the world has changed - quickly. The proliferation of connected devices has changed the way we shop - whether it's couch commerce or showrooming, mobile devices have changed the game.
Yet it's not only mobile that has changed; the desktop/laptop environment has also evolved. In 2008 the different versions of IE held close to 70% of the browser market share. That is no longer the case, with Chrome, Firefox and Safari growing significantly. Looking only at the browser families hides a lot of complexity; IE7 and IE8 are not the same browser. To get a more complete picture of browser development, check out Evolution Of The Web.
Mobile is growing fast. That is no longer news. We have all seen Mary Meeker's projections and eBay's mobile commerce retail volume numbers. And let's remember that mobile is not only smartphones - it includes tablets - in fact, some would argue they are the future of mobile commerce.
The fact that we no longer go online but are online has driven eCommerce growth. According to the IBM Digital Analytics Benchmark, 2012 US online sales for Black Friday increased ~21% over 2011, and Cyber Monday online sales grew by ~30% over 2011. Online traffic trends over the years also show considerably bigger spikes as more consumers look online for their holiday shopping. This also means that the cost of failures or slowdowns under peak traffic conditions just keeps getting higher.
Yet delivering fast, scalable Web apps - which keep getting bigger and more complex - to constrained devices over constrained networks is no simple feat. It has gotten to the point where sites on the Gomez US Retail Website Performance Index require on average 30 hosts to deliver a home page. So what happens if one of these third parties has an issue? That depends on the architecture of the page, and often it means a significant degradation of performance from an end-user perspective.
In this example - measured using WebPagetest and Pat Meenan's great SPOF-O-Matic Chrome extension - page load time is significantly impacted by the third-party performance issue. The kicker is that even though this isn't directly your fault, your customers will still likely hold you responsible for the degradation - and likely move on to the closest competitor, which is just a click away.
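One common defense against this kind of third-party single point of failure is to load the tag asynchronously, so a slow or unreachable host can't block page rendering. Below is a minimal sketch; the function name, the injected `doc` parameter (passed in rather than using the global `document`, so the logic can run outside a browser), and the tag URL are all illustrative assumptions, not a real vendor's loader.

```javascript
// Sketch: load a third-party script without blocking the HTML parser.
// A hypothetical tag URL is used here - substitute the real one.
function loadScriptAsync(doc, src) {
  const script = doc.createElement('script');
  script.src = src;
  script.async = true; // fetch and execute without blocking parsing/rendering
  doc.head.appendChild(script);
  return script;
}
```

If the third party goes down with this pattern, the browser keeps rendering the rest of the page and only that one feature degrades - rather than the whole page hanging on a blocking `<script>` tag.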
Compounding the complexity associated with Web app delivery are ever-increasing end-user experience expectations. We have talked about this at length in other posts. If we don't meet those end-user expectations, there are consequences. Real User Monitoring (RUM) has made it easy to correlate performance to business metrics such as conversion, bounce or abandonment rates. Whether it's data from vendors like Torbit or from companies like Walmart, one thing is clear - the slower your pages, the higher your abandonment and bounce rates and the lower your conversions. In other words, Web performance impacts the business. As far back as 2006, Amazon noted that speed matters - in particular, "Every 100ms delay costs 1% of sales".
So why are some sites faster than others?
Web performance optimization is complex, and many factors impact page load time. Among the most important are your infrastructure and application architecture. Obviously, the closer your content is to your end users, the faster it will be. But you also need to look at how your application is delivered and rendered in the browser.
To oversimplify, just look at the correlation of response time to the number of hosts, objects and KBs delivered. This oversimplification points toward how you can improve the Web experience for your end users.
Let's start with where the end-user response time is spent. Steve Souders' Performance Golden Rule states: "80-90% of the end-user response time is spent on the front end. Start there." So, knowing that the front end is important, how can we optimize it? Souders also defined a set of 14 rules for faster-loading web sites back in 2007 - one of them was to use a CDN; getting your content as close to your end users as possible is as important as ever to reduce geographic latency.
However, you should also embrace technologies and best practices that reduce the requests made by your Web app - the fastest request is the one not made. Various techniques can be leveraged to achieve this, ranging from data URIs to CSS/JS concatenation. If you have to make a request, make sure that the bytes delivered are as small as possible (minification, compression, etc.). And lastly, focus on accelerating rendering and improving perceived performance through techniques such as async JS/CSS loading and DNS prefetching. Manually optimizing the front end is possible, but for most organizations it is not practical due to the ongoing and complex nature of the different optimizations required across different situations. You can learn more about Front-End Optimization (FEO) here.
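To make two of these request-reduction techniques concrete, here is a deliberately crude sketch: concatenating several stylesheets into one response (one request instead of many) and inlining a tiny asset as a data URI (zero extra requests). Real build pipelines use proper minifiers; the regexes and function names below are illustrative assumptions only.

```javascript
// Sketch: CSS concatenation plus naive minification.
// Not a production minifier - it only strips comments and whitespace.
function concatAndMinifyCss(sheets) {
  return sheets
    .join('\n')
    .replace(/\/\*[\s\S]*?\*\//g, '')  // strip /* comments */
    .replace(/\s+/g, ' ')              // collapse runs of whitespace
    .replace(/\s*([{}:;,])\s*/g, '$1') // drop spaces around punctuation
    .trim();
}

// Sketch: embed a small asset directly in the page as a data URI,
// avoiding a separate network request entirely.
function toDataUri(mimeType, buffer) {
  return `data:${mimeType};base64,${buffer.toString('base64')}`;
}
```

The trade-off to keep in mind: data URIs increase the size of the embedding document and aren't cached independently, so the technique pays off mainly for small assets on frequently changing pages.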
So how can you get insight into whether your optimizations are working and what the actual Web experience of your customers is?
You can start with synthetic testing - essentially a robot that plays back a prerecorded script in a clean-room environment, in a datacenter, most likely on the Internet backbone. Monitoring providers often offer one or two instrumented browser versions, with a subset of monitoring locations and a limited number of pages measured. True last-mile testing takes this model and moves it from the datacenter to an actual end-user's PC - but it is still a synthetic robot.
In the case of RUM, Akamai can dynamically inject JS code at the edge and harvest performance data from your actual end-users. You can then access this data in the Luna control center and slice and dice it to get insight into the web experience of your customers. Timing metrics can be analyzed with time series and histogram charts and can be viewed across dimensions including:
• OS, browser, network name & type
• Continent, country, IP address & version
• CP Code, FEO & SPDY status
• Time range & URL
RUM can help you answer questions like "Are my most important users having a good experience?". It can even help you see how effective your current front-end optimization configurations are for all your users.
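Under the hood, RUM beacons typically derive user-perceived metrics from the browser's Navigation Timing marks (`window.performance.timing`). The sketch below shows that timing math in isolation; the function name and metric names are my own, and the timing object is passed in as a parameter (an assumption for testability) rather than read from the browser global.

```javascript
// Sketch: derive user-perceived timing metrics from Navigation Timing marks.
// In a browser, pass window.performance.timing; all values are epoch ms.
function deriveRumMetrics(t) {
  return {
    timeToFirstByte: t.responseStart - t.navigationStart,
    domReady: t.domContentLoadedEventEnd - t.navigationStart,
    pageLoad: t.loadEventEnd - t.navigationStart,
  };
}
```

Each metric is a delta from `navigationStart`, which is why RUM can report what the user actually waited - including redirect, DNS and connection time that a server-side log would never see.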
In this case a US retailer saw on average 65% faster page load times over DSA for US users. Not only did they get visibility at a macro level, they also had proof that they had optimized the experience of their US users across a variety of browsers and networks.
So how can you replicate this success and win this holiday season despite the challenges outlined earlier?
Start by adopting your customers' perspective. What is the actual experience of your end users? And don't look at this in a vacuum - how does it compare to the competition and Internet leaders?
Next, focus on delivering fast, consistent experiences across all your end-users' devices, browsers, networks and locations, making sure that performance doesn't degrade when it matters most - under peak traffic conditions when end-users are more likely to convert.
This dovetails with the last recommendation: optimize for mobile first - in particular if you are exploring new Web development approaches like Responsive Web Design. Even if you aren't, focus on the essentials: reduce the number of bytes delivered, reduce the number of requests made over the network by your application, and make sure that you accelerate page rendering in the browser. And remember: what is fast on a tablet will be even faster on a PC.