Online video streaming is everywhere today, from the smallest hand-held devices to the largest screens in our living rooms. For the most part it just works, almost magically, and provides a very captivating experience. Step back to look at the overall architecture, however, and media streaming is a very reactive process.
Playback starts with the player first requesting the manifest file. The request is handled by the CDN, which on an initial cache-miss will forward the request to the Origin. After the manifest is fetched from the Origin, the CDN will cache it and deliver it to the player.
Upon receiving the manifest, the player then requests the first media segment to start playback. This request is followed by additional requests to subsequent media segments within the manifest.
From the player's point of view, it is evident that the player will be requesting all of these objects, and in a particular sequence. The CDN, however, has no such knowledge, hence the reactive process of fetching objects from the Origin.
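The reactive flow above can be sketched in a few lines of Python. The class and method names (Origin, EdgeCache, fetch) are illustrative only, not an Akamai API; the point is that every first request is a cache miss that must travel all the way to the Origin.

```python
class Origin:
    """Holds the authoritative copies of the manifest and segments."""
    def __init__(self, objects):
        self.objects = objects
        self.requests = []  # track which requests reached the Origin

    def get(self, path):
        self.requests.append(path)
        return self.objects[path]


class EdgeCache:
    """On a cache miss, the Edge reactively forwards the request to Origin."""
    def __init__(self, origin):
        self.origin = origin
        self.cache = {}

    def fetch(self, path):
        if path not in self.cache:                    # cache miss
            self.cache[path] = self.origin.get(path)  # forward to Origin
        return self.cache[path]


origin = Origin({
    "/live/master.m3u8": "manifest",
    "/live/seg-1.ts": "segment-1",
    "/live/seg-2.ts": "segment-2",
})
edge = EdgeCache(origin)

# The player drives everything: manifest first, then each segment in order.
edge.fetch("/live/master.m3u8")
edge.fetch("/live/seg-1.ts")
edge.fetch("/live/seg-2.ts")

# Without prefetching, all three first requests were misses at the Edge.
print(origin.requests)
```

Only repeat requests (from this player or another one behind the same Edge) are served from cache; the first viewer always pays the full Edge-to-Origin round trip.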
The traditional request/response pattern looks like the following:
The concept of prefetching allows Akamai to predict and fetch a media object to Akamai's Edge just before the player requests it. By pre-positioning content at the Edge, streaming becomes less reactive, thereby improving the cache-hit rate as well as the video segment's "Download Speed or Throughput" and "Time To First Byte" (TTFB). The overall percentage improvement, however, is a function of the last-mile latency: the latency between the player and the Edge. The larger the latency, the larger the performance improvement.
The following graphs show two instances of the improvements prefetching made:
■ Average throughput improvement ranging from single-digit Mbps to around 60 Mbps
■ TTFB improvement of tens to hundreds of milliseconds
■ NOTE the two 'red' bars where TTFB did not improve, possibly due to upstream issues (CDN or Origin)
■ Average throughput improvement of around 28 Mbps
■ TTFB improvement of hundreds of milliseconds
How Prefetching Works
The following diagram depicts how prefetching works. The Edge prefetches Seg-1 as soon as it reads a "prefetch response header" from the Origin in the response for the Manifest.
The "prefetch response header" is a response header from the Origin named CDN-Origin-Assist-Prefetch-Path.
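As an illustration, here is a minimal sketch of an Origin attaching that header when it serves a manifest. The sample playlist, helper names, and the assumption that the header value is a segment path relative to the manifest URL are ours for the example; the exact header-value syntax is defined in Akamai's Origin-Assist spec.

```python
# A sample HLS media playlist; contents are illustrative only.
MANIFEST = """#EXTM3U
#EXT-X-TARGETDURATION:6
#EXTINF:6.0,
seg-1.ts
#EXTINF:6.0,
seg-2.ts
"""


def first_segment(manifest):
    """Return the first media-segment URI listed in an HLS playlist."""
    for line in manifest.splitlines():
        line = line.strip()
        if line and not line.startswith("#"):
            return line
    return None


def manifest_response(manifest):
    """Build (headers, body) for a manifest response, hinting the Edge to
    prefetch the first segment the player is about to request."""
    headers = {"Content-Type": "application/vnd.apple.mpegurl"}
    seg = first_segment(manifest)
    if seg is not None:
        # Tells the Edge what to fetch next (assumed here: a path
        # relative to the manifest URL).
        headers["CDN-Origin-Assist-Prefetch-Path"] = seg
    return headers, manifest


headers, body = manifest_response(MANIFEST)
print(headers["CDN-Origin-Assist-Prefetch-Path"])  # → seg-1.ts
```

By the time the player parses the manifest and asks for Seg-1, the Edge has already started pulling it from the Origin.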
What we're Announcing
Akamai now supports segment prefetch as a standard feature of Adaptive Media Delivery. It is available for live and on-demand use cases as well as the most popular streaming formats, including HLS, DASH, Smooth, and HDS. The feature requires assistance from an Origin that can support prefetch response headers. Beyond prefetching of video segments, Akamai also supports prefetching of variant playlists, keys, init-segments, etc.
Enabling Segment Prefetch can be done in as little as one click. For those who use Akamai's Media Services Live 4 as their live origin, the prefetch response headers are already supported, enabling prefetching with Adaptive Media Delivery. The functionality is available as a Property Manager behavior called "Segmented Media Streaming - Prefetch"; the image below shows how to turn on prefetching for Adaptive Media Delivery.
3rd Party Origin Integration
Third-party Origins are encouraged to integrate the Origin-Assist Prefetching functionality by following a simple spec documented on learn.akamai.com.
To find out how your Origin can integrate with Akamai's Origin-Assist Prefetching functionality, read more at https://learn.akamai.com/en-us/webhelp/adaptive-media-delivery/adaptive-media-delivery-implementation-guide/GUID-DA7F358F-AFEC-4546-B6DC-CD8225423C51.html
To set the appropriate prefetch response header and get the biggest benefit from prefetching, read more about the sequence of requests a player typically makes during a playback session.
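Because players request segments in order, one simple heuristic an Origin could use is: when serving segment N, hint the Edge to prefetch segment N+1. The sketch below illustrates this under an assumed `seg-<n>.ts` naming convention; real Origins should follow the path-derivation rules in the spec linked above.

```python
import re


def next_segment_path(path):
    """Given a segment path like /video/seg-3.ts, suggest /video/seg-4.ts
    as the prefetch hint. Returns None for non-segment requests."""
    m = re.search(r"(seg-)(\d+)(\.ts)$", path)
    if not m:
        return None
    return path[:m.start()] + m.group(1) + str(int(m.group(2)) + 1) + m.group(3)


def segment_headers(path):
    """Response headers for a segment request, with a prefetch hint for
    the segment the player will ask for next."""
    headers = {"Content-Type": "video/mp2t"}
    hint = next_segment_path(path)
    if hint is not None:
        headers["CDN-Origin-Assist-Prefetch-Path"] = hint
    return headers


print(segment_headers("/video/seg-3.ts")["CDN-Origin-Assist-Prefetch-Path"])
# → /video/seg-4.ts
```

A manifest-only request (no trailing `seg-<n>.ts`) simply gets no hint, so the Edge falls back to normal reactive fetching.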