Over the last 15 years, there's been a paradigm shift. Long gone are the days when you built and managed your own data center, taking on responsibility for the physical hardware, the management overhead, and the high capital investment of building and maintaining it all.
As soon as centralized cloud providers came onto the scene, it was easy to understand their value. No longer did you need to worry about upfront hardware capital investment or the management overhead of maintaining physical machines. For the first time, you could move to an elastic model and scale infrastructure up or down depending on the needs of your business.
Now, we're seeing the beginning of a third paradigm shift -- the move to developing on the edge. The edge has been an important part of any architecture for years, providing static content offload, optimizing images and videos, protecting against DDoS attacks, filtering bot traffic before it reaches origin, and much more.
This latest paradigm shift is driven by a desire to write code that solves use cases at the edge and, ultimately, to build edge-native applications. Let's take a look at the reasons behind this new paradigm shift:
- Many cutting-edge trends, such as IoT, require processing close to where data is created in order to realize their full potential
- Digital adoption has been pulled forward, so the edge has become even more important for delivering fast, reliable, scalable, and secure digital-native experiences
But will the edge replace the centralized cloud as the centralized cloud replaced data centers? Absolutely not! If anything, they complement each other. While the edge is the ideal location for latency-sensitive workloads where data can be processed as soon as it's created, the cloud is where data is kept for the long term and where heavy compute workloads like machine learning model training take place.
We've moved from a model where a CDN simply offloaded the origin to one where you implement each use case as a cloud-native application, an edge-native application, or a combination of the two. The requirements of each use case dictate the pros and cons of implementing it at the edge or in a centralized cloud.
The origin is no more -- our recent innovations have replaced it with a new architecture built on tiers of compute: the edge and the centralized cloud.
What makes an excellent edge-native application?
- Consistent end-user experience: Every user should have the same experience regardless of geographic location
- Low latency: Digital-native experiences and IoT use cases demand fast responses, so data needs to be processed as close to where it is created as possible
- Instant scale and reliability: With a single git push, you can achieve instant global scale and reliability
- Compliance: With more people online than ever before, it's imperative to protect user privacy and comply with government data regulations
A few weeks ago, we wrote about Akamai's 20 years of edge computing and outlined the edge-native products and applications we've developed to deliver faster and more secure web experiences. To do this, we've created the building blocks needed to build the most scalable, performant, reliable, secure, and compliant edge applications.
As we open up our platform, we will give you access to these same building blocks so you can build your own edge-native applications.
This week, we'll introduce our first step toward enabling the creation of edge-native applications, Akamai's serverless compute platform: EdgeWorkers + EdgeKV + DevTools.
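To make the model concrete, here is a minimal sketch of what an EdgeWorkers-style handler looks like. On the platform, event handlers such as `onClientRequest` are exported from an EdgeWorkers bundle and receive a request object with methods like `respondWith`; the mock request below is purely for local illustration and is not part of the platform API.

```javascript
// Sketch of an EdgeWorkers-style request handler. On Akamai's platform,
// onClientRequest would be exported from the bundle's main.js; here it is
// a plain function so the logic can be exercised locally with a mock.
function onClientRequest(request) {
  // Respond directly from the edge, without contacting an origin server.
  request.respondWith(
    200,
    { 'Content-Type': ['application/json'] },
    JSON.stringify({ message: 'Hello from the edge', path: request.path })
  );
}

// Local mock of the request object (illustration only, not the platform API).
const calls = [];
const mockRequest = {
  path: '/greeting',
  respondWith: (status, headers, body) => calls.push({ status, headers, body }),
};

onClientRequest(mockRequest);
console.log(calls[0].status, calls[0].body);
```

The key design point is that the handler runs at the edge location closest to the user, so latency-sensitive responses like this one never incur a round trip to a centralized origin.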
This is just the beginning: we'll continue to make it possible for you to move more and more latency-sensitive workloads to the edge.
By combining Akamai's serverless platform with Akamai's delivery platform and IoT Edge Connect platform, you have the ability to build the world's most scalable, distributed, reliable, performant, and secure edge-native applications.