In this article, Edge Research Group:
- Summarizes the newly released State of the Edge report.
- Highlights a key use case.
- Offers an end-user perspective on edge services.
Microsoft and Hewlett Packard Enterprise have recently made headlines by talking about investing billions in products and services for edge computing. But beyond eye-popping dollar figures, there’s a lot of activity in the edge computing market.
With that in mind, Edge Research Group and Structure Research partnered to produce the State of the Edge 2018 report with support from Vapor IO, Packet, Ericsson UDN, Arm and Rafay Systems. The report covers a lot of ground in our attempt to assess the impact of edge computing on technology providers, telecom operators, ISPs, and most significantly, on developers and end-users.
The inaugural State of the Edge report:
- Assesses the state of edge computing today.
- Discusses trends driving the development of the edge computing ecosystem of technologies.
- Illustrates practical architectures for its deployment.
- Hypothesizes how a marketplace for highly distributed compute, storage and network resources might function.
Focusing on the end-user perspective
One of the areas the report focuses on is defining terms, and then extending the implications of those definitions into the market. (The glossary is now an open source project under the stewardship of The Linux Foundation).
In defining what edge computing is, the report states:
Edge Computing: The delivery of computing capabilities to the logical extremes of a network in order to improve the performance, operating cost and reliability of applications and services.
This definition allows for edge services to exist at different layers that extend outward from the ‘core’ or ‘central’ cloud. A few key terms from the glossary that relate to our case study:
- The infrastructure edge refers to IT resources which are positioned on the network operator or service provider side of the last mile network.
- The access edge is the part of the infrastructure edge closest to the end user and their devices.
- The aggregation edge refers to a portion of the edge infrastructure which functions as a point of aggregation for multiple edge data centers deployed at the access edge sublayer.
The aggregation layer is useful in performance terms. At a high level, its purpose is to reduce the number of contact points to and from other entities or components in a web application architecture. A CDN, for example, is an entity that can act as an aggregation layer by providing a distributed infrastructure that caches content and performs functions on end user requests before passing them back to a core ‘origin’ infrastructure.
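As a rough illustration of that aggregation pattern (not taken from the report), the sketch below shows a minimal Go edge node that serves cached content directly and only goes back to the origin on a miss. The origin URL and the naive, unbounded in-memory cache are assumptions made purely for the example.

```go
// Minimal sketch of an "aggregation edge" node: it terminates many client
// requests locally and only contacts the origin on a cache miss. The origin
// URL and the unbounded in-memory cache are illustrative assumptions only.
package main

import (
	"io"
	"log"
	"net/http"
	"sync"
)

const origin = "https://origin.example.com" // hypothetical origin

var (
	mu    sync.RWMutex
	cache = map[string][]byte{} // request path -> cached response body
)

func edgeHandler(w http.ResponseWriter, r *http.Request) {
	if r.Method != http.MethodGet {
		http.Error(w, "sketch handles GET only", http.StatusMethodNotAllowed)
		return
	}

	mu.RLock()
	body, hit := cache[r.URL.Path]
	mu.RUnlock()
	if hit {
		w.Write(body) // served entirely from the edge, no origin round trip
		return
	}

	// Cache miss: one request goes back to the core "origin" infrastructure.
	resp, err := http.Get(origin + r.URL.Path)
	if err != nil {
		http.Error(w, "origin unreachable", http.StatusBadGateway)
		return
	}
	defer resp.Body.Close()
	body, err = io.ReadAll(resp.Body)
	if err != nil {
		http.Error(w, "bad origin response", http.StatusBadGateway)
		return
	}

	if resp.StatusCode == http.StatusOK {
		mu.Lock()
		cache[r.URL.Path] = body
		mu.Unlock()
	}
	w.WriteHeader(resp.StatusCode)
	w.Write(body)
}

func main() {
	http.HandleFunc("/", edgeHandler)
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```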
Use cases for edge computing
A number of use cases for edge computing are outlined in the report. One worth highlighting here is edge content delivery. Content delivery tends to get overlooked amid the excitement around topics such as autonomous driving, and is perhaps thought of as a problem already solved by CDN vendors. Those vendors do offer an increasing array of solutions, including secure network and application access, cloud-based security such as web application firewalls, and edge compute functions.
Where edge computing goes beyond traditional CDN services is in running customer-defined workloads in edge locations. Some of those workloads will use short-lived functions, while others will be stateful applications.
Our case study (not published in the State of the Edge report) helps illustrate the need for a new generation of edge content delivery services.
Challenge
The customer has a large-scale web and internet software business. Currently, applications are multi-tiered, with a request handling tier made up of high-availability proxy servers and load balancers that takes requests and routes them to a middle tier for further connection management chores before doling requests out to the application server tier.
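To make the tiering concrete, here is a minimal Go sketch of such a middle tier; it is ours, not the customer’s code. It accepts requests from the front proxies, handles connection-management chores by pooling keep-alive connections toward the application servers, and doles requests out round-robin. The backend addresses and pool sizes are illustrative assumptions.

```go
// Rough sketch of a middle tier: it accepts requests from the front proxies,
// reuses pooled connections to the application servers, and doles requests
// out round-robin. Backend addresses and pool sizes are hypothetical.
package main

import (
	"log"
	"net/http"
	"net/http/httputil"
	"sync/atomic"
	"time"
)

var appServers = []string{
	"app-1.internal:9000",
	"app-2.internal:9000",
	"app-3.internal:9000",
}

func main() {
	var next uint64

	// Connection-management "chores": keep-alive pooling toward the app tier.
	transport := &http.Transport{
		MaxIdleConns:        200,
		MaxIdleConnsPerHost: 50,
		IdleConnTimeout:     90 * time.Second,
	}

	proxy := &httputil.ReverseProxy{
		Director: func(req *http.Request) {
			// Pick the next application server round-robin.
			i := atomic.AddUint64(&next, 1)
			req.URL.Scheme = "http"
			req.URL.Host = appServers[i%uint64(len(appServers))]
		},
		Transport: transport,
	}

	log.Fatal(http.ListenAndServe(":8081", proxy))
}
```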
The “front edge” of the architecture is used for connection management, according to the customer. In internet services, higher performance establishes reliability and trust, driving more engagement with the consumer of the service. Performance thus becomes a key indicator of revenue. Security and availability have to be assured as well.
As more services are accessed on mobile devices, the challenge becomes how to deliver performance, security, and availability together. The devices can’t pre-cache all content; some of it is inherently personalized and context-driven.
Solution
CDN vendors have gone beyond static content delivery and are doing more around connection management, to the point where most offer DDoS and edge traffic protection (including bot management). Even so, these services are not easy to use, because the edge services themselves are not well defined, or at least are not defined in any standard way across the vendor landscape.
In the case of mobile access to services, one could move more application logic to an edge location near the user, but much of the statefulness of the application (i.e., stored data) still has to be accessed from a database, and the customer states that they don’t envision being able to distribute their databases in any meaningful way any time soon.
The solution, then, is to aggregate vast numbers of requests and route them to resources based on proximity. The experience, according to the customer, is rich and personalized, but it still has to be delivered in a “smart way” to stay within performance boundaries. That means paying attention to web application and content loading performance vectors such as time to first byte, the time to load content above ‘the fold’, the time it takes to fully render a page, and the like.
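Time to first byte is straightforward to instrument from the client side. As a small, hedged example (the URL is a placeholder, and above-the-fold and full-render timings would require browser-level instrumentation), a Go client can capture TTFB using the standard library’s httptrace hooks:

```go
// Measure time to first byte (TTFB) and total response time for one request.
// The URL is a placeholder used only for illustration.
package main

import (
	"fmt"
	"io"
	"log"
	"net/http"
	"net/http/httptrace"
	"time"
)

func main() {
	req, err := http.NewRequest(http.MethodGet, "https://www.example.com/", nil)
	if err != nil {
		log.Fatal(err)
	}

	start := time.Now()
	var ttfb time.Duration
	trace := &httptrace.ClientTrace{
		// Fired when the first byte of the response headers arrives.
		GotFirstResponseByte: func() { ttfb = time.Since(start) },
	}
	req = req.WithContext(httptrace.WithClientTrace(req.Context(), trace))

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		log.Fatal(err)
	}
	defer resp.Body.Close()
	io.Copy(io.Discard, resp.Body) // drain so total time covers the full body

	fmt.Printf("time to first byte: %v, full response: %v\n", ttfb, time.Since(start))
}
```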
Future plans and needs
“In companies like ours, even a 5% miss rate is expensive,” says the respondent, a miss being a request that isn’t filled by the middle tier and has to go back to the main (or origin) server for data.
What the customer would like to do is run their own logic on a CDN-like set of network and compute resources. “Then you have general purpose computing, and then you can solve all problems.”
The problem is that CDN and cloud vendors make only pre-determined functions available for use. Even with logic running in the nearest cloud data center, and even if a player like Amazon had thousands of locations, there is still a big challenge in managing, monitoring, and updating the code. Another challenge is storage persistence: the customer wants to be able to filter and aggregate data at the edge, but there is no way to store and process data at any point in the infrastructure edge.
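To illustrate what that edge-side filtering and aggregation might look like, here is a speculative Go sketch rather than anything the customer or a vendor ships today: events are filtered and counted at the edge, and only a compact summary is periodically pushed back to the origin. The event shape, the filter rule, and the origin endpoint are all assumptions.

```go
// Speculative sketch of edge-side filtering and aggregation: keep counters at
// the edge and push only a periodic summary to the origin. Event shape,
// filter rule, and origin endpoint are assumptions made for the example.
package main

import (
	"bytes"
	"encoding/json"
	"log"
	"net/http"
	"sync"
	"time"
)

type event struct {
	Kind  string `json:"kind"`
	Value int    `json:"value"`
}

var (
	mu     sync.Mutex
	counts = map[string]int{} // aggregated at the edge, not sent per-event
)

func ingest(w http.ResponseWriter, r *http.Request) {
	var e event
	if err := json.NewDecoder(r.Body).Decode(&e); err != nil {
		http.Error(w, "bad event", http.StatusBadRequest)
		return
	}
	if e.Value <= 0 { // filter out uninteresting events at the edge
		w.WriteHeader(http.StatusAccepted)
		return
	}
	mu.Lock()
	counts[e.Kind] += e.Value
	mu.Unlock()
	w.WriteHeader(http.StatusAccepted)
}

func flush() {
	for range time.Tick(30 * time.Second) {
		mu.Lock()
		summary, _ := json.Marshal(counts)
		counts = map[string]int{}
		mu.Unlock()
		// One small request back to the origin instead of thousands of events.
		resp, err := http.Post("https://origin.example.com/summaries",
			"application/json", bytes.NewReader(summary))
		if err != nil {
			log.Printf("flush failed: %v", err)
			continue
		}
		resp.Body.Close()
	}
}

func main() {
	go flush()
	http.HandleFunc("/events", ingest)
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```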
The customer would like to see what he calls a “dynamic compute edge” that combines a request edge (the first hop from device to infrastructure edge, which would have general purpose processing) along with a “function” edge that’s capable of a minimal level of processing. That minimal level would include intelligent routing that sends the request on the best route to the place where a table lookup can occur, for example.
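A toy sketch of that kind of “function edge” routing might look like the following; the region header and the region-to-endpoint map are purely hypothetical, and a real deployment would derive location from the access network rather than from a header.

```go
// Toy sketch of a "function edge": a minimal piece of logic that inspects a
// request and sends it toward the region best placed to do the table lookup.
// The header name and the region-to-endpoint map are hypothetical.
package main

import (
	"log"
	"net/http"
)

// Nearest lookup endpoint per coarse client region (illustrative values).
var lookupEndpoints = map[string]string{
	"us-east": "https://lookup-us-east.example.com",
	"us-west": "https://lookup-us-west.example.com",
	"eu":      "https://lookup-eu.example.com",
}

const fallback = "https://lookup-us-east.example.com"

func route(w http.ResponseWriter, r *http.Request) {
	// In practice the region would come from the access network or an
	// IP-to-location lookup; a header keeps the sketch self-contained.
	region := r.Header.Get("X-Client-Region")
	target, ok := lookupEndpoints[region]
	if !ok {
		target = fallback
	}
	// Send the request on toward the best lookup location via redirect; a
	// real function edge would more likely proxy it and hold the connection.
	http.Redirect(w, r, target+r.URL.RequestURI(), http.StatusTemporaryRedirect)
}

func main() {
	http.HandleFunc("/", route)
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```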
That kind of edge compute service would enable “whole new applications I haven’t thought of yet,” the customer said.
The State of the Edge 2018 report can be viewed and downloaded at stateoftheedge.com.