Each day, many of us are plagued by slow page and app loads as our devices fetch information across the Internet. Have you ever thought about where the weakest link in the chain is? Is it the device? Is it the carrier? Is it where the data is being retrieved from?
Depending on where an individual is located, two of those three factors can have a huge impact on user engagement. So, how and where can we make improvements?
Is decentralized architecture the answer? Maybe, but let’s talk through it.
Let’s give a little background on edge computing, so we are all on the same page. Edge computing, in its most basic form, is moving computation and data storage closer to the sources of data, rather than keeping them on a centralized server or in the Cloud. Edge computing is expected to improve response times and save bandwidth. For users, it typically brings faster, smoother and more consistent experiences. For enterprises and service providers, the benefits go even further: edge computing can bring low latency, high throughput, flexible and scalable expansion, improved data security, and real-time, cost-effective optimization and monitoring.
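To make the latency benefit concrete, here is a rough back-of-the-envelope sketch in Python. The distances and the fiber propagation speed are illustrative assumptions, not measurements, and it ignores queuing and processing time entirely.

```python
# Rough round-trip-time estimate: light in fiber travels at roughly 2/3 the
# speed of light, i.e. about 200 km per millisecond, so propagation RTT is
# approximately 2 * distance / 200. Distances are illustrative assumptions.

def estimated_rtt_ms(distance_km: float, km_per_ms: float = 200.0) -> float:
    """Propagation-only round-trip time in milliseconds (no queuing or processing)."""
    return 2 * distance_km / km_per_ms

for label, distance_km in [("centralized cloud region", 3000), ("nearby edge site", 50)]:
    print(f"{label:>25}: ~{estimated_rtt_ms(distance_km):.1f} ms propagation RTT")
```

Even this crude estimate shows why proximity matters: moving the data from a cloud region thousands of kilometers away to an edge site tens of kilometers away cuts the propagation floor from tens of milliseconds to well under one.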
To achieve these benefits, an edge computing architecture has to consider and include a number of restructured components compared to the centralized on-premise or Cloud architecture models. Naturally, different business models and traffic profiles, as well as different application infrastructures, lead to a variety of edge computing deployments across the industry. However, a typical edge computing onboarding would normally examine and incorporate: a minimal set of services, with the associated data and storage moved to the edge (edge intelligence); a decentralized, smaller-footprint data center or colocation infrastructure (edge infrastructure); connectivity capacity from the edge to the core (edge connectivity); and platforms that build, deploy and manage edge applications (edge orchestration). Together these usually cover compute, storage, data management, data analysis, micro-datacenter or colocation infrastructure, networking and more.
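As a way to visualize how those four groupings might hang together, here is a minimal, purely illustrative Python sketch. The component names simply mirror the categories above; they are assumptions for illustration and do not reflect any particular vendor's platform.

```python
from dataclasses import dataclass

@dataclass
class EdgeSite:
    """Illustrative grouping of the four edge building blocks discussed above."""
    name: str
    intelligence: tuple = ("local data store", "local analytics")      # edge intelligence
    infrastructure: tuple = ("micro-datacenter", "colocation rack")    # edge infrastructure
    connectivity: tuple = ("uplink to core", "peering on-ramp")        # edge connectivity
    orchestration: tuple = ("app build/deploy", "fleet management")    # edge orchestration

print(EdgeSite(name="metro-pop-01"))
```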
So, where does edge computing exist in the industry today? In which areas does it best help applications function over the Internet?
In 2020, the global edge computing market size was estimated at USD 4.68 billion, and it is expected to expand at a compound annual growth rate (CAGR) of 38.4 percent from 2021 to 2028. Edge computing is still in an early development phase, which means its architecture, implementation and operational models will keep changing and evolving, opening up growth opportunities as the technology matures.
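To get a feel for what that growth rate implies, the short Python sketch below simply compounds the stated 38.4 percent CAGR year over year from the USD 4.68 billion 2020 baseline; it is arithmetic on the quoted figures, not an independent forecast.

```python
# Compound the quoted 38.4% CAGR from the 2020 baseline of USD 4.68 billion.
base_2020_usd_bn = 4.68
cagr = 0.384

for year in range(2021, 2029):
    projected = base_2020_usd_bn * (1 + cagr) ** (year - 2020)
    print(f"{year}: ~USD {projected:.1f} billion")
```

Carried through to 2028, that compounding lands in the neighborhood of USD 60+ billion, which is why the growth figure draws so much attention.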
The COVID-19 pandemic and the new norm of working from home have driven the growth of edge computing and datacenters across the globe and across numerous business verticals. From an industry-vertical perspective, a few strong markets for global edge computing include Energy & Utilities, Industrial, Transport & Logistics, and IoT technologies spanning Smart Cities, Smart Homes and Smart Buildings.
Moreover, edge start-ups, along with hyperscalers, are pushing to develop more and more edge products and infrastructure, offered both by cloud vendors and from micro-edge datacenters.
Is the Internet heading into a new era of decentralized models with many more edge points?
With the growing edge computing market and all of its benefits in mind, let’s evaluate what a decentralized Internet would look like and what it would bring compared to today’s model. A decentralized model of the Internet and/or cloud services would mean distributing control and infrastructure among many parties, rather than centralizing them at the core. Telcos, hyperscalers and major consumers of edge computing in the cloud would need to move toward building small-scale, modular data centers that host storage, computing and networking in distributed facilities with their own power and cooling, and specifically with more on-ramps to the Internet. For hyperscalers and major cloud consumers of edge computing, this would drive the development of generalized compute services and storage platforms that let developers deploy routing, security or business logic to application-layer proxies at the edge points. For platform providers and telcos, it would require flexible peering strategies and capacity to allow cost-effective, throughput-efficient on-ramp connectivity and expansion at the edge points.
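To illustrate what "routing, security or business logic in an application-layer proxy at the edge" can look like in practice, here is a minimal, hypothetical Python sketch using only the standard library. Real edge platforms expose their own deployment APIs, so the handler, the path rules and the blocked-agent list below are stand-ins, not anyone's actual product.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical per-edge-site rules: which paths are served locally and which
# clients are rejected before traffic ever reaches the core. Illustrative only.
LOCAL_PATHS = ("/static", "/health")
BLOCKED_AGENTS = {"bad-bot"}

class EdgeProxyHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Security logic at the edge: reject unwanted clients early.
        if self.headers.get("User-Agent", "") in BLOCKED_AGENTS:
            self.send_error(403, "Blocked at edge")
            return
        # Routing logic at the edge: serve local paths without touching the core.
        if self.path.startswith(LOCAL_PATHS):
            body = b"served from edge site\n"
            self.send_response(200)
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
            return
        # Everything else would be forwarded upstream to the core/origin (not shown).
        self.send_error(502, "Would forward to core origin")

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), EdgeProxyHandler).serve_forever()
```

The point of the sketch is the placement, not the code: decisions that today happen deep in a centralized region get made at the first hop instead, which is exactly the shift a decentralized, edge-heavy Internet implies.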