TREND: DATA IS THE DRIVER
The wave is coming – and this time it’s getting bigger and faster!
We have all predicted trends and used plenty of buzzwords about the coming growth of our industry. What was missing, though, was proof of the applicability of those predictions. Now we have a real-world sample from the 2020 pandemic, and it shows that the data reality even exceeds expectations.
Research from TeleGeography predicted a CAGR in global data volumes of 45 percent on average from 2017 to 2024. Let's drill down and take my home base, Frankfurt, as an example. My neighbours at DE-CIX report average annual growth in data throughput of about 20 percent from 2016 to 2020. 20 percent vs. 45 percent: fine, but let's look at the details. Growing from an average of 2.5 Tbit/s to 4 Tbit/s (60 percent+) took three years (2016-2019). The jump from an average of 4 Tbit/s to 6 Tbit/s (another 50 percent+) took just 1.5 years (2019 to November 2020). And the reported all-time highs went from 6.8 Tbit/s in February 2019 to 10 Tbit/s on November 3, 2020. So volumes are growing, and the speed of growth itself is rising: the curve is exponential, and the prediction, in this case, becomes reality. Frankfurt is evidently not an outlier; similar statistics at other DE-CIX locations show the same trend.
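The acceleration is easy to sanity-check with a back-of-the-envelope calculation using the DE-CIX figures quoted above (the function name is illustrative, not from any source):

```python
def cagr(start, end, years):
    """Compound annual growth rate between two values."""
    return (end / start) ** (1 / years) - 1

# 2.5 -> 4 Tbit/s average over three years (2016-2019)
early = cagr(2.5, 4.0, 3)    # roughly 17% per year
# 4 -> 6 Tbit/s average over about 1.5 years (2019 - Nov 2020)
late = cagr(4.0, 6.0, 1.5)   # roughly 31% per year

print(f"2016-2019: {early:.1%} p.a., 2019-2020: {late:.1%} p.a.")
```

The annualized rate nearly doubles between the two periods, which is exactly the "growth speed is rising" point made above.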
We have now reached stage 4 of platform development. It started with traditional marketplaces (stage 1), moved to the sharing economy (stage 2), developed into cross-market, cross-functional data-driven ecosystems (stage 3), and now fully involves AI capabilities (stage 4). All of this paints a picture similar to the DE-CIX traffic example above. So we have indeed gone from prediction to reality in terms of data volumes, and global topics feed the hunger for data. What does this big wave mean for the digital infrastructure landscape?
IMPLICATION: DATA INFRASTRUCTURE UNDER PRESSURE
Riding the wave as well as you can – maintaining your balance.
Entering the infrastructure market is cost-intensive. And external pressures push the go-green button, too: Greenpeace ranks our industry roughly sixth on the global scale of power consumption. Let us use one of the latest research reports to examine data growth and the correlated power consumption. The global data center footprint currently accounts for three percent+ of global power demand; the German Environmental Agency has predicted that this share could rise to 20 percent+ over the next 15 years. At the same time, German energy consumption, for example, is targeted to be cut by 50 percent and sourced fully from renewables. Set against this data growth, even meaningful technological advancement leaves us needing far more than the efficiency gains Moore's law provides.
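A rough sketch of what those percentages imply, using only the figures quoted above (the function name is illustrative):

```python
def implied_annual_growth(share_now, share_future, years):
    """Annualized growth rate implied by a share moving between two levels."""
    return (share_future / share_now) ** (1 / years) - 1

# Data centers: ~3% of global power demand today, predicted 20%+ in 15 years
g = implied_annual_growth(0.03, 0.20, 15)  # roughly 13-14% per year

print(f"Implied annual growth of the data-center power share: {g:.1%}")
```

A share compounding at that pace, against a national consumption target that is shrinking, is why efficiency gains alone will not close the gap.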
Consequently, how can an investor make sure that, given the pace of technological demands, the physical infrastructure can keep pace or at least be a reliable home with decent levels of economic and ecological efficiency?
Contagi supports customers around the globe in evaluating these kinds of questions, and we come across interesting concepts across the entire value stack of traditional infrastructure delivery. The common denominator of all these concepts is, interestingly enough, the parallelism of topics.
PERSPECTIVE: DECENTRALIZED ADAPTIVE POP-UP DATA INFRASTRUCTURE
Splitting the wave and using its magnitude.
Let's change perspective from the operator to the consumer view. Summarizing the Uptime Institute's Annual Global Data Center Survey, a few key trends emerge: energy and processing demand is rising, efficiency is flatlining, operating risk is more cost-sensitive than ever, and clouds would help business more if they were more transparent and open.
With more than a third of survey participants having been exposed to a major IT outage within the last three years, and three quarters of those outages marked as preventable, there is clearly an operating-control issue for those accountable for infrastructure availability.
For those responsible for the commercial side, the challenge is dependency. Big monolithic structures, whether a single cloud or hyperscale provider or one big infrastructure provider, in most cases lead to a David-versus-Goliath scenario.
How do we connect end-user benefits with an optimal data infrastructure, one that bridges the demands for high scalability, reliability, and independence? Cross-functional thinking and a real-life example help to find the answer.