How do you view Data Gravity and why should enterprises take note of it?
Because we have been able to virtualize compute and storage, expand our capabilities, and interconnect everything, it is easy to lose sight of how important it is to handle data in the right way. Data gravity matters because, speaking from a physics point of view, data has mass. An organization can extract value from its data only if it can ensure that the data ends up in the right place at the right time.
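A loose way to picture the analogy is Newton’s law of gravitation (an illustrative sketch, not a formula from the interview; the symbols below are hypothetical labels):

$$F \propto \frac{m_{\text{data}} \cdot m_{\text{app}}}{d^{2}}$$

Here $m_{\text{data}}$ stands for the accumulated mass of a data set, $m_{\text{app}}$ for the applications and services that consume it, and $d$ for the effective distance between them, typically experienced as network latency. The intuition is that as data mass grows, so does the pull it exerts, which is why workloads tend to migrate toward large data stores rather than the other way around.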
Can you talk about the impact of data gravity on global financial capitals?
Traditionally, organizations did not have to worry about where data was located, simply because so much of the core enterprise infrastructure was on-prem and all of those data sources sat close by. If we think about how organizations have been transformed, digital transformation is really about expanding that enterprise ecosystem. Key data now has to be available in many different places.
Where there is a concentration of data creation, there will also be a need for better access to that data. Financial centers are critical because they are the source of so much of this data creation. Organizations are looking to leverage that data, and if they are going to do so by performing analytics alongside their own data, it is critical not only to be close to those data sources but also to get their own data close to where the analytics is happening.
How do you see data localization and data sovereignty laws affecting data gravity?
There has been increased awareness of the various privacy and compliance concerns around data, and organizations are starting to learn what that means from an operational perspective. You have to look not only at the operational concerns but also at how that data is being governed, and that is going to be a constraining factor. So dealing with issues of data sovereignty and privacy management has become far more critical for organizations.
What is the impact of Digital Realty’s recently released Data Gravity Index DGx™? How do you envision customers using it?
One of the challenges with data management is ensuring that organizations understand what is possible and can measure themselves against industry-standard benchmarks. Benchmarks that establish those norms help organizations see where they are and where they need to go.
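For a sense of how such a benchmark can be composed (a hedged sketch of the commonly cited formulation; the exact DGx methodology is Digital Realty’s own), a data gravity intensity score for a metro is often described as the product of data mass, data activity, and available bandwidth, discounted by latency:

$$\text{Gravity Intensity} \propto \frac{\text{Data Mass} \times \text{Data Activity} \times \text{Bandwidth}}{\text{Latency}^{2}}$$

Read this way, the index rewards locations where large, heavily used data sets sit close to high-bandwidth, low-latency interconnection, which is exactly where organizations placing their own data and analytics would want to be.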
ABOUT ERIC HANSELMAN
Eric Hanselman is the Principal Research Analyst at 451 Research, a part of S&P Global Market Intelligence. Hanselman coordinates analysis across the broad portfolio of 451 Research disciplines, with a hands-on understanding of a range of subject areas, including information security, networks, and semiconductors, and their intersection in areas such as SDN/NFV, 5G, and edge computing. He is a Certified Information Systems Security Professional, a VMware Certified Professional, and a member of 451’s Center of Excellence in Quantum Technologies.