
In Conversation with Tony Paikeday, Senior Director, AI Systems, NVIDIA

Digital Realty recently announced a collaboration with NVIDIA and Core Scientific to deploy the industry’s first Data Hub featuring NVIDIA DGX A100 systems at the Interxion Digital Docklands Campus in London. With access to this AI-ready infrastructure, businesses can rapidly deploy AI models in close proximity to their data sets around the world, through a new artificial intelligence platform-as-a-service (AI PaaS) solution developed specifically for data science teams.

How does data impact AI development?

Most enterprises make a huge investment in data science talent to build and deploy AI applications. But there is a real gap between hiring data science talent and actually building AI models that are deployable in a production environment; many of these models never make it into production. AI is fundamentally different in how it is conceived, prototyped, tested, trained at scale, and deployed. You are not just building a single, monolithic application, as in a traditional enterprise. Even once an AI model is deployed in production, you need a human in the loop to continually evaluate whether it is performing well, because models drift and degrade over time as they feed on real, live data from your operations.

Data is essentially the source code that builds great AI models, and data gravity, the tendency of large data sets to attract applications and resources toward them, is critical to AI development. Many enterprises do not realize that if significant time and distance separate critical data from the computing infrastructure that needs to work on it, they will immediately suffer the impact of data gravity.

Ian Ferreira, Chief Product Officer, AI, Core Scientific

What is the importance of data gravity for AI projects?

Many organizations lean on the cloud as a great way to engage in early, productive experimentation. The cloud is very good for making a fast start and for supporting what I would call temporal needs, the kind that power early prototyping and development, especially as your AI project is getting underway. Over time, your AI model inevitably becomes more and more complex through ongoing iteration. As you iterate and build a more complex model, it consumes more and more computing cycles. In parallel, the data sets that feed model training grow exponentially larger. This is the point at which your costs can escalate.

This is a fundamental data gravity problem that many organizations face, and it presents both a speed bump and an escalation in the cost of building AI. What ultimately happens is that the rate at which data science teams can build a better, higher-quality, more creative model starts to slow down while the costs rise, because they are spending more time on it. When that happens, the quality of the AI model you are trying to deliver suffers. That is the inflection point at which many organizations realize there is a benefit to a fixed-cost infrastructure that supports rapid iteration at the lowest cost per training run. But how do you get there? You get there by moving your computing infrastructure to where your data lives. This is why we think the architecture and offering put together by Digital Realty and NVIDIA are so valuable. You are eliminating the time and distance between your data sets and the compute that works on them. You are also regaining control of your costs, because you now have a highly deterministic platform that delivers incredibly fast performance in a predictable way.


ABOUT TONY PAIKEDAY

Tony Paikeday is Senior Director of AI Systems at NVIDIA, responsible for the go-to-market for NVIDIA’s DGX portfolio of AI supercomputers. Paikeday helps enterprise organizations infuse their businesses with the power of AI through infrastructure solutions that enable insights from data. He has also held key roles at VMware, where he was responsible for bringing desktop and application virtualization solutions to market, and at Cisco, where he built its data center solutions. Paikeday, who started his career as a manufacturing engineer at Ford Motor Company, holds an engineering degree from the University of Toronto.