The Impact Of AI On Digital Infrastructure

How AI is going to change the future of data centers

Artificial intelligence (AI) has become an integral part of our lives, from the way we communicate to the way we work. AI has revolutionized the way we process and analyze data, and has enabled us to make more informed decisions. One of the most important developments in AI is the advent of large language models (LLMs). LLMs are neural networks that have a massive number of parameters, allowing them to learn complex patterns in language. These models are pre-trained on large amounts of data and can be fine-tuned for specific tasks. They have been used for a variety of applications, including language translation, text generation, and question-answering.

Transformers are a key development in language modeling. They are an architecture built around the idea of attention, which makes it possible to process long sequences by focusing on the most relevant parts of the input. Transformers have had a profound impact on natural language processing and machine learning. Their ability to handle long-range dependencies, facilitate transfer learning, generate language, and support multilingual tasks has propelled them to the forefront of cutting-edge research and applications.
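The attention idea above can be sketched in a few lines. This is a minimal, illustrative NumPy implementation of scaled dot-product attention (the core operation inside a transformer); the function names and the toy dimensions are my own, not from the article.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: subtract the max before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    # Each query is compared against every key; the resulting weights
    # decide how much of each value flows into the output. This is how
    # the model "focuses on the most relevant parts of the input".
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)     # (seq_q, seq_k) similarity matrix
    weights = softmax(scores, axis=-1)  # each row sums to 1
    return weights @ V                  # weighted mix of the values

# Toy self-attention: queries, keys, and values all come from one sequence.
rng = np.random.default_rng(0)
seq_len, d_model = 4, 8
x = rng.normal(size=(seq_len, d_model))
out = scaled_dot_product_attention(x, x, x)
print(out.shape)  # one d_model-sized output vector per input position
```

Because every position attends to every other position in one matrix multiply, the computation parallelizes well, which is exactly why the specialized chips discussed below matter so much.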

Zaid Kahn, Vice President of Cloud AI & Advanced Systems Engineering, Microsoft

For transformers to be successful, powerful chips are essential. CPUs cannot efficiently run highly parallelized deep learning models, so AI chips with parallel computing capabilities are increasingly in demand. Across the industry, AI accelerators and GPUs are shipping at a TDP of 700 W per chip, and we will easily see this rise to 1 kW per chip in the near future. AI workloads, including generative AI, require increased computing power along with greater memory bandwidth and capacity, which has driven growing demand for AI-specific hardware and investment in data center infrastructure. An essential element of AI is hardware/software co-design. In the general-purpose CPU world, developers would wait for a predictable hardware roadmap with generational clock-speed improvements and build applications around those constraints. In AI, the model is flipped: software and model requirements are closely considered before the hardware is designed, and the efficiency of the software and models is tightly tied to that hardware.
