With the seemingly sudden emergence of large-scale commercial artificial intelligence (AI) over the past year, especially new generative AI applications like ChatGPT, data centers have become even more central to our technological and connected societal future. Previously, high-performance, compute-hungry drivers of data center traffic, such as streaming, gaming, and ever-faster mobile data access, have spurred innovation in data center design to accommodate increased power density demands. Now, AI is pushing that process even further, and faster.
The infrastructure needed to support the AI economy requires more power, data, and bandwidth than ever before, and with the rapid rate of adoption from small businesses to large-scale organizations, these infrastructure demands are only going to increase. Plus, scaling up infrastructure isn’t exactly a simple or straightforward process. So, the question remains: How can facilities that may have been built 20 years ago adapt to support such an influx?
The answer, like many aspects of evolving technology, lies in the data center industry’s ability to be agile.
Recalibrating existing space
Despite strong and continued development in regions around the world, the fight for data center space and capacity is becoming an increasingly uphill battle against demand. To offset logistical challenges like supply issues, construction delays, and space constraints, existing data centers are evolving to be highly modular and flexible. From the outside, they may look like massive brick-and-mortar buildings (quite literally) set in stone, but on the inside—from individual floors to individual sections and even to individual racks—they can be adjusted for changes in network topology, airflow considerations, liquid cooling and physical redundancy.
This agility is critical when customer requirements can change quickly in the face of widespread AI deployments. For example, only a year ago a data center operator may have been able to plan for an average power draw of 10 kilowatts (kW) per rack of customer equipment; now the need for increasingly large blocks of 25 kW, 50 kW, or even 100 kW racks at different places across that same facility will only continue to grow.
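To make that shift concrete, here is a minimal capacity-check sketch. The block capacity, rack densities, and the 1.4 power usage effectiveness (PUE) figure are illustrative assumptions for demonstration, not figures from this article or any particular operator:

```python
# Hypothetical capacity check for one modular power block.
# Rack densities, block capacity, and the 1.4 PUE figure are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class RackGroup:
    count: int          # number of racks in this group
    kw_per_rack: float  # average IT power draw per rack, in kW

def block_fits(groups: list[RackGroup], block_capacity_kw: float, pue: float = 1.4) -> bool:
    """Return True if the block's feed can carry the IT load plus cooling/distribution overhead."""
    it_load_kw = sum(g.count * g.kw_per_rack for g in groups)
    facility_load_kw = it_load_kw * pue  # cooling, power distribution losses, etc.
    return facility_load_kw <= block_capacity_kw

# Legacy plan: forty 10 kW racks. New request: a mix of 25 kW and 50 kW AI racks.
legacy = [RackGroup(count=40, kw_per_rack=10)]
ai_mix = [RackGroup(count=8, kw_per_rack=50), RackGroup(count=10, kw_per_rack=25)]

for label, groups in (("legacy", legacy), ("AI mix", ai_mix)):
    print(label, "fits a 600 kW block:", block_fits(groups, block_capacity_kw=600))
```

Running this, the legacy plan fits comfortably while the AI mix does not, which is exactly why operators are re-sizing power blocks rather than assuming a uniform per-rack budget across the floor.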
Refining core customer needs
Beyond the standard business concerns of cost and value, there are five core components of any relationship between a data center and the customers it houses:
- Bandwidth
- Ecosystem of service providers
- Power draw
- Resiliency
- Redundancy
With AI creating additional demands in all of these areas, data centers are adopting more adaptable layouts and repurposing spaces to allow for the installation of more network circuits, switches, and routers. Over time, this will boost network bandwidth and enable different areas of the data center to be adapted for new deployments.
Additionally, large, heavy cabling and advanced combined air and liquid cooling systems (which support more power draw and better resiliency and redundancy) are more easily installed in sections, which means data centers are naturally moving toward a modular power configuration.
In this configuration, the data center is conceptualized as a series of blocks or rooms, each with its own supporting power, backup, and cooling infrastructure, rather than as one large unit. That means any of these core components can be refined in size and scope based on individual tenant deployment needs, even when a facility accommodates many diverse customers with varying application needs at different times.
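One way to picture the block-by-block view is as a simple data model in which each block carries its own power, cooling, and redundancy attributes, and tenants are matched to blocks rather than to the facility as a whole. The field names, cooling types, and redundancy labels below are assumptions made for illustration, not a real operator's schema:

```python
# Illustrative sketch only: field names, cooling types, and redundancy labels
# are assumptions for demonstration, not an actual operator's data model.

from dataclasses import dataclass

@dataclass
class Block:
    name: str
    capacity_kw: float  # usable IT power for the block
    cooling: str        # e.g. "air", "liquid", "hybrid"
    redundancy: str     # e.g. "N", "N+1", "2N"

@dataclass
class TenantRequest:
    required_kw: float
    needs_liquid_cooling: bool
    min_redundancy: str

REDUNDANCY_RANK = {"N": 0, "N+1": 1, "2N": 2}

def matching_blocks(blocks: list[Block], req: TenantRequest) -> list[Block]:
    """Return blocks that can host the tenant without touching the rest of the facility."""
    return [
        b for b in blocks
        if b.capacity_kw >= req.required_kw
        and (not req.needs_liquid_cooling or b.cooling in ("liquid", "hybrid"))
        and REDUNDANCY_RANK[b.redundancy] >= REDUNDANCY_RANK[req.min_redundancy]
    ]

blocks = [
    Block("Hall A", capacity_kw=400, cooling="air", redundancy="N+1"),
    Block("Hall B", capacity_kw=900, cooling="hybrid", redundancy="2N"),
]
ai_tenant = TenantRequest(required_kw=650, needs_liquid_cooling=True, min_redundancy="N+1")
print([b.name for b in matching_blocks(blocks, ai_tenant)])  # -> ['Hall B']
```

The point of the sketch is the granularity: a high-density AI tenant lands in the one block built for it, while the rest of the facility continues serving lower-density customers unchanged.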
Housing the future of AI
These are just a few examples of how a modular approach to data center design helps ensure that AI deployments, even at very high rack densities, can be supported in a performant, robust, and cost-effective fashion within an existing facility. Data centers' ability to adopt this agile view of internal infrastructure will be the difference between supporting current and future generations of AI deployments in existing sites and having to rely on building entirely new infrastructure. Sustainability, too, is of ever-increasing importance, and maximizing the capabilities of existing sites by prioritizing efficient reconfiguration over brand-new builds puts the industry in a much greener place.

If data centers play their cards right, they will be not only the home of the AI revolution but also its backbone. The biggest challenge, and one that will evolve as rapidly as the technology itself, will be ensuring compliance with regulations that remain moving targets. New demands and new parameters mean new boundaries, and it is up to data centers to keep supplying the foundation of support needed to withstand such an impactful, tumultuous, and unpredictable level of expansion.