Welcome to the fourth Interconnection Insights article, and my final contribution to the series. I’ll soon be passing the baton to colleagues with expertise in other key areas of the data center industry.
In this article, I discuss three ways that AI is impacting data centers. You might be surprised that I will not focus on data center performance as it relates to data transfer or workloads, but instead on how AI can help data center providers address broad industry challenges: responding to the demand for more facilities, improving operational support, and implementing liquid cooling.
Taking on the Supply and Demand Imbalance
Did you know that many data center providers’ core business is real estate, operating as real estate investment trusts, or REITs? Data centers absolutely are hubs for interconnection and places where servers crunch data for hosted applications and private clouds. However, those things happen only after a facility is built, a years-long process that involves everything from buying land to permitting and construction to commissioning.
That build time matters because demand for data center services jumped in 2021, driven primarily by accelerated cloud adoption and AI. The surge led to increased construction starts, availability constraints in primary markets, and customers colocating in secondary markets. All signs point to even greater demand in the future.
Here’s one way AI can be used to accelerate data center construction: The first phase of building a data center includes developing a “30 percent spec,” a general drawing of the proposed facility that takes about 90 days to produce. A company that CoreSite engages for that service is training an AI model on construction data so that generating a 30 percent spec could take a handful of days instead of months.
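For readers curious what “training an AI model with construction data” can mean in practice, here is a minimal, purely hypothetical sketch, not the vendor’s actual system: a toy model fitted to invented historical project attributes, which then drafts early design parameters for a proposed site. Every feature name and number below is made up for illustration.

```python
# Purely hypothetical sketch: a toy model fitted to invented historical project
# data to draft early design parameters. This is not CoreSite's or its design
# partner's actual system; real 30 percent specs involve far richer inputs.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Invented features for past projects: site acreage, target IT load (MW),
# rack power density (kW per rack), and a climate zone index.
past_projects = np.array([
    [12.0, 24.0,  8.0, 3],
    [20.0, 48.0, 12.0, 2],
    [ 8.0, 16.0,  6.0, 4],
    [30.0, 72.0, 15.0, 1],
])

# Invented targets: building footprint (square feet) and generator count.
design_outputs = np.array([
    [150_000, 12],
    [280_000, 24],
    [100_000,  8],
    [420_000, 36],
])

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(past_projects, design_outputs)

# Draft design parameters for a proposed 18-acre, 40 MW site.
proposed_site = np.array([[18.0, 40.0, 10.0, 2]])
footprint, generators = model.predict(proposed_site)[0]
print(f"Estimated footprint: {footprint:,.0f} sq ft, generators: {generators:.0f}")
```

The point is not the specific algorithm; it is that a model trained on past projects can give designers a defensible starting point in days rather than months, which engineers then review and refine.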