AI data centers will drive $583B in IT, networking, and energy opportunities by 2030
Data centers built to support AI will drive new business opportunities in IT, networking, and energy.
According to a new CIR study, Networks and Power Requirements for AI Data Centers: A Ten Year Market Forecast and Technology Assessment, AI will drive $583 billion in opportunities by 2030.
The research firm said that by 2030, firms such as Google, Microsoft, and OpenAI will spend $420 billion on networking, storage, and server products for their AI data centers, based on new technologies such as ultra-fast Ethernet, co-packaged optics (CPO), and silicon photonics.
Energy will also play a key role during the forecast period: CIR expects another $162 billion to be spent on electrical power and data center cooling. Within a few years, the firm predicts, much of the power for AI data centers will come from small modular (nuclear) reactors, while cooling will shift from industrial air conditioning to liquid cooling.
Despite the potential in these industry segments, CIR's President, Lawrence Gasman, cautions that latency remains a key issue.
"Whether AI data centers realize their full revenue potential depends on if latency can be held in check," he said.
Gasman added that edge networking, high-bandwidth technologies, and locating data centers close to large users can all reduce latency. If latency can be "cut through," he said, IT, networking, and energy-related revenues from AI data centers will reach $915 billion by 2034.
Sean Buckley