
Powering intelligence: How AI training and inferencing are reshaping data center demands


By Venkataraman Swaminathan, Vice President, Secure Power Division, Schneider Electric Greater India

The IT industry is in the middle of what is arguably one of its most defining shifts to date, driven by the explosive growth of generative AI. These powerful language models and tools are pushing the limits of traditional data center infrastructure, but the upgrades operators need to prioritize depend largely on which type of workload they are running: training or inference.

Training an AI model consumes enormous amounts of power (often more than 100kW per rack) and requires advanced cooling and electrical designs. AI inferencing workloads, on the other hand, were once considered less demanding but are evolving rapidly and becoming more complex. Inference has spread to a variety of environments: the cloud, colocation facilities, on-premises data centers, and even the edge. It is at this inference stage where the real business value ...

