Smaller, more sustainable data centers might be more practical. Here's why
One compelling narrative shaped by Big AI has been that achieving AI dominance, whether over China or over competitors, requires ever-larger AI data factories. Powering these requires ever-larger power plants and transmission lines, with the costs and environmental impacts likely passed on to ratepayers and neighbors. Indeed, the current Trump administration recently invited Big AI to a White House energy pledge photoshoot.
Against this backdrop, Qoob CEO AJ Javan believes that smaller is the better bet, both in terms of projected market size and of systems engineering.
The shift toward smaller, distributed data centers near population centers is driven by a fundamental change in AI consumption: the transition from training-heavy workloads to an inference-dominant market. Inference is projected to account for the majority of all AI compute demand by the end of this year. Large remote GPU farms face growing power-grid bottlenecks and long interconnection queues. Smaller ...
Copyright of this story belongs solely to diginomica.com.

