It's no secret that the GPUs used to train and run generative AI models are power-hungry little beasts. In light of this explosive growth, some have warned that AI infrastructure could account for a substantial share of the nation's electricity generation by the end of the decade.
Chris Sharp, CTO at colocation provider Digital Realty, is well aware of the challenges associated with accommodating these workloads. Compared to the servers that run traditional workloads, such as virtual machines, containers, storage, and databases, hardware-accelerated AI is a different animal. A single rack of GPU servers today can easily consume 40 kW or more, and next-gen rack-scale systems from Nvidia and others will push those densities higher still.
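To see how quickly a GPU rack reaches that 40 kW figure, here is a minimal back-of-envelope sketch. All of the per-GPU wattage, server count, and overhead numbers below are illustrative assumptions for a dense training rack, not vendor specifications.

```python
# Back-of-envelope rack power estimate. Every figure here is an
# illustrative assumption, not a vendor spec.
GPU_TDP_W = 700          # assumed draw of one high-end training GPU
GPUS_PER_SERVER = 8      # assumed GPUs per server
SERVERS_PER_RACK = 4     # assumed servers per rack
OVERHEAD_FACTOR = 1.8    # assumed multiplier for CPUs, NICs, fans, PSU losses

gpu_power_kw = GPU_TDP_W * GPUS_PER_SERVER * SERVERS_PER_RACK / 1000
rack_power_kw = gpu_power_kw * OVERHEAD_FACTOR

print(f"GPU draw alone: {gpu_power_kw:.1f} kW")        # 22.4 kW
print(f"Estimated rack total: {rack_power_kw:.1f} kW") # 40.3 kW
```

Even before counting cooling at the facility level, the accelerators alone account for more than half the rack's draw under these assumptions, which is why AI halls are engineered so differently from traditional ones.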
According to Sharp, accommodating these demanding systems at scale isn't easy and requires a different way of thinking about datacenter power and cooling, which you can learn more about in this interview with him. It's possible datacenters could end up looking wildly different: Sharp suggests small nuclear reactors and other forms of primary onsite power generation may play a role. ®