An increasing number of power utilities are unable to meet current data center power demands. The rapid adoption of artificial intelligence and generative AI is compounding the issue. Power availability could thwart AI’s potential.
Drew Robb, writing for TechRepublic Premium, looks at how onsite power is growing in popularity as a way to address this issue.
Featured text from the download:
Modern processors require far more power than ever, and the hardware and software stacks built around them are designed to consume enormous amounts of energy. On top of that foundation, the LLMs that underpin gen AI must analyze billions of parameters in near real time, which consumes power at record levels.
According to Uptime Institute, data centers are being built in unprecedented numbers at far higher compute densities. Average power densities per rack in 2010 were in the 4–5 kW range, and by 2020, they had doubled to 8–10 kW per rack. In 2024, 26% of data centers reported that they included some racks with densities in the 20–29 kW range or higher. Four percent said they had racks of 50 kW or greater. Mike Andrea, executive vice president of Oper8 Global, said several of his customers are demanding 80 kW to 200 kW per rack.
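To put those densities in perspective, here is a rough back-of-envelope sketch (not drawn from the download itself) of how per-rack density translates into total facility load. The rack count and utilization factor are purely illustrative assumptions; only the per-rack kW figures come from the numbers quoted above.

```python
# Hypothetical facility: 500 racks, 80% average draw versus nameplate density.
RACKS = 500            # assumed rack count (illustrative)
UTILIZATION = 0.8      # assumed average draw vs. nameplate density (illustrative)

# Per-rack densities (kW) based on the figures cited above
densities_kw = {
    "2010 average (4-5 kW)": 4.5,
    "2020 average (8-10 kW)": 9.0,
    "2024 high-density racks (20-29 kW)": 25.0,
    "AI racks cited by Oper8 Global (80-200 kW)": 140.0,
}

for label, kw_per_rack in densities_kw.items():
    total_mw = RACKS * kw_per_rack * UTILIZATION / 1000  # kW -> MW
    print(f"{label}: ~{total_mw:.1f} MW of IT load")
```

Even under these conservative assumptions, the same 500-rack footprint jumps from a few megawatts at 2010 densities to tens of megawatts at the densities now being requested for AI workloads, which is why utility capacity becomes the constraint.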
Boost your knowledge with our in-depth nine-page PDF. This is available for download at just $9. Alternatively, enjoy complimentary access with a Premium annual subscription.
TIME SAVED: Crafting this content required 18 hours of dedicated writing, editing, research, and design.