Data Centers’ Doubling Power Demand Seen Stressing Energy Grids – EE Times


An expected doubling in power consumption by the world’s data centers over the next few years will strain the capacity of electricity suppliers, according to experts who spoke with EE Times. Without improvements in data center efficiency, those power constraints could impede the expansion of AI.

Electricity demand from data centers, AI and cryptocurrency miners will surge by 2026, the Paris-based International Energy Agency (IEA) said in a January report. After consuming an estimated 460 terawatt-hours (TWh) worldwide in 2022, data centers could more than double their total energy intake to 1,000 TWh by 2026—roughly equivalent to the electricity consumption of Japan, according to the IEA. Updated regulations and technological improvements, including efficiency gains, are crucial to slowing the surge, the report said.
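For context, those two endpoints imply an average annual growth rate of roughly 21%. This is a back-of-the-envelope check on the figures cited above, not a number from the IEA report:

```python
# Implied compound annual growth rate of worldwide data-center
# electricity demand, using the IEA figures cited above.
start_twh = 460    # estimated consumption in 2022
end_twh = 1_000    # projected consumption in 2026
years = 2026 - 2022

cagr = (end_twh / start_twh) ** (1 / years) - 1
print(f"Implied growth rate: {cagr:.1%} per year")  # about 21% per year
```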

The electricity demand of a large data center—like some currently under construction in the U.S.—will be roughly equivalent to that of a city of one million people, according to Philip Krein, an electrical engineering professor at the University of Illinois, Urbana-Champaign.

University of Illinois Professor Philip Krein (Source: University of Illinois/Philip Krein)

“If you asked me a year or two ago, I would say a typical data center is going to a scale of 10 or 20 or 30 megawatts [MW],” Krein told EE Times. “That’s like an industrial park. We heard from a designer who’s been tasked by an industry client to design a 500 MW datacenter. That’s not an industrial park. That’s a significant city. The real concern is some of these data centers want to be able to ramp up and down arbitrarily. If you’re running a 500 MW plant and you’re allowed to ramp it up and down, zero to 100%? You can’t. By the time you get to 500 MW, the challenges are on a very different scale than we’ve seen in the past.”


Of the world’s more than 8,000 data centers, about 33% are in the U.S., 16% in Europe and 10% in China, according to the IEA. U.S. consumption will increase from around 200 TWh in 2022, or about 4% of U.S. electricity demand, to almost 260 TWh in 2026 to account for 6%. Increased adoption of 5G networks and cloud-based services will drive growth.

Few dare to forecast the energy consumption of data centers, cryptocurrency mining rigs or gaming devices. One complication is the so-called “rebound effect”: the steadily improving efficiency of electronic products like computers and smartphones, along with the more recent development of AI models, fuels an explosion in demand for data center services, which in turn boosts demand for electricity.

In a typical data center, computing accounts for 40% of electricity demand, the IEA said. Cooling the server racks to maintain stable processing efficiency consumes another 40%, and peripheral IT equipment accounts for the remaining 20%.
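Applied to a facility of the 500-MW scale Krein describes (the facility size is borrowed from his example purely for illustration), the IEA breakdown works out as follows:

```python
# IEA's typical data-center load breakdown applied to a 500 MW
# facility (facility size taken from Krein's example for illustration).
total_mw = 500
shares = {"computing": 0.40, "cooling": 0.40, "peripheral IT": 0.20}

loads_mw = {name: total_mw * share for name, share in shares.items()}
for name, mw in loads_mw.items():
    print(f"{name}: {mw:.0f} MW")  # computing and cooling 200 MW each, peripheral IT 100 MW
```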

In the U.S., the Energy Act of 2020 requires the federal government to study the energy use of data centers and promote their efficiency. The Department of Energy is supporting domestic semiconductor production and funding the development of more efficient chips that cut cooling requirements. Virginia, home to the world’s largest concentration of data centers, has imposed requirements for better sustainability practices and carbon-emission reductions.

“If we have 10 or 12 or 15 places in the next year talking about 500 MW datacenters, it won’t be long before they’re getting pushback from regulators and utilities saying, ‘You just can’t do this,’” Krein said. “You can’t take a city-sized load and drop it in any particular place. Folks are going to have to figure out how to spread this out more regionally rather than trying to build these incredibly intense point sources.”

Saving energy

About 10% of data centers’ total terawatt-hour budget could be saved through improvements in the thermal efficiency of semiconductors in power equipment, according to Adam Khan, founder of Diamond Quanta, a California startup developing diamond-based semiconductors.

“Diamond is much better than silicon carbide just from the thermal standpoint,” Khan told EE Times. “With our enablement of N- and P-type doping on simple power supply units, we can start to actually make a huge dent in terms of this budget. We’re not looking at replacing AI chips from Nvidia, but we’re looking at the power supply to these centers.”

Taking a different approach, Krein said data centers need a DC distribution system to cut energy demand.

“Instead of using conventional three-phase 60 Hz AC, some places actually rectify it right at the service entrance and then distribute, for example, 400-volt or +/- 350-volt DC right to the racks,” Krein said. “The intent is to cut down on a couple of layers of power conversion and save a fair amount of energy. You’re probably reducing the overall building power consumption on the order of 10%.”
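The arithmetic behind that saving is cascaded conversion efficiency: overall efficiency is the product of every stage’s efficiency, so eliminating a stage helps directly. The stage counts and per-stage efficiencies below are illustrative assumptions, not figures from Krein:

```python
import math

# Cascaded conversion efficiency: the overall figure is the product
# of the per-stage efficiencies. Stage lists and values are
# illustrative assumptions, not measurements.
ac_chain = [0.97, 0.96, 0.95, 0.96]  # e.g. UPS, transformer, rack PSU, point-of-load
dc_chain = [0.97, 0.96, 0.96]        # e.g. front-end rectifier, rack DC-DC, point-of-load

eff_ac = math.prod(ac_chain)
eff_dc = math.prod(dc_chain)
print(f"AC chain: {eff_ac:.1%}, DC chain: {eff_dc:.1%}")
print(f"Relative improvement: {(eff_dc - eff_ac) / eff_ac:.1%}")
```

With these assumed numbers, dropping a single 95%-efficient stage improves delivered power by about 5%; reduced cooling of the avoided conversion losses would contribute further toward the roughly 10% Krein cites.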

Boosting supply

In addition to efficiency improvements, data center operators are looking for ways to increase electricity supply, including small nuclear power plants located near the data centers.

Most of the electricity suppliers in the U.S. are highly regulated and cannot increase generation capacity without approval, Krein said.

“You worry about compliance and reliability and everything else,” he added. “It’s real hard for a utility to say, ‘Sure. 500-MW load. Come on down. We’ll build another plant’.”

Some data center operators have set up their own generation facilities to circumvent state regulatory issues, Krein said.

OpenAI CEO Sam Altman made headlines in recent weeks as the chairman of startup Oklo, which aims to build its first small modular reactor (SMR) in Idaho—potentially using nuclear power for the data centers that OpenAI and rival companies need to run AI training, inference and services. While land-based SMRs are in the early stages of development around the world, military submarines have used them for decades.

Krein said land-based SMRs will not start operating in time to prevent an energy crunch from emerging in the next few years.

“That technology is 10 or 20 years away, so that can’t somehow save us over the next five or 10 years of build out here,” he said. “People should put the magic bullet out of their minds.”


