Wednesday, October 16, 2024

Why Big Tech is turning to nuclear to power its energy-intensive AI ambitions


The OpenAI app icon displayed along with other AI applications on a smartphone. Jonathan Raa | Nurphoto via Getty Images

Technology giants are turning to nuclear energy to power the energy-intensive data centers needed to train and run the massive artificial intelligence models behind today’s generative AI applications.

Microsoft and Google are among the firms striking deals to purchase nuclear power from suppliers in the U.S. to bring additional energy capacity online for their data centers.

This week, Google said it would purchase power from Kairos Power, a developer of small modular reactors, to help “deliver on the progress of AI.”

“The grid needs these kinds of clean, reliable sources of energy that can support the build out of these technologies,” Michael Terrell, senior director for energy and climate at Google, said on a call with reporters Monday.

“We feel like nuclear can play an important role in helping to meet our demand, and helping meet our demand cleanly, in a way that’s more around the clock.”

Google said its first nuclear reactor from Kairos Power would be online by 2030, with more reactors going live through 2035.

The tech giant isn’t the only firm looking to nuclear power to realize its AI ambitions. Last month, Microsoft signed a deal with U.S. energy firm Constellation to restart a dormant reactor at the Three Mile Island nuclear power plant in Pennsylvania, which has been offline for five years.

The Three Mile Island plant was the location of the most serious nuclear meltdown and radiation leak in U.S. history in March 1979, when the loss of water coolant through a faulty valve caused a reactor to overheat.

Why they’re turning to nuclear

Global electricity consumption from data centers, artificial intelligence and the cryptocurrency sector is expected to double from an estimated 460 terawatt-hours (TWh) in 2022 to more than 1,000 TWh in 2026, according to a research report from the International Energy Agency.
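Taken at face value, those IEA figures imply data center electricity demand growing at roughly 20% a year. Below is a minimal Python sketch of that back-of-the-envelope calculation; the 460 TWh and 1,000 TWh figures and the 2022-2026 window come from the report cited above, and the function name is purely illustrative.

# Back-of-the-envelope check of the IEA projection cited above:
# 460 TWh in 2022 growing to more than 1,000 TWh by 2026.
def implied_cagr(start_twh: float, end_twh: float, years: int) -> float:
    """Compound annual growth rate implied by two consumption estimates."""
    return (end_twh / start_twh) ** (1 / years) - 1

print(f"Implied annual growth: {implied_cagr(460, 1000, 2026 - 2022):.1%}")
# prints roughly 21.4% per year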

Researchers at the University of California, Riverside, published a study in April last year that found ChatGPT consumes 500 milliliters of water for every 10 to 50 prompts, depending on when and where the AI model is deployed. That equates to roughly the amount of water in a standard 16-ounce bottle.
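That estimate works out to somewhere between 10 and 50 milliliters of water per prompt. A short sketch of the conversion, assuming only the figures quoted above:

# Converts the UC Riverside estimate quoted above (about 500 ml of water
# per 10 to 50 ChatGPT prompts) into a rough per-prompt range.
WATER_ML = 500                      # water per batch of prompts, per the study
PROMPTS_LOW, PROMPTS_HIGH = 10, 50  # batch size varies with when and where the model runs

print(f"Roughly {WATER_ML / PROMPTS_HIGH:.0f} to {WATER_ML / PROMPTS_LOW:.0f} ml of water per prompt")
# prints "Roughly 10 to 50 ml of water per prompt"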

As of August, more than 200 million people were using OpenAI’s popular chatbot ChatGPT every week, according to the company. That’s double the 100 million weekly active users OpenAI reported last November.

Environmental opposition
