DOE and NVIDIA Warn Energy Gaps Could Stall AI Progress
The U.S. Department of Energy and NVIDIA are building a pair of AI supercomputers that would dwarf every existing system on the planet. Yet top officials from both organizations warned this week that sluggish growth in electricity generation could hold back the very AI progress the machines are designed to accelerate.
Energy Secretary Chris Wright and NVIDIA Vice President of Hyperscale and High-Performance Computing Ian Buck laid out that case during a fireside chat at the SCSP AI+ Expo on May 7. The conversation centered on the DOE Genesis Mission, a federal initiative that pairs the department's 17 national laboratories with NVIDIA's full computing stack to apply AI directly to scientific research.
Two Supercomputers, One Unprecedented Scale
The partnership's most tangible output sits at Argonne National Laboratory, where two systems are taking shape. The first, called Equinox, is currently being assembled with 10,000 NVIDIA Grace Blackwell GPUs. According to Buck, these are the same chips, running the same software stack, that power commercial AI development today.
The second system, Solstice, will pack 100,000 next-generation GPUs built on the NVIDIA Vera Rubin architecture. Buck told the audience that Solstice would deliver roughly 5,000 exaflops of computing power, a figure he said is five times larger than the combined capacity of every machine on the current TOP500 supercomputer ranking.
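Buck's comparison can be sanity-checked with simple arithmetic: if 5,000 exaflops is five times the combined capacity of the TOP500 list, that implies roughly 1,000 exaflops across the entire current ranking. One caveat worth noting: exaflop figures for AI systems are typically quoted at lower numerical precision than the 64-bit arithmetic used for the TOP500 benchmark, so such comparisons are approximate. A minimal sketch using the article's figures (variable names are illustrative):

```python
# Implied size of the current TOP500 list, from Buck's stated figures.
solstice_exaflops = 5_000   # Solstice's projected computing power
multiple_of_top500 = 5      # "five times larger than the combined capacity"

implied_top500_total = solstice_exaflops / multiple_of_top500
print(implied_top500_total)  # → 1000.0 exaflops implied for the whole list
```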
"We're creating all the same technology, all the same hardware, all the same software building blocks used by all the major AI labs around the world," Buck said, "for all of world science to go get access to."
One early example of that access: an open-source NVIDIA AI model trained on 1.5 million physics papers and then further specialized on 100,000 fusion-specific studies. According to the NVIDIA blog, DOE researchers can query that model as a specialized AI agent to speed up their own investigations.
Why Electricity Is the Bottleneck
Wright framed AI leadership as inseparable from energy leadership. Over the past two decades, the U.S. has tripled oil output and doubled natural gas production, he noted, yet electricity generation has barely grown. For AI workloads that run entirely on electric power, that gap poses a direct constraint.
"We have to fix this bureaucratic and complex electricity grid so that it can grow fast," Wright said. "If we don't do that, we're going to slow down AI."
His department is pushing on multiple fronts. Wright said three small modular nuclear reactors will reach criticality by July 4 of this year, with larger reactors and additional SMRs planned after that. The DOE has also created a strategic fusion office, and Wright credited AI-driven computing with accelerating that research significantly.
Buck pointed to hardware-level efficiency as part of the solution. The jump from NVIDIA's Hopper generation to Blackwell delivered a 30x increase in raw performance and a 25x gain in performance per watt, according to his remarks at the event.
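Those two figures together imply how much more power the newer generation draws for a given workload: since performance per watt is performance divided by power, a 30x performance gain at 25x the efficiency means power consumption rose only modestly. A back-of-the-envelope sketch, assuming Buck's numbers as stated (the variable names are illustrative):

```python
# Implied power increase from Hopper to Blackwell, per Buck's remarks.
perf_gain = 30.0           # raw performance: Blackwell vs. Hopper
perf_per_watt_gain = 25.0  # energy efficiency: Blackwell vs. Hopper

# perf_per_watt = perf / power, so power scales as perf / perf_per_watt.
power_gain = perf_gain / perf_per_watt_gain
print(power_gain)  # → 1.2, i.e. ~20% more power for 30x the performance
```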
Accelerating the Grid Itself
Both speakers highlighted a feedback loop: AI needs energy to run, but it can also help build energy infrastructure faster. Wright specifically cited grid interconnection studies, approval processes that currently stretch across years, as a target for AI acceleration.
"With AI, we're going to take something that was years long and make it weeks or hours," Wright said.
Asked what tangible results the Genesis Mission should produce within 12 months, Wright pointed to fusion research, advanced materials and grid modernization. The full details of the fireside chat are available on the NVIDIA blog.