Power Magazine

The Genesis Mission: How AI Supercomputing Is About to Reshape American Science and Energy

Dr. Dario Gil, the Department of Energy’s (DOE’s) Under Secretary for Science, lays out a bold vision to double the productivity of U.S. research and development (R&D) within a decade—and explains why energy and artificial intelligence (AI) are two sides of the same coin.

After 22 years at IBM, where he rose to senior vice president and director of IBM Research, Dr. Dario Gil now leads one of the most ambitious science and technology initiatives in a generation. As director of the Genesis Mission, Gil is orchestrating a convergence of high-performance computing, AI, and quantum computing aimed at fundamentally transforming how the nation does science and engineering.

As a guest on The POWER Podcast, Gil explained what the Genesis Mission is, how it works, and why its implications extend from fusion reactors to the Texas power grid. Here are the key takeaways.

A Simple Thesis with Enormous Ambition

The Genesis Mission rests on a straightforward premise: a computing revolution is underway, and the U.S. should harness it to double the productivity and impact of its trillion-dollar-a-year R&D engine within a decade. Launched by President Trump shortly before Thanksgiving last year and chartered through the DOE, the initiative is built on three pillars.

The first is a platform for accelerating discovery, anchored in what Gil calls “the triad” of high-precision, high-performance computing; AI supercomputing; and quantum computing. An agentic AI framework layered on top of this infrastructure will allow scientists to execute complex research workflows at speeds that were previously unimaginable.

The second pillar is a portfolio of national challenges—real-world problems in energy, physical sciences, and national security that serve as proving grounds for the new AI-assisted methodology.

The third is a university engagement effort to rethink how future engineers, physicists, and scientists are educated in the age of AI.

Fusion Energy: From 50 Years of Data to Surrogate Models Running 10,000x Faster

Gil offered fusion energy as a prime example of how AI can compress timelines. For decades, the fusion research community has built exquisite experimental datasets and developed high-performance computing simulation codes that closely match real-world observations. The problem is that those simulations are computationally expensive—some take days, weeks, or even months to run at the desired level of fidelity.

Enter surrogate models. By training neural networks on the output of those validated simulations, researchers can produce AI-based models that issue predictions thousands to tens of thousands of times faster. The practical consequence is transformative: engineers can now iterate on fusion reactor designs, exploring different configurations, materials, and operating parameters in hours or minutes rather than in days, weeks, or months.
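The workflow can be sketched in a few lines. This is a toy illustration only: a slow stand-in function plays the role of a validated physics code, and a simple polynomial least-squares fit stands in for the neural networks Gil describes.

```python
import time
import numpy as np

def expensive_simulation(x):
    # Stand-in for a validated physics code: deliberately slow per run.
    time.sleep(0.001)  # pretend each run costs real compute
    return np.sin(3 * x) + 0.5 * x**2

# 1. Run the "simulator" on a modest set of training points.
X_train = np.linspace(-2, 2, 200)
y_train = np.array([expensive_simulation(x) for x in X_train])

# 2. Fit a cheap surrogate to the simulator's input/output pairs.
#    (A polynomial here; in practice this would be a neural network.)
surrogate = np.poly1d(np.polyfit(X_train, y_train, deg=11))

# 3. Query the surrogate at thousands of new design points,
#    far faster than rerunning the simulator at each one.
X_query = np.linspace(-2, 2, 10_000)
y_fast = surrogate(X_query)

# Spot-check fidelity against the simulator at a held-out point.
print(abs(surrogate(1.234) - expensive_simulation(1.234)))
```

The speedup comes from step 3: once the surrogate is trained, each evaluation is a cheap function call, so sweeping a large design space no longer pays the simulator's per-run cost.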

Beyond design, AI is also being applied to real-time plasma control. Gil pointed to collaborative work involving Google DeepMind and Commonwealth Fusion Systems, among others, where AI optimizes the operating parameters of reactor plasmas to improve stability and power output.

The Grid: Completing 20 Years’ Worth of Simulations in Two Months

Some of the most immediately practical applications of the Genesis Mission involve the nation’s electrical grid. Gil shared two striking examples.

The first concerns interconnection queues. According to grid operators, 80% to 90% of interconnection applications submitted by developers are deficient. The DOE’s Office of Electricity is developing an agentic AI framework that helps applicants identify and correct errors before submission, potentially allowing interconnection studies to begin up to a year sooner than they otherwise would.
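The core idea is automated pre-submission validation. The sketch below is purely illustrative: the field names and rules are hypothetical, not the DOE tool's actual schema or checks.

```python
# Hypothetical pre-submission checks on an interconnection application.
# Field names and rules are illustrative, not the DOE framework's schema.
REQUIRED_FIELDS = {
    "project_name",
    "capacity_mw",
    "point_of_interconnection",
    "in_service_date",
}

def validate_application(app: dict) -> list[str]:
    """Return a list of deficiencies found in a draft application."""
    issues = [f"missing field: {f}" for f in sorted(REQUIRED_FIELDS - app.keys())]
    cap = app.get("capacity_mw")
    if cap is not None and (not isinstance(cap, (int, float)) or cap <= 0):
        issues.append("capacity_mw must be a positive number")
    return issues

# A developer fixes flagged issues before submitting, rather than
# waiting for the grid operator to reject the application months later.
draft = {"project_name": "Solar Farm A", "capacity_mw": -50}
for issue in validate_application(draft):
    print(issue)
```

An agentic version would go further, not just flagging deficiencies but proposing corrections, but the payoff is the same: applications enter the study queue clean.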

The second example involves grid expansion planning. Brookhaven National Laboratory is building an AI emulator called Grid FM that can accelerate power flow calculations by a factor of 100. Gil described a scenario involving the Texas transmission grid: 2,000 nodes, more than 1,000 potential connection points, 4,000 contingencies, and 10 different 24-hour load scenarios at five-minute increments—a problem that adds up to roughly 10 billion power flow simulations. Using conventional methods, that analysis would take 20 years. With Grid FM, the team expects to complete it in two months.
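The quoted figures do multiply out to roughly 10 billion, assuming one power flow solve per combination of connection point, contingency, load scenario, and five-minute interval (our reading of how the numbers combine, not a stated formula):

```python
# Back-of-the-envelope accounting for the Texas grid scenario Gil described.
connection_points = 1_000   # candidate connection points
contingencies = 4_000       # contingency cases per configuration
scenarios = 10              # distinct 24-hour load scenarios
intervals = 24 * 60 // 5    # five-minute increments in 24 hours = 288

solves = connection_points * contingencies * scenarios * intervals
print(f"{solves:,} power flow solves")  # ~11.5 billion, i.e. roughly 10 billion

# And the claimed 100x speedup from Grid FM compresses the timeline:
months = 20 * 12 / 100  # 20 years of conventional computation / 100
print(f"{months} months")  # on the order of the two months cited
```

The arithmetic also shows why a 100x emulator matters more than incremental solver tuning: at this combinatorial scale, only an order-of-magnitude change makes the study feasible at all.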

The Energy-AI Paradox

Gil was candid about the tension at the heart of the current moment. AI is simultaneously one of the most powerful tools for solving energy challenges and one of the largest new sources of electricity demand. The scale has shifted dramatically: where DOE supercomputers once consumed 30 MW to 50 MW, today’s planned AI data centers are measured in gigawatts—with some projects reaching 10 GW.

The path forward, as Gil sees it, involves pursuing multiple strategies in parallel: optimizing the existing grid, adding firm generation capacity, enabling behind-the-meter generation for data centers, accelerating a nuclear energy renaissance, and investing in fusion for the longer horizon.

On the AI side, he emphasized the enormous room for efficiency gains. The human brain, he noted, manages remarkable feats of intelligence while dissipating roughly 20 W—about what a small light bulb consumes. Current graphics processing unit (GPU)–based systems operate at orders of magnitude higher power consumption for comparable tasks. That gap, he argued, signals a long runway for architectural innovation in AI hardware.

New Supercomputers Are on the Way

The computing infrastructure to support the Genesis Mission is already being built. Through the Genesis Consortium, a partnership of 27 companies including Nvidia, Oracle, AMD, and HPE, the DOE is standing up significant new AI supercomputing clusters at two national laboratories.

At Argonne National Laboratory in Illinois, Nvidia and Oracle are deploying a system with approximately 10,000 state-of-the-art GPUs, expected to be operational this year. At Oak Ridge National Laboratory in Tennessee, AMD and HPE are building a comparably sized cluster, also targeting 2026 operations. Looking further ahead, a 100,000-GPU cluster is planned for Argonne in 2027, which would be the largest science-oriented cluster in the world.

These machines will serve a dual purpose: training surrogate models from the DOE’s vast trove of scientific data, and customizing frontier AI models specifically for science—getting AI, as Gil put it, to do “a great job also with physics and chemistry and materials and biology and engineering.”

Public-Private Partnerships Built on Complementary Strengths

Gil described the Genesis Consortium’s philosophy as a straightforward pitch to each stakeholder group: federal agencies, state governments, the private sector, universities, and philanthropies. The DOE asks each the same question: Do you believe this computing revolution will transform science and engineering? If so, co-invest and bring your strengths.

The response, he said, has been strong. Beyond the large technology companies, startups focused on AI for science, such as Periodic Labs, Radical AI, and the Jeff Bezos–backed Prometheus Project, have joined the effort. The alignment works because each party brings something the others lack. National laboratories contribute domain expertise, unique scientific datasets, and one-of-a-kind facilities like particle accelerators, X-ray sources, and telescopes—assets the private sector simply cannot replicate. Industry brings frontier AI models, computational scale, and speed. Universities contribute foundational research and the next generation of talent.

What Does Success Look Like? ‘50 to 100 AlphaFold Examples’

When asked how the mission will know it has succeeded—given that, unlike the Manhattan Project or Apollo, it doesn’t have a single binary goal—Gil anchored his answer in the story of AlphaFold. To set the stage, Gil noted that in 1971 Brookhaven National Laboratory began cataloging three-dimensional protein structures. After 50 years of painstaking experimental and computational work, the Protein Data Bank held 200,000 structures. Then, he said, AlphaFold, an AI system developed by Google DeepMind, trained on that dataset and predicted the structures of 200 million proteins in just two years.

Success for the Genesis Mission, Gil said, will mean producing 50 to 100 comparable breakthroughs across all domains of science within three to five years. It will mean leaving behind a durable platform of AI supercomputers and next-generation quantum computers available to the scientific community. It will mean AI that is demonstrably excellent at physics, chemistry, materials science, and engineering, not just language and code. And it will mean a generation of graduates who are fluent in both their scientific discipline and the AI tools that augment it.

Gil closed with an analogy. In the 1970s, connecting a few computers with a new protocol called TCP/IP may have sounded somewhat inconsequential to the general public. However, what was actually being built was the internet, a platform that has transformed the world. The Genesis Mission, he suggested, is building something like “an internet of science.” It’s an intelligence layer connecting all the scientific instruments, laboratories, and universities into a seamless ecosystem for discovery.

A Sense of Mission

Asked about the biggest difference between running IBM Research and leading a government initiative spanning 17 national labs and 40,000 scientists, Gil didn’t hesitate. His favorite part of the DOE, he said, is the sense of mission. Everyone he encounters—federal employees and lab partners alike—is driven by the department’s purpose: delivering affordable, reliable, and secure energy; advancing discovery in the physical sciences; and supporting national security.

“I don’t encounter anybody that approaches their work with cynicism or pessimism,” he said. “They are passionate about solving these missions on behalf of their fellow citizens.”

His parting words to the audience were characteristically direct: “We ain’t seen nothing yet.”

To hear the full interview with Gil, listen to The POWER Podcast on your favorite podcast platform.

For more power podcasts, visit The POWER Podcast archives.

Aaron Larson is POWER’s executive editor (@AaronL_Power, @POWERmagazine).