The parable of "The Blind Men and the Elephant" tells of a group of blind men encountering an elephant for the first time and trying to imagine what it looks like purely from touch. Each man feels just one part - the trunk, the ears, a tusk - and attempts to describe it, but none of them can fully comprehend the immensity and complexity of the animal based on their individual knowledge.
Designing and building a fusion reactor works in much the same way, only instead of tusks, ears and trunks, fusion researchers and engineers are dealing with massive magnetic coils, incredibly powerful heating and cooling systems, and plasma hotter than the core of the sun.
"Fusion is such an integrated, multidisciplinary problem, so you have to take it piecewise," said Cami Collins, interim head of the Burning Plasma Foundations Section in the Fusion Energy Division at the Department of Energy's Oak Ridge National Laboratory.
Research institutions across the country and around the world are tackling the fusion problem from the inside out, from modeling the plasma within a reactor to engineering the materials, components and subsystems responsible for creating and capturing the energy released by a fusion reaction.
These individual projects and facilities are helping to shrink the knowledge gap and bring fusion energy closer to reality, but they can only go so far. Certain questions can only be answered by assembling a full-scale fusion pilot plant; however, such plants are expensive, take a long time to build and are not guaranteed to operate as intended given their experimental nature.
"The way that the U.S. fusion program is structured, it's really hard to build just one single facility that's going to do it all, so we're approaching this by building individual facilities that tackle different systems of the fusion pilot plant," Collins said.
Modeling and simulation can attempt to "glue" these pieces together, but even then, many of the models used today were developed in isolation over time and cannot fully simulate the size and interconnected nature of a complete fusion device. The holy grail for engineers and reactor designers is a digital twin - a virtual doppelganger of a reactor that simulates every component and subsystem quickly and accurately - that can guide the design and operation of a future fusion pilot plant.
A multidisciplinary and multi-institution team, led by ORNL, is attempting to create this digital twin through the Fusion REactor Design and Assessment, or FREDA, project. FREDA is a unified, modular framework for designing and simulating whole fusion facilities faster and more thoroughly than ever before. The framework will combine the best modeling and simulation codes, advanced machine learning methods and high-performance computing to optimize and accelerate the fusion power plant design process.
Collins, who leads the team, said FREDA will be the first fusion modeling tool to connect plasma and engineering models in a self-consistent, modular fashion to perform multiphysics, multi-fidelity analyses of reactor designs using scaled high-performance computational resources. The tool will act as an "umbrella" of sorts, bringing existing simulation codes into a unified framework so they can easily communicate with one another and be swapped in and out as needed to perform simulations at different scales and resolutions.
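To make the "umbrella" idea concrete, here is a minimal sketch of how simulation codes can sit behind a shared interface and be swapped freely. All of the class names, parameters and numbers are hypothetical illustrations, not FREDA's actual API.

```python
# A sketch of swappable physics codes behind one interface: the surrounding
# workflow never needs to know which code is running. Names are hypothetical.
from abc import ABC, abstractmethod


class PlasmaModel(ABC):
    """Common interface a plasma code must satisfy to plug into the framework."""

    @abstractmethod
    def run(self, design: dict) -> dict:
        """Take design parameters, return the plasma quantities other modules need."""


class FastScalingModel(PlasmaModel):
    """Cheap zero-dimensional estimate, useful for early design iterations."""

    def run(self, design: dict) -> dict:
        # Invented scaling: fusion power grows with plasma volume.
        return {"fusion_power_MW": 0.5 * design["plasma_volume_m3"]}


class HighFidelityModel(PlasmaModel):
    """Stand-in for an expensive transport code run on HPC resources."""

    def run(self, design: dict) -> dict:
        # In a real framework this would dispatch a large simulation job;
        # here it just returns a placeholder result.
        return {"fusion_power_MW": 410.0}


def evaluate(design: dict, plasma: PlasmaModel) -> dict:
    """The workflow sees only the interface, so codes can be swapped freely."""
    return plasma.run(design)


# The same design point evaluated at two different fidelities:
print(evaluate({"plasma_volume_m3": 800.0}, FastScalingModel()))
print(evaluate({"plasma_volume_m3": 800.0}, HighFidelityModel()))
```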
The team is composed of fusion scientists from both the plasma physics and engineering sides, as well as nuclear fission engineers and domain experts in mathematics and computational science, all of whom are taking advantage of the investments already made in their respective communities and bringing them together. The team is building on the work of many wise people who came before them, Collins said, and leaning on the fission community's experience with multiphysics engineering simulations to help fill in the modeling gaps.
Kate Borowiec, a system and data analytics engineer in ORNL's Nuclear Energy and Fuel Cycle Division, is working on adapting fission codes for fusion applications. Though the conditions inside a fusion reactor are much more extreme than those inside modern fission reactors, she said, there is still significant overlap between the two.
"The models of the systems outside of the plasma, like structural mechanics, computational fluid dynamics, and neutronics analysis, can be quite similar between fusion and fission," Borowiec said. "We can take all the important knowledge we've gained from existing fission systems that have been operating for decades and apply it to fusion systems."
Having all these models under one umbrella framework has three main benefits. First, it allows model developers to identify issues with different models that would not be apparent when developing them in isolation. Second, the comprehensive simulations generated by the tool will help identify the meaningful experiments to run at real-world test facilities - experiments that will provide useful data and, in turn, help further improve the simulations.
The third benefit is that it will help assess the safety of future devices before they are built. The first fusion plant will be unknown territory, so being able to examine every aspect of the device, such as plasma temperature, radiation transport and mechanical stress on the magnetic coils, will help ensure the device operates safely and reliably for its lifetime.
Together, these capabilities are meant to accelerate the reactor design process, a time-, labor- and computationally intensive effort that has slowed fusion reactor development. Traditional reactor design studies start with a single design point, such as a plasma geometry or blanket concept, and rely on humans to draft custom computer-aided design, or CAD, geometry, which can take weeks or months to complete. The design's feasibility is then tested using high-fidelity simulations, and if the design is not feasible or the components are incompatible, it goes back to the engineers for modification and the process repeats.
Instead of going through this laborious process - called conceptual design - every time a design point needs to be tweaked, the FREDA framework will rely on parametric design, in which aspects of the CAD are represented by variables describing the elements within the reactor, like the thickness of the walls, the component materials or the shape of the plasma. This approach catches issues early in the design process and lets engineers address a problem area quickly by adjusting a few parameters rather than manually changing the design within the software. It will also allow designers to use lower-fidelity models when testing a new design point, saving time and computing power from the start.
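As a rough illustration of the parametric idea, the sketch below drives a simple geometric feasibility check from a few named design variables. The parameter names and the fit rule are invented for this example and are not taken from FREDA.

```python
# A minimal sketch of parametric design: the geometry is driven by named
# variables rather than hand-drawn CAD. All parameters and the feasibility
# rule below are hypothetical, chosen only to show the pattern.
from dataclasses import dataclass


@dataclass
class ReactorParams:
    major_radius_m: float       # distance from the torus center to the plasma center
    minor_radius_m: float       # radius of the plasma cross section
    wall_thickness_m: float     # first-wall thickness
    blanket_thickness_m: float  # breeding-blanket thickness


def inboard_build_fits(p: ReactorParams) -> bool:
    """Cheap early check: the inboard radial build must fit inside the major radius."""
    radial_build = p.minor_radius_m + p.wall_thickness_m + p.blanket_thickness_m
    return radial_build < p.major_radius_m


# Tweaking one variable re-runs the check instantly; no manual CAD edits needed.
base = ReactorParams(major_radius_m=6.2, minor_radius_m=2.0,
                     wall_thickness_m=0.05, blanket_thickness_m=1.0)
print(inboard_build_fits(base))  # True: a 3.05 m build inside a 6.2 m radius
```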
"The modularity of it allows us to plug in different workflows or types of models," Collins said. "That way, you can use the appropriate fidelity level at the right time and catch issues early, so you're not doing a long timescale, expensive simulation when all you needed was a simple calculation to catch it."
FREDA will also enable engineers and designers to optimize individual components or subsystems within a reactor thanks to an optimization algorithm powered by machine learning and artificial intelligence on high-performance computers. Users can indicate which parameters they want to optimize, and FREDA will rapidly test many cases, adjusting the design's parameters until it finds the configuration with the lowest complexity and cost.
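The shape of that loop can be sketched with a much simpler stand-in. The random search and invented cost model below only illustrate the parameter-adjusting iteration; FREDA's actual optimizer is machine-learning driven and runs on high-performance computers.

```python
# A toy version of the optimize loop: propose candidate parameters, score them
# with a cost function, keep the best. The cost model is invented for this sketch.
import random

random.seed(0)


def cost(wall_m: float, blanket_m: float) -> float:
    # Invented trade-off: thicker components cost more, but a blanket that is
    # too thin is heavily penalized because it could not breed enough tritium.
    penalty = 100.0 if blanket_m < 0.8 else 0.0
    return 50.0 * wall_m + 20.0 * blanket_m + penalty


best = None
for _ in range(10_000):
    candidate = (random.uniform(0.01, 0.10), random.uniform(0.5, 1.5))
    if best is None or cost(*candidate) < cost(*best):
        best = candidate

print(f"wall = {best[0]:.3f} m, blanket = {best[1]:.3f} m, cost = {cost(*best):.1f}")
```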
Ultimately, the goal of FREDA is to create a tool for research organizations and private companies that want to design and build fusion devices for power generation or other scientific purposes. These groups want to make sure that their investments will pay off and the device will work, so ensuring FREDA reflects real-world conditions is critical to its success. J.M. Park, an ORNL plasma physicist based at the DIII-D National Fusion Facility in San Diego, is tasked with making sure that happens.
"Integrated modeling for the fusion community has a long, long history, but just now our capabilities are evolving very rapidly, and we have made enough progress to actually apply them to reactor design," Park said.
One of his roles in the project is to connect FREDA's integrated modeling to real-world applications to validate the models and use that information to estimate the uncertainty of the tool's predictions. Already, FREDA has been used to test upgrades to the heating system in the DIII-D tokamak and is currently helping to explore new wall materials. These individual applications may be small compared to simulating an entire reactor, but the data they produce is vital for further improving the models.
"This is an iterative process," Park said. "We test our model against experimental data, design a new experiment based on the updated model, then validate again, improve again."
The road ahead
FREDA is one year into a five-year project, with much of 2024 spent tackling two of the biggest obstacles to creating a unified modeling framework: integration and uncertainty quantification.
Coupling plasma and engineering simulations is no easy feat, as these individual modeling tools were developed over the course of decades by different teams at institutions all over the world. There are differences in complexity, fidelity and physics across models, like pieces cobbled together from different puzzles. Rick Archibald, leader of ORNL's Data Analysis and Machine Learning group and data analytics lead for the FASTMath Institute, is one of the computational scientists on the FREDA project responsible for making sure these pieces fit together and the models are consistent across the framework.
When a model-builder wants to examine the activity on a surface or in a volume of plasma, they must generate a mesh - a grid of thousands to millions of shapes such as squares, triangles or hexagons - that conforms to the geometry of the subject, and then simulate the physics that occur within each of those shapes. There are always trade-offs when constructing a mesh, though: finer meshes conform more closely to the geometry and provide more granularity, but they are more complex and require more time and computational power to run.
"The simplest way to generate a mesh would just be to make it a uniform mesh and make it as dense as possible, but that would take a lot of computer time," Archibald said. "You want to be a little more clever than that, so you can design your mesh around some properties you know about your simulation, so you have better resolution in areas where you want it and a little bit less resolution in places you don't."
This problem is then compounded when you try to combine the models and enmesh the meshes, as many of these models were not developed to be combined with models of different resolutions. Typically, these efforts live in silos: a plasma expert designs a model assuming they know how a wall is going to behave, while a materials scientist modeling a wall makes assumptions about how the plasma will behave, Archibald said.
"The trick here is what happens when you run them together and replace those assumptions with the actual behaviors," he added. "All of a sudden, those nice solid assumptions get mixed up, and the errors in one confect the errors in the other, and you get all sorts of problems. When you talk about building a whole device, connecting multiple components together and having them all talk to each other, that little problem gets magnified a lot and becomes the key focus of this project."
Grid generation is also a time-intensive process that requires considerable manpower and expertise in both modeling and physics. FREDA will accelerate and automate the entire process using high-performance computing, machine learning and the wealth of resources produced by decades of fusion modeling research. The tool will use machine learning methods to analyze the meshes that have already been made for different reactor designs and employ artificial intelligence to generate new meshes optimized around a desired design point, without human intervention.
"Machine learning is a really fast-paced field, and the new tools that are coming out of that field have helped us do things that we may not have been able to do in the past," Archibald said. "Machine learning methods will look at what scientists in the community have generated in these situations and pull them together so that everything you need to run a simulation can be generated and given to you automatically with no person in the loop."
The other main hurdle to clear is uncertainty quantification. There are many sources of error and uncertainty, depending on the model. Without a full fusion device to validate against, a model can't fully reflect reality, but if you are able to quantify just how much your model is off by, you can design with a certain degree of confidence.
"There are a lot of assumptions we make within the analysis, but if you want to build a device, you have to have a starting point," Borowiec said. "Even though the results you get might be uncertain, you can still rely on them more than just hoping for the best."
When modeling a design space, designers want to avoid the "cliff's edge" - optimizing a device as far as possible, so that if even one parameter is off, it will no longer work. Instead, they should use "design under uncertainty," Borowiec said, in which the uncertainties of a model, once quantified, are incorporated into a less optimized but safer design. That way, even if the assumptions are off by 5 to 10%, the device will still work.
"If you have uncertainty in the heat fluxes on the first wall of your device and the melting point of that material is 1,000 Kelvin, you don't want to design for that temperature because the heat may exceed that," she said. "Instead, you design it for 800 Kelvin so you have some wiggle room, and it will still be okay if the actual temperature exceeds the one you had planned."
The uncertainty quantification process is ongoing, and as new fusion devices and test stands are built, the experimental data they produce will help further improve the models and make FREDA as accurate and reliable as possible.
"It takes a village."
Given the scale of what FREDA is trying to accomplish, it comes as no surprise that the team behind it is equally large and diverse. Researchers from ORNL's Fusion Energy, Nuclear Energy and Fuel Cycle, and Computer Science and Mathematics divisions all contribute to the team, as do personnel from Lawrence Livermore National Laboratory, General Atomics, Sandia National Laboratories and the University of California San Diego.
ORNL is especially suited to lead such an endeavor given its breadth of expertise and unique capabilities in fusion, its strong history in fission, its high-performance computing resources and its advanced materials program. The cross-cutting nature of the lab was one of the reasons Collins came to ORNL in the first place.
"It really takes a village. The nature of this is much different than a lot of other projects that are more focused on just the plasma or subcomponents," she said. "It brings together these different communities of people, generates diverse thought and helps you to better focus on how you communicate your goals because we have all these tasks running in parallel."
Park echoed the sentiment, noting the unusual nature of the collaboration and the mingling of the plasma and engineering modeling communities.
"Usually, the people in plasma modeling have the same background, but the engineering and computational sides come from lots of different disciplines, so developing the environment and combining the knowledge bases is a really important aspect," he said. "I don't think any other lab in the world can do this."
A tool such as FREDA is also vital if the United States wants to accelerate the timelines that have been set to deliver fusion power and meet the growing demands for green energy. Given the time, expense and effort it takes to build a system from scratch, it is not feasible to build and iterate on new facilities until they work. Instead, FREDA aims to combine the hard work done by the community with the best tools available to rapidly design, iterate, and automate the creation of the next generation of fusion devices.
"This is an extremely important project because if we want to make progress, we have to work really quickly," Borowiec said. "This software is attempting to do that, and if we can get everything to work, we will have a fully integrated and optimized design assessment tool ready for the public and private sectors to utilize."
FREDA is funded under the Department of Energy's Scientific Discovery through Advanced Computing, or SciDAC, Fusion Energy Sciences Partnerships program. SciDAC brings together all six Office of Science programs - Advanced Scientific Computing Research, Basic Energy Sciences, Biological and Environmental Research, Fusion Energy Sciences, High Energy Physics and Nuclear Physics - as well as the Office of Nuclear Energy to dramatically accelerate progress in scientific computing that delivers breakthrough scientific results.
The ORNL team includes Cami Collins, Rhea Barnett, Mark Cianciosa, Yashika Ghai, Ehab Hassan, J.M. Park, Phil Snyder and Gary Staebler from the Fusion Energy Division; Vittorio Badalassi, Jin Whan Bae, Kate Borowiec, Robert Lefebvre and Arpan Sircar from the Nuclear Energy and Fuel Cycle Division; and Rick Archibald, David Bernholdt, Wael Elwasif and Ana Gainaru from the Computer Science and Mathematics Division. Other institutional team members are Benjamin Dudson and Jerome Solberg from Lawrence Livermore National Laboratory, Jeff Candy and Orso Meneghini from General Atomics, Michael Eldred from Sandia National Laboratories, and Christopher Holland from the University of California San Diego.
UT-Battelle manages ORNL for the Department of Energy's Office of Science, the single largest supporter of basic research in the physical sciences in the United States. The Office of Science is working to address some of the most pressing challenges of our time. For more information, please visit energy.gov/science. -- Sean Simoneau