The Aurora supercomputer at the U.S. Department of Energy’s Argonne National Laboratory is transforming aircraft design through advanced simulation and artificial intelligence techniques. As one of the first exascale supercomputers, Aurora can perform over a quintillion calculations per second. This capability allows researchers to explore innovative designs for more efficient airplanes, marking a significant advancement in aerospace engineering.
Research led by the University of Colorado Boulder utilizes Aurora’s immense computing power alongside sophisticated machine learning methods to analyze airflow around commercial aircraft. This research aims to enhance the design of next-generation airplanes by providing deeper insights into aerodynamic performance. According to Kenneth Jansen, a professor of aerospace engineering at the University of Colorado, “With Aurora, we’re able to run simulations that are larger than ever before and on more complex flows.” The simulations are crucial for improving predictive models, particularly in understanding the physics of airflow around aircraft components, such as vertical tails and rudder assemblies.
Aircraft designs have traditionally been sized for worst-case scenarios, such as takeoff in a crosswind or with an engine failure. As a result, vertical tails are often larger than routine flight requires, increasing drag and fuel consumption. Riccardo Balin, an assistant computational scientist at Argonne, explained, “The vertical tail on any standard plane is as large as it is precisely because it needs to be able to work effectively in such a situation. The rest of the time, however, that vertical tail is larger than would be necessary.” The team aims to use its understanding of airflow physics to design smaller, more efficient tails without compromising safety.
To achieve this, the research team employs Aurora to conduct large-scale fluid dynamics simulations using HONEE, an open-source solver designed for turbulent airflow modeling. These high-fidelity simulations produce training data essential for developing machine-learning-driven subgrid stress (SGS) models, which play a crucial role in turbulence predictions. Improved SGS models can significantly reduce simulation costs while maintaining accuracy, thus minimizing the need for expensive wind tunnel tests.
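As a rough illustration of what a machine-learning-driven SGS closure involves, the sketch below defines a small neural network that maps local filtered-flow features to the six independent subgrid stress components. The inputs, architecture, and training details here are generic placeholders, not the team’s actual model.

```python
# Hedged sketch: a generic neural-network subgrid-stress (SGS) closure.
# The team's actual inputs, architecture, and training setup are not
# described in the article; this only illustrates mapping local
# filtered-flow features to the six independent SGS stress components.
import torch
import torch.nn as nn

class SGSModel(nn.Module):
    def __init__(self, n_features: int = 9, n_hidden: int = 64):
        super().__init__()
        # Input: e.g., the 9 components of the filtered velocity gradient
        # at a grid point. Output: the 6 independent components of tau_ij.
        self.net = nn.Sequential(
            nn.Linear(n_features, n_hidden), nn.ReLU(),
            nn.Linear(n_hidden, n_hidden), nn.ReLU(),
            nn.Linear(n_hidden, 6),
        )

    def forward(self, grad_u: torch.Tensor) -> torch.Tensor:
        return self.net(grad_u)

# Training pairs (features, stresses) would come from filtering
# high-fidelity fields such as those a solver like HONEE produces.
model = SGSModel()
features = torch.randn(1024, 9)   # stand-in for filtered velocity gradients
targets = torch.randn(1024, 6)    # stand-in for exact SGS stresses
loss = nn.MSELoss()(model(features), targets)
loss.backward()
```

Once trained, such a closure would be evaluated inside a coarser, cheaper simulation in place of a traditional algebraic SGS model.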
“We work with airplane manufacturers to make sure that we study the problems most relevant to both current and future designs,” Jansen said. The team’s ambitious goals include enhancing the reliability of simulation tools and capitalizing on upcoming computing advancements to conduct larger, more realistic simulations.
Turbulence models have traditionally been developed offline, with vast datasets written out and analyzed long after the simulations that produced them. In contrast, the team is pioneering an “online” machine learning approach in which training data are generated and consumed while the simulation is still running, allowing analysis and data extraction to happen in real time. “Online machine learning refers to carrying out simulations that produce training data at the same time that the actual training task is carried out,” Kris Rowe, another assistant computational scientist at Argonne, noted.
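The sketch below shows the interleaving this implies, assuming an SGS model like the one above: each batch of data extracted from the advancing simulation is used for a training step immediately, so nothing has to be archived first. The solver interface shown is a placeholder, not HONEE’s.

```python
# Hedged sketch of the "online" idea: training data are consumed as the
# simulation advances, instead of being written to disk and mined later.
# simulation_steps() is a placeholder, not HONEE's actual interface.
import numpy as np
import torch
import torch.nn as nn

def simulation_steps(n_steps: int, n_samples: int = 256):
    """Placeholder for a solver advancing in time and emitting
    (features, sgs_stress) samples extracted from the current field."""
    for _ in range(n_steps):
        yield np.random.rand(n_samples, 9), np.random.rand(n_samples, 6)

model = nn.Sequential(nn.Linear(9, 64), nn.ReLU(), nn.Linear(64, 6))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

for features, stresses in simulation_steps(n_steps=100):
    # Each new batch is used immediately and can then be discarded,
    # so the full time history never has to be stored.
    x = torch.as_tensor(features, dtype=torch.float32)
    y = torch.as_tensor(stresses, dtype=torch.float32)
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(x), y)
    loss.backward()
    opt.step()
```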
This innovative methodology not only streamlines data usage but also enhances the team’s ability to uncover insights about turbulent airflow. By developing smarter models, the researchers can predict turbulence behavior under challenging conditions where conventional models may fall short. The new approach may also open avenues for testing real-time flow control techniques and evaluating the performance of smaller tail designs under extreme circumstances.
“Turbulence, as a very chaotic, nonlinear system, is itself highly complex,” Rowe stated. Identifying correlations between inputs and outputs in turbulence models is challenging. Consequently, successful turbulence modeling requires extensive physics knowledge and domain expertise. Generating high-quality turbulence data necessitates numerous high-fidelity simulations, which presents a significant challenge even for advanced systems like Aurora.
To facilitate this groundbreaking work, the team uses advanced computational tools, including the SmartSim library. This open-source software allows data to be streamed directly from simulations to machine learning models on the same compute node, eliminating time-consuming data transfers. Balin explained, “SmartSim effectively allows users to create a staging area where they can store data — say, from a simulation — without writing them to the parallel file system.” This collocated approach optimizes resource usage and addresses the massive data flows handled by Aurora.
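A minimal sketch of that staging pattern, using SmartSim’s SmartRedis client, might look like the following. The tensor names are made up, the staging database address is assumed to come from the standard SSDB environment variable, and the team’s actual coupling, driven from within the solver and launched through SmartSim’s workflow tooling, is considerably more involved.

```python
# Hedged sketch of the SmartSim/SmartRedis pattern: a simulation process
# puts tensors into an in-memory staging database, and a training process
# on the same node pulls them out, with no parallel file system I/O.
# Assumes a SmartSim-launched database is running and SSDB points at it.
import numpy as np
from smartredis import Client

# --- simulation side: publish a batch of training data -----------------
sim_client = Client(cluster=False)        # address read from SSDB env var
features = np.random.rand(256, 9)         # stand-in for filtered-flow inputs
sim_client.put_tensor("sgs_features_step0", features)

# --- training side: wait for the batch and consume it ------------------
train_client = Client(cluster=False)
if train_client.poll_tensor("sgs_features_step0", 100, 1000):
    batch = train_client.get_tensor("sgs_features_step0")
    # ...feed `batch` to the ML model being trained online...
```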
The team also employs PETSc, an Argonne-developed open-source toolkit, to perform the large-scale numerical calculations necessary for turbulent airflow simulations. By combining Aurora’s exascale capabilities with these innovative tools, the researchers are significantly altering how aircraft are designed and tested in a virtual environment. This advancement promises to accelerate the development of next-generation aircraft while decreasing the reliance on costly physical testing.
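For a sense of the kind of numerical kernel PETSc supplies in such workflows, the sketch below uses its petsc4py bindings to assemble a small sparse system and solve it with a Krylov method. The serial, toy 1-D problem stands in for the vastly larger systems that arise in turbulent-flow simulations and is not taken from the team’s code.

```python
# Hedged sketch: PETSc (via petsc4py) solving a sparse linear system with
# a Krylov method, the kind of kernel found inside implicit flow solvers.
# This is a serial toy problem (a 1-D Laplacian), not HONEE's actual usage.
from petsc4py import PETSc

n = 100
A = PETSc.Mat().createAIJ([n, n], nnz=3)   # tridiagonal sparse matrix
for i in range(n):
    A.setValue(i, i, 2.0)
    if i > 0:
        A.setValue(i, i - 1, -1.0)
    if i < n - 1:
        A.setValue(i, i + 1, -1.0)
A.assemble()

b = A.createVecRight()
b.set(1.0)                                  # simple right-hand side
x = A.createVecLeft()

ksp = PETSc.KSP().create()
ksp.setOperators(A)
ksp.setType("cg")                           # conjugate gradient solver
ksp.getPC().setType("jacobi")               # simple diagonal preconditioner
ksp.solve(b, x)
print("iterations:", ksp.getIterationNumber())
```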
The work conducted on the Aurora supercomputer is supported by the Argonne Leadership Computing Facility’s Aurora Early Science Program and the U.S. Department of Energy’s Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program. The ALCF plays a vital role in providing supercomputing resources to the scientific and engineering communities, facilitating fundamental discoveries and advancements across various disciplines.
Argonne National Laboratory, managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science, aims to tackle pressing national challenges in science and technology through innovative research. The Office of Science is the largest supporter of basic research in the physical sciences in the United States, addressing significant challenges of our time. For further details, visit the U.S. Department of Energy’s official website.
