Newswise – A U.S. mission to land astronauts on the surface of Mars will be unlike any other extraterrestrial landing ever conducted by NASA.

Although the space agency has successfully landed nine robotic missions on Mars since its first surface missions with the Viking Project in 1976, safely transporting people to Mars requires new technologies for flight through the Martian atmosphere. However, these technologies and systems cannot be comprehensively tested on Earth beforehand.

Since 2019, a team of NASA scientists and their partners has been using NASA's FUN3D software on supercomputers at the Department of Energy's Oak Ridge Leadership Computing Facility (OLCF) to perform computational fluid dynamics (CFD) simulations of a human-scale Mars lander. The OLCF is a DOE Office of Science user facility located at DOE's Oak Ridge National Laboratory.

The team’s ongoing research project is a first step in determining how to safely land a vehicle with humans on board on the surface of Mars.

“Naturally, we do not have any validation data for this. We can conduct valuable but limited testing in ground facilities such as wind tunnels or ballistic ranges, but such approaches cannot fully capture the physics found on Mars. We can’t conduct flight tests in the actual Martian environment – when we get there, it’s all or nothing. That’s why supercomputing is critical,” said Eric Nielsen, senior research scientist at NASA’s Langley Research Center and principal investigator of the five-year effort at the OLCF.

Unlike on recent Mars missions, parachutes will play no part in the operation. Instead, the leading candidate for landing humans on Mars is retropropulsion – firing forward-facing rockets built into the spacecraft’s heat shield to decelerate.

“We have never flown anything like this before. The fundamental question from the beginning was, ‘Can we safely control this vehicle?’” Nielsen said.

The reason NASA is studying retropropulsion instead of conventional parachutes is a matter of physics. Previous Mars landers weighed about 1 ton; a vehicle carrying astronauts and all of their life-support systems will weigh 20 to 50 times more and be roughly the size of a two-story house. Mars’ thin atmosphere – about 100 times less dense than Earth’s – would not allow a parachute landing for such a large spacecraft.
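The scaling problem can be illustrated with a quick terminal-velocity estimate. This is a rough sketch with representative textbook values – the canopy size, drag coefficient, and atmospheric density below are illustrative assumptions, not figures from the mission studies:

```python
import math

def terminal_velocity(mass_kg, g=3.71, rho=0.020, cd=0.7, area_m2=363.0):
    """Steady-state descent speed where drag balances weight:
    m*g = 0.5 * rho * v^2 * cd * A  =>  v = sqrt(2*m*g / (rho*cd*A)).
    Defaults: Mars surface gravity, a representative near-surface air
    density, and a Perseverance-class canopy (~21.5 m diameter)."""
    return math.sqrt(2.0 * mass_kg * g / (rho * cd * area_m2))

v_robotic = terminal_velocity(1_000)    # ~1-ton robotic lander
v_crewed = terminal_velocity(40_000)    # ~40-ton human-scale lander
```

Because terminal velocity scales with the square root of mass, the 40-fold-heavier vehicle descends more than six times faster under the same canopy – far too fast for a parachute landing in the thin Martian air.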

“With a conventional vehicle we fly through a very clean, predictable environment. All of that is lost with this concept because we will be traveling through an extremely dynamic environment consisting of high-energy rocket exhaust,” said NASA team member and CFD expert Gabriel Nastac.

With guidance from NASA mission planners, the team formulated a multiyear plan consisting of increasingly sophisticated simulations aimed at the key question of controllability.

In 2019, the team ran CFD simulations on the Summit supercomputer at resolutions of up to 10 billion elements to characterize static vehicle aerodynamics at expected throttle settings and flight speeds from Mach 2.5 to Mach 0.8, conditions that require the vehicle’s rocket engines for initial deceleration.
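For context, those Mach numbers can be converted to airspeeds using the speed of sound in a cold, CO2-dominated atmosphere. The temperature and gas properties below are typical textbook values, not parameters from the team's simulations:

```python
import math

# Representative Martian atmospheric values (assumptions for illustration):
GAMMA_CO2 = 1.29            # ratio of specific heats for CO2-dominated air
R_CO2 = 8.314 / 0.04401     # specific gas constant of CO2, J/(kg*K)
T_MARS = 210.0              # typical atmospheric temperature, K

# Speed of sound: a = sqrt(gamma * R * T), roughly 226 m/s -- much lower
# than the ~340 m/s familiar from Earth's warmer, nitrogen-rich air.
a = math.sqrt(GAMMA_CO2 * R_CO2 * T_MARS)

v_entry = 2.5 * a   # Mach 2.5: start of the powered-deceleration window
v_exit = 0.8 * a    # Mach 0.8: end of the simulated speed range
```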

During 2020, intensive code development work focused on porting FUN3D’s general reactive-gas capabilities to Summit’s graphics processing unit, or GPU, accelerators.

“Achieving efficient performance of an unstructured-grid CFD solver with complex, physics-laden kernels is an enormous challenge in a GPU-based computing environment. But ultimately we were able to restructure critical code segments to deliver the performance we wanted,” said NASA research computer scientist Aaron Walden, who leads the team’s multiarchitecture software development.
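One classic restructuring for unstructured-grid solvers on GPUs – sketched below as a general illustration, not FUN3D's actual code – replaces a face-based scatter, where parallel threads race to update shared cells, with a cell-based gather over precomputed adjacency, so each cell can be one independent thread:

```python
# Toy 1-D "unstructured" mesh: each face connects cells i and i+1.
n_cells = 5
faces = [(i, i + 1) for i in range(n_cells - 1)]
u = [float(i) for i in range(n_cells)]

def face_flux(ul, ur):
    return 0.5 * (ul + ur)   # placeholder numerical flux

# Face-based scatter: each face updates two cells. Parallelizing over
# faces would make two threads write the same cell (a data race on GPUs).
res_scatter = [0.0] * n_cells
for (l, r) in faces:
    f = face_flux(u[l], u[r])
    res_scatter[l] -= f
    res_scatter[r] += f

# Cell-based gather: precompute which faces touch each cell (and with
# what sign), then let every cell independently sum its own face fluxes.
cell_faces = [[] for _ in range(n_cells)]
for fid, (l, r) in enumerate(faces):
    cell_faces[l].append((fid, -1.0))
    cell_faces[r].append((fid, +1.0))

flux = [face_flux(u[l], u[r]) for (l, r) in faces]
res_gather = [sum(sign * flux[fid] for fid, sign in cf) for cf in cell_faces]
```

The gather form computes each flux once per face but accumulates it without write conflicts, which is the kind of trade a GPU port must weigh.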

The work paved the way for a major campaign in 2021 that allowed the team to address the complex interactions of liquid oxygen/methane rocket engines with the Martian atmosphere, which is primarily composed of carbon dioxide and nitrogen. Each simulation, run on 15,000 to 20,000 GPUs on Summit, produced a petabyte (1,000 terabytes) of output data and revealed critical differences in vehicle aerodynamics compared with those observed in the earlier simulations, which had used a perfect-gas assumption.
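A back-of-envelope estimate shows how quickly such simulations reach the petabyte scale. The variable count and snapshot layout below are assumptions for illustration, not the team's actual file format:

```python
# Assumed output layout: one double-precision value per flow variable
# per grid element, written once per solution snapshot.
elements = 10_000_000_000    # 10-billion-element grid, as in the 2019 runs
variables = 12               # e.g., density, momentum, energy + species (assumed)
bytes_per_value = 8          # double precision

snapshot_bytes = elements * variables * bytes_per_value   # ~1 TB per snapshot
petabyte = 10**15
snapshots_per_pb = petabyte / snapshot_bytes
```

Under these assumptions a single snapshot is close to a terabyte, so a petabyte corresponds to only around a thousand time snapshots of an unsteady flow.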

For the 2022 campaign, the team took a major step forward by integrating NASA’s cutting-edge flight mechanics software, known as Program to Optimize Simulated Trajectories II, or POST2, into the workflow. The team moved beyond simulations that assumed a static flight condition and attempted to “fly” the vehicle in the virtual supercomputing environment. This test would be a first attempt to quantify and address critical unsteady dynamics that would occur during an actual descent to the Martian surface.

The team enlisted key experts from Georgia Tech’s Aerospace Systems Design Laboratory, a group led by Brad Robertson. These experts had already spent several years developing a coupling algorithm to replace the low-order aerodynamic models in POST2 with physics-based, real-time FUN3D simulations, ultimately enabling high-fidelity trajectory simulations that leverage sophisticated flight control algorithms.

“Coupling FUN3D and POST2 was quite a challenge. We had to juggle five or six reference frames and the data transformations between them. But the reward was that we were able to take all of the hard work of other NASA engineers on detailed guidance, navigation, control and propulsion models and bring them all together into a single, unified multiphysics simulation,” said team member Zach Ernst of Georgia Tech. At the time, Ernst was a graduate student in engineering and worked with graduate student Hayden Dean on the project.
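The reference-frame juggling Ernst describes amounts to composing rotation transformations between frames. A minimal sketch with hypothetical Euler angles – it makes no claim to match the actual POST2/FUN3D frame conventions:

```python
import math

def rot_x(a):
    c, s = math.cos(a), math.sin(a)
    return [[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]]

def rot_y(a):
    c, s = math.cos(a), math.sin(a)
    return [[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]]

def rot_z(a):
    c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def apply(A, v):
    return [sum(A[i][k] * v[k] for k in range(3)) for i in range(3)]

# Body frame -> planet-fixed frame via a 3-2-1 (yaw-pitch-roll) sequence.
psi, theta, phi = 0.4, -0.1, 0.05     # hypothetical attitude angles, radians
R = matmul(rot_z(psi), matmul(rot_y(theta), rot_x(phi)))

thrust_body = [0.0, 0.0, 1.0]         # unit thrust along the body z-axis
thrust_planet = apply(R, thrust_body) # same vector, expressed in planet frame
```

Every quantity passed between the trajectory code and the flow solver – forces, moments, attitudes – must cross chains of transformations like this, and a sign or ordering mistake in any one of them corrupts the coupled simulation.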

The integration of POST2 presented an additional challenge. Because POST2 is subject to stricter export control regulations than FUN3D, team member Kevin Jacobson was tasked with developing a remote coupling paradigm in which POST2 would run at a NASA facility while communicating in real time with FUN3D running at scale at the OLCF. Establishing and maintaining this connection while dealing with firewalls, network disruptions, and job schedulers presented numerous challenges. This work required approximately a year of planning and coordination with cybersecurity personnel and system administrators at both facilities.
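In outline, such a remote coupling can be a lockstep exchange over a network socket: the trajectory side sends a control command, the CFD side returns aerodynamic loads, and both advance one coupling interval. The sketch below is a toy stand-in running on localhost – the message format and force model are invented for illustration and bear no relation to the actual POST2/FUN3D protocol:

```python
import json
import socket
import threading

def cfd_stub(server_sock, n_steps=3):
    """Stands in for the CFD side of the link: receive a control command,
    pretend to run a solver step, and send back aerodynamic loads."""
    conn, _ = server_sock.accept()
    with conn:
        for _ in range(n_steps):
            cmd = json.loads(conn.recv(4096).decode())
            reply = {"drag_n": 1.0e5 * cmd["throttle"]}   # toy force model
            conn.sendall((json.dumps(reply) + "\n").encode())

server = socket.socket()
server.bind(("127.0.0.1", 0))          # OS-assigned port on localhost
server.listen(1)
port = server.getsockname()[1]
threading.Thread(target=cfd_stub, args=(server,), daemon=True).start()

# Trajectory side: send a command, block until the loads arrive, advance,
# repeat -- one lockstep exchange per coupling interval.
results = []
with socket.create_connection(("127.0.0.1", port)) as s:
    reader = s.makefile("r")
    for throttle in (0.5, 0.8, 1.0):
        s.sendall(json.dumps({"throttle": throttle}).encode())
        results.append(json.loads(reader.readline()))
```

In the real setting the two ends sit at different institutions, so the simple `accept`/`connect` handshake above becomes the hard part: authenticated tunnels through firewalls, reconnection after network drops, and coordination with batch schedulers so both jobs are alive at the same time.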

The extra effort paid off when the team achieved their long-term goal of flying a significant portion of the descent phase in the virtual environment.

The arrival of the OLCF’s Frontier supercomputer couldn’t have come at a better time for the project. With exascale computing power (a quintillion or more calculations per second) now a reality, the team could afford to reintroduce the desired physical modeling and other insights gained over the course of the project. In 2023, the team turned its attention to the ultimate simulation it had envisioned years earlier: a truly autonomous, closed-loop test flight using the world’s most powerful supercomputing system.

While the eight main engines control pitch (up-and-down rotation) and yaw (side-to-side rotation) as the guidance system steers toward the intended landing zone, POST2 also issues commands instructing FUN3D to periodically fire four reaction control system (RCS) modules, located around the rear of the lander, to perform roll corrections in flight.
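A roll-correction loop of this kind can be illustrated with a simple bang-bang controller that fires a thruster pair whenever a combined attitude-and-rate error leaves a deadband. All gains, torques, and inertias below are hypothetical, not vehicle data:

```python
def rcs_roll_step(phi, omega, dt=0.1, torque=500.0, inertia=5.0e4,
                  deadband=0.02, k_rate=2.0):
    """One step of a toy bang-bang roll controller: fire whichever RCS
    pair opposes the combined attitude-plus-rate error (values hypothetical).
    phi: roll angle (rad), omega: roll rate (rad/s)."""
    error = phi + k_rate * omega      # phase-plane switching function
    alpha = 0.0                       # angular acceleration, rad/s^2
    if error > deadband:
        alpha = -torque / inertia     # fire the roll-negative pair
    elif error < -deadband:
        alpha = torque / inertia      # fire the roll-positive pair
    omega += alpha * dt
    phi += omega * dt
    return phi, omega

phi, omega = 0.2, 0.0                 # start with a 0.2-rad roll offset
for _ in range(600):                  # 60 seconds of simulated flight
    phi, omega = rcs_roll_step(phi, omega)
```

The rate term keeps the on-off thruster firings from overshooting, and the deadband keeps them from chattering continuously – the same trade-offs, in miniature, that a real RCS control law must manage.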

“These capabilities will be critical for assessing the controllability of future vehicles,” said Alex Hickey of Georgia Tech, who led the development of the RCS modeling.

The team’s long-term goal became a reality in late 2023 when OLCF staff helped coordinate a careful sequence of high-priority jobs run at scale on Frontier over a two-week period.

“For the first time we were able to return to the original question of safely controlling this type of vehicle in autonomous flight,” said Nielsen. “In a typical aerospace CFD simulation, you might calculate a second or two of physical time. Here, Frontier gave us a successful 35-second controlled flight, descending from an altitude of 8 kilometers (about 5 miles) to about 1 kilometer (0.6 miles) as the vehicle approached its landing phase.
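The quoted altitudes imply an average vertical descent rate of roughly 200 meters per second over the simulated segment – a rough average only, since the vehicle is also decelerating and translating downrange:

```python
# Average descent rate over the simulated flight segment (article figures):
alt_start_m, alt_end_m = 8_000.0, 1_000.0   # 8 km down to about 1 km
duration_s = 35.0                            # seconds of controlled flight
avg_descent_m_s = (alt_start_m - alt_end_m) / duration_s
```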

“The resolution, physical modeling and time duration exceed anything we could achieve with a traditional high-performance computing system,” Nielsen added. “The sheer speed of GPUs deployed at leadership scale is truly promising, and we are deeply grateful for the many opportunities and world-class expertise that the OLCF has provided.”

UT-Battelle manages ORNL for the DOE’s Office of Science, the single largest funder of basic research in the physical sciences in the United States. The Office of Science is working to address some of the most pressing challenges of our time. For more information, please visit energy.gov/science.
