Machine learning outpaces supercomputers in simulating galaxy evolution coupled with supernova explosions

Researchers have used machine learning to dramatically speed up simulations of galaxy evolution coupled with supernova explosions. This approach could help us understand the origins of our own galaxy and, in particular, of the elements essential for life in the Milky Way.
The findings are published in The Astrophysical Journal.
The team was led by Keiya Hirashima at the RIKEN Center for Interdisciplinary Theoretical and Mathematical Sciences (iTHEMS) in Japan, along with colleagues from the Max Planck Institute for Astrophysics (MPA) and the Flatiron Institute.
Understanding how galaxies form is a central problem for astrophysicists. Although we know that powerful events like supernovae can drive galaxy evolution, we cannot simply look to the night sky and see it happen.
Scientists instead rely on numerical simulations based on large amounts of data collected from telescopes and other instruments that measure aspects of interstellar space. Simulations must account for gravity and hydrodynamics, as well as complex astrophysical processes such as thermochemistry.
On top of this, they must have high temporal resolution, meaning that the time between each 3D snapshot of the evolving galaxy must be small enough that critical events are not missed. For example, capturing the initial phase of supernova shell expansion requires a timescale of mere hundreds of years, about 1,000 times finer than typical simulations of interstellar space can achieve.
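A rough back-of-the-envelope estimate shows where that figure comes from. The numbers below (a shock speed of about 1,000 km/s and a spatial resolution of 0.1 parsec) are illustrative assumptions, not values taken from the paper:

    # Rough estimate of the timestep needed to resolve early supernova
    # shell expansion. All values are illustrative assumptions.
    PC_IN_KM = 3.086e13   # kilometres per parsec
    YR_IN_S = 3.156e7     # seconds per year

    shock_speed_km_s = 1_000.0   # assumed speed of a young supernova shock
    cell_size_pc = 0.1           # assumed size of one resolution element

    # Time for the shock to cross one resolution element: dt = dx / v
    crossing_time_s = cell_size_pc * PC_IN_KM / shock_speed_km_s
    print(f"Required timestep: ~{crossing_time_s / YR_IN_S:,.0f} years")
    # -> ~98 years: hundreds of years, roughly 1,000 times shorter than
    #    the ~0.1-million-year steps of typical interstellar simulations.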
In fact, a typical supercomputer takes one to two years to carry out a simulation of a relatively small galaxy at the proper temporal resolution.
Overcoming this timestep bottleneck was the main goal of the new study. By incorporating AI into their data-driven model, the research group was able to match the output of a previously modeled dwarf galaxy while obtaining the result much more quickly.
"When we use our AI model, the simulation is about four times faster than a standard numerical simulation," says Hirashima.
"This corresponds to a reduction of several months to half a year's worth of computation time. Critically, our AI-assisted simulation was able to reproduce the dynamics important for capturing galaxy evolution and matter cycles, including star formation and galaxy outflows."
Like most machine learning models, the researchers' new model is trained on one set of data and then becomes able to predict outcomes from new data. In this case, the model incorporated a neural network and was trained on 300 simulations of an isolated supernova in a molecular cloud with a mass of one million suns.
After training, the model could predict the density, temperature, and 3D velocities of the gas 100,000 years after a supernova explosion. Compared with direct numerical simulations such as those performed by supercomputers, the new model yielded similar structures and star formation histories in a quarter of the computation time.
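The sketch below shows the general shape of such a surrogate in PyTorch: a small network trained to map the pre-explosion gas state of a cell to its predicted state 100,000 years later. The architecture, input features, and training data here are illustrative placeholders, not the authors' actual ASURA-FDPS-ML setup:

    import torch
    import torch.nn as nn

    # Minimal sketch of a supernova-feedback surrogate. Inputs (assumed):
    # density, temperature, and 3D offset from the supernova. Outputs (as
    # described in the article): density, temperature, and 3D velocity.
    class FeedbackSurrogate(nn.Module):
        def __init__(self, n_in=5, n_out=5, hidden=128):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(n_in, hidden), nn.ReLU(),
                nn.Linear(hidden, hidden), nn.ReLU(),
                nn.Linear(hidden, n_out),
            )

        def forward(self, x):
            return self.net(x)

    # Stand-in training data; in the study this came from 300
    # high-resolution simulations of a supernova in a million-solar-mass
    # molecular cloud.
    inputs = torch.randn(4096, 5)
    targets = torch.randn(4096, 5)

    model = FeedbackSurrogate()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.MSELoss()

    for epoch in range(5):
        opt.zero_grad()
        loss = loss_fn(model(inputs), targets)
        loss.backward()
        opt.step()

    # At simulation time, the trained surrogate replaces the expensive
    # short-timestep integration of the blast, letting the main code
    # continue with its coarser global timestep.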
According to Hirashima, "our AI-assisted framework will allow high-resolution star-by-star simulations of heavy galaxies, such as the Milky Way, with the goal of predicting the origin of the solar system and the elements essential for the birth of life."
Currently, the lab is using the new framework to run a Milky Way-sized galaxy simulation.
More information: Keiya Hirashima et al, ASURA-FDPS-ML: Star-by-star Galaxy Simulations Accelerated by Surrogate Modeling for Supernova Feedback, The Astrophysical Journal (2025).
Provided by RIKEN