Artificial intelligence dramatically speeds up simulations of galaxy evolution

Simulating the evolution of a galaxy, when you need a snapshot of the entire system at relatively short intervals, is an extremely time-consuming task even on a supercomputer. However, artificial intelligence, in the form of a machine learning algorithm, can help.

Galaxy simulation. Source: sketchfab.com

Simulating galaxy evolution

Researchers used machine learning to significantly reduce the computation time of simulations that follow the evolution of galaxies together with supernova explosions. The approach could help us understand the origins of our own galaxy and, in particular, of the elements necessary for life in the Milky Way.

The team was led by Keiya Hirashima from the RIKEN Center for Interdisciplinary Theoretical and Mathematical Sciences (iTHEMS) in Japan, together with colleagues from the Max Planck Institute for Astrophysics (MPA) and the Flatiron Institute.

Understanding how galaxies form is a central problem in astrophysics. Although we know that powerful events such as supernovae can drive the evolution of galaxies, we cannot simply look at the night sky and watch it happen.

Conditions for successful simulation

Scientists rely on numerical simulations based on large amounts of data collected using telescopes and other instruments that measure the parameters of interstellar space. Simulations have to take into account gravity and hydrodynamics, as well as other complex aspects of astrophysical thermochemistry.

In addition, the simulations need high temporal resolution: the time between successive 3D snapshots of the evolving galaxy must be short enough not to miss critical events. For example, capturing the initial phase of a supernova shell's expansion requires a time scale of only hundreds of years, roughly 1,000 times shorter than what typical simulations of interstellar space can achieve.

In practice, a typical supercomputer needs one to two years to simulate even a relatively small galaxy at adequate temporal resolution.

Overcoming the temporal resolution bottleneck

Overcoming this bottleneck in the time step was the main goal of the new study. By integrating artificial intelligence into their data-driven model, the research team obtained results similar to those of a previous dwarf galaxy model, but much faster.

“When we use our artificial intelligence model, the simulation runs about four times faster than a standard numerical simulation,” says Hirashima.

“This corresponds to a reduction in calculation time of several months to half a year. Critically, our AI-assisted simulation was able to reproduce the dynamics important for capturing galaxy evolution and matter cycles, including star formation and galactic outflows.”

Neural network

Like most machine learning models, the researchers' model is trained on one data set and can then predict results for new data. In this case, the model incorporated a neural network and was trained on 300 simulations of an isolated supernova expanding in a molecular cloud with a mass one million times that of our Sun.

After training, the model could predict the density, temperature, and velocity of the gas 100,000 years after a supernova explosion. Compared with direct numerical simulations run on supercomputers, the new model produced similar structures and star formation histories while taking only a quarter of the time to run.
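
To illustrate the general idea (this is only a rough sketch, not the team's actual code, architecture, or data), a surrogate of this kind can be trained to map the gas state around an exploding star to the state roughly 100,000 years later, letting the galaxy simulation skip the expensive fine-time-step integration of that interval. All names, data shapes, and hyperparameters below are illustrative assumptions.

```python
# Minimal sketch of a supernova surrogate model (illustrative only, not the
# authors' method): a small neural network learns, from precomputed supernova
# simulations, to predict the gas state ~100,000 years after the explosion.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

# Assumed toy data layout: each of 300 training simulations is a flattened grid
# of gas cells, each cell described by (density, temperature, vx, vy, vz)
# at the moment of the explosion (input) and 100,000 years later (target).
N_SIMS, N_CELLS, N_FEATURES = 300, 512, 5
inputs = torch.randn(N_SIMS, N_CELLS * N_FEATURES)    # placeholder for real simulation data
targets = torch.randn(N_SIMS, N_CELLS * N_FEATURES)   # placeholder for real simulation data

class SupernovaSurrogate(nn.Module):
    """Small fully connected network mapping the pre-explosion gas state
    to a predicted post-explosion state (a toy stand-in for the real model)."""
    def __init__(self, n_in: int, n_hidden: int = 1024):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_in, n_hidden), nn.ReLU(),
            nn.Linear(n_hidden, n_hidden), nn.ReLU(),
            nn.Linear(n_hidden, n_in),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

model = SupernovaSurrogate(N_CELLS * N_FEATURES)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.MSELoss()
loader = DataLoader(TensorDataset(inputs, targets), batch_size=16, shuffle=True)

for epoch in range(10):  # short run, for illustration only
    for x, y in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        optimizer.step()
    print(f"epoch {epoch}: loss {loss.item():.4f}")

# Once trained, the surrogate would replace the expensive fine-time-step
# integration: cells affected by a supernova are handed to the network, which
# returns their approximate state ~100,000 years later in a single call.
```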

The laboratory is now using the new system to simulate a galaxy the size of the Milky Way.

According to phys.org
