Photo: Deng Xiaoci/GT

An international research team led by Chinese scientists unveiled the first batch of findings from the largest cosmological simulation to date, codenamed "HyperMillennium," on Thursday. The development has been hailed by leading scientists worldwide as a breakthrough ushering in a new era in the study of the universe, Global Times reporters learned at a press conference held by the National Astronomical Observatories of the Chinese Academy of Sciences (NAOC) the same day.

Wang Qiao, a research fellow at the NAOC, presented the simulation at the press conference, explaining that after the Big Bang, the universe evolved from an extremely homogeneous state into a web-like structure. In the "HyperMillennium" simulation, the research team used 4.2 trillion virtual particles to model the formation and evolution of the entire cosmic structure across the universe's 13.8-billion-year history.

Photo: courtesy of the National Astronomical Observatories of the Chinese Academy of Sciences (NAOC)

According to a press release provided by the NAOC, the simulation covers a cube with a side length of 12 billion light-years and uses 4.2 trillion virtual dark matter particles. By applying a technique called N-body numerical simulation, the team accurately recreated how large-scale structures in the universe evolved over 10 billion years. In simple terms, they built a virtual universe inside a supercomputer, starting from just after the Big Bang and following the force of gravity step by step, the release read.
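For readers curious what "following the force of gravity step by step" means in practice, below is a minimal, illustrative sketch of an N-body integrator in Python. It is not the team's actual code: production cosmological simulations rely on tree or particle-mesh algorithms, comoving coordinates and periodic boundary conditions to scale to trillions of particles, and every name and parameter here is an assumption for demonstration only.

```python
import numpy as np

# Toy direct-summation N-body integrator (kick-drift-kick leapfrog).
# Illustrative only: real cosmological codes use far more sophisticated
# algorithms to handle trillions of particles; all values below are
# arbitrary code units chosen for demonstration.

G = 1.0           # gravitational constant in code units
SOFTENING = 0.05  # softening length to avoid singular close encounters

def accelerations(pos, mass):
    """Pairwise gravitational acceleration on every particle, O(N^2)."""
    diff = pos[None, :, :] - pos[:, None, :]           # r_j - r_i
    dist2 = (diff ** 2).sum(axis=-1) + SOFTENING ** 2  # softened |r|^2
    inv_r3 = dist2 ** -1.5
    np.fill_diagonal(inv_r3, 0.0)                      # no self-force
    return G * (diff * inv_r3[:, :, None] * mass[None, :, None]).sum(axis=1)

def leapfrog(pos, vel, mass, dt, n_steps):
    """Advance the system with the kick-drift-kick leapfrog scheme."""
    acc = accelerations(pos, mass)
    for _ in range(n_steps):
        vel += 0.5 * dt * acc           # half kick
        pos += dt * vel                 # drift
        acc = accelerations(pos, mass)
        vel += 0.5 * dt * acc           # half kick
    return pos, vel

if __name__ == "__main__":
    rng = np.random.default_rng(42)
    n = 1000                                   # tiny next to 4.2 trillion
    pos = rng.uniform(-1.0, 1.0, size=(n, 3))  # near-uniform initial state
    vel = np.zeros((n, 3))
    mass = np.full(n, 1.0 / n)
    pos, vel = leapfrog(pos, vel, mass, dt=0.01, n_steps=100)
    # Gravity amplifies the tiny initial inhomogeneities, clustering the
    # particles into knots and filaments reminiscent of the cosmic web.
```

Even at this toy scale, the mechanism is the same one the press release describes: gravity, applied step by step, turns a nearly uniform distribution of matter into clumps and filaments.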

This provides theoretical underpinning for research into dark matter and dark energy, and offers strong support for new-generation galaxy survey programs such as the China Space Station Telescope (CSST) and the European Space Agency's Euclid mission, according to the NAOC.

Domestically developed Chinese supercomputers and self-developed software called photoNs played an important role in running the large-scale simulation. After more than 10 years of work on algorithms and optimization, the team achieved efficient calculations using more than 10,000 accelerator cards. The project consumed more than 100 million CPU core-hours and 10 million accelerator-card hours, and produced approximately 13 petabytes of raw and processed data, per the NAOC.

"We are entering an era where surveys of enormous cosmological volumes have the potential to revolutionize our understanding of dark energy, cosmological inflation, and the properties of neutrinos," said Mike Boylan-Kolchin, a professor at the University of Texas at Austin in the US, who hailed the simulation as a "computational marvel."

“For this to happen, we need advanced theoretical tools, and the HyperMillennium Simulation is a computational marvel that will help unlock fundamental physics from observations of the cosmos. It has an unprecedented range of volume and mass resolution, enabling detailed predictions about how huge numbers of relatively common galaxies are distributed across the cosmic web and the properties of inherently rare and interesting objects that are inaccessible with smaller volumes. The HyperMillennium Simulation will be a touchstone for the galaxy formation and cosmology communities for years to come,” the professor said. 

"The HyperMillennium simulation redefines what is nowadays possible in numerical cosmology. I am extremely impressed that the team could realize this incredibly large and highly accurate simulation. Its enormous statistical power allows us to carry out new precision tests of the LambdaCDM cosmological model, something that is very important for the field," said Volker Springel, the director of the Max Planck Institute for Astrophysics in Germany.

The first research paper from the project was recently published in the journal Monthly Notices of the Royal Astronomical Society. According to the NAOC, the first batch of simulation data has already been released to the global scientific community through the National Astronomical Data Center, a platform for astronomy research, education and data-driven applications.

 
