March 22, 2021 – Over the past decade, a code project born in the Computing Sciences area of Lawrence Berkeley National Laboratory has helped advance the field of cosmology and prepare it for the era of exascale computing.
Nyx – an adaptive mesh, massively parallel cosmological simulation code designed to help study the universe at its largest scales – has become an essential tool for research into some of its smallest and most detailed features, enabling critical breakthroughs in the understanding of dark matter, dark energy, and the intergalactic medium.
Nyx dates back to 2010, when Peter Nugent, head of the computational science department in Berkeley Lab's Computational Research Division (CRD), approached CRD senior scientist Ann Almgren about adapting Castro – an adaptive mesh astrophysics simulation code built on the AMReX software framework – for cosmology. The pair hatched a plan to create an adaptive mesh code that could represent dark matter as particles interacting with hydrogen gas while also capturing the expansion of the universe.
“It literally started with a conversation,” recalled Almgren, now group leader of CRD’s Center for Computational Sciences and Engineering. “I still remember when Peter first brought up the idea. The collaboration started with, ‘Hey, can you do that?’”
In 2011, funding from the Laboratory Directed Research and Development (LDRD) program enabled the team to begin work on creating Nyx. Berkeley Lab’s LDRD program is designed to incubate promising early-stage lab projects, providing a bridge between initial concepts and large-scale Department of Energy (DOE) funded projects.
Among the first members of the Nyx team was Computational Cosmology Center scientist Zarija Lukic, who took charge of creating the physical simulation elements of Nyx. Among other things, Lukic would help write the 2013 paper that introduced Nyx to the scientific community and would steer the code toward studies of the intergalactic medium and the Lyman-alpha forest. Soon after, Nyx transitioned from the LDRD program to the DOE’s Scientific Discovery through Advanced Computing (SciDAC) program, which links research efforts for scientific applications with high performance computing (HPC) technology.
Nyx began to deliver immediate results, and one of the biggest benefits of the code became evident: scalability. From its earliest days, Nyx was designed to take advantage of all types and scales of hardware on its host machine, and Nyx simulations have proven crucial in enabling cosmologists to produce models of the universe at an unprecedented scale. Over time, this has allowed researchers to get the most out of the supercomputers that run it – from CPU-only systems to heterogeneous systems containing both CPUs and GPUs.
“The most important thing is our ability to scale,” Nugent said. “Because we can take advantage of the entire machine, CPU or GPU, we can use a very large memory footprint and perform the largest of these types of simulations in terms of universe size at the highest resolutions.”
Exploring the Lyman-alpha forest
One of the first large-scale applications of Nyx involved studies of the Lyman-alpha forest, which remains the primary application area of the code. The forest is made up of a series of absorption lines created when light from distant quasars far outside the Milky Way travels billions of light-years toward us, passing through the gas residing between galaxies. By examining the light spectrum of the forest and the distortions imprinted as this light travels the vast distances to Earth, cosmologists can map the structure of intergalactic gas to better understand what the universe is made of and what it looked like after the Big Bang. Perhaps most interestingly, distortions in the light spectrum, as will be observed with the Dark Energy Spectroscopic Instrument (DESI) and high-resolution spectrographs like the one mounted on the Keck Telescope, can provide insight into the nature of dark matter and neutrinos.
But forest simulations pose a huge computational challenge, as they require recreating massive sectors of space – in some cases up to 500 million light-years across – while also being able to calculate the behavior of the small density fluctuations the light passes through along the way.
Enter Nyx. Adaptive mesh refinement (AMR) allows the computer to determine which parts of the simulated universe need detailed calculations and where coarser, more general results are sufficiently accurate. This reduces the number of calculations and the memory required, as well as the computation time for large, complex simulations. Using components from AMReX, the code is able to scale to model the vast volumes probed by the Lyman-alpha forest.
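The idea behind AMR can be sketched in a few lines: tag the cells of a coarse grid where some criterion (here, a simple overdensity threshold, a hypothetical stand-in for Nyx's actual refinement criteria) is met, and subdivide only those cells into finer ones. This toy 1-D example is not the Nyx/AMReX implementation, just an illustration of the concept:

```python
# Toy illustration of adaptive mesh refinement (AMR): resolve only the
# regions where the field varies strongly. Hypothetical criterion and
# values; NOT the actual Nyx/AMReX implementation.

def tag_cells_for_refinement(density, threshold):
    """Mark coarse cells whose density exceeds a threshold."""
    return [i for i, rho in enumerate(density) if rho > threshold]

def refine(density, tagged, ratio=2):
    """Replace each tagged coarse cell with `ratio` finer cells
    (children simply inherit the coarse value in this sketch)."""
    tagged = set(tagged)
    fine = []
    for i, rho in enumerate(density):
        if i in tagged:
            fine.extend([rho] * ratio)  # finer cells cover the same region
        else:
            fine.append(rho)
    return fine

# A mostly smooth density field with one overdense region (e.g. a filament)
coarse = [1.0, 1.1, 1.0, 8.5, 9.2, 1.2, 1.0]
tags = tag_cells_for_refinement(coarse, threshold=5.0)
fine_grid = refine(coarse, tags)
print(tags)            # only cells 3 and 4 exceed the threshold
print(len(fine_grid))  # 7 coarse cells -> 9 cells after refining 2 of them 2x
```

Most of the computational and memory savings come from the cells that are *not* refined: the bulk of a cosmological volume is close to the mean density and can be evolved on the coarse grid.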
“In 2014 and 2015, we were doing simulations that are still state of the art today,” Lukic said.
Another key aspect of Nyx’s popularity is that it is open source, which has been essential in building a larger community for the code outside Berkeley Lab. Today, research teams everywhere are finding new applications for Nyx, using the code for smaller-scale simulations and experiments. In some cases Nyx is used as is; in others, researchers modify the source code to meet their own needs.
“People have used it to do unique galaxy simulations,” Nugent said. “People have used it to do simulations much earlier in the universe’s history, and much later.”
Ready for the next generation
As the scientific community prepares to enter the exascale computing age, Nyx shows no signs of slowing down. Ongoing code development is supported by the DOE Exascale Computing Project, and Nyx is expected to play a key supporting role in the highly anticipated DESI experiment, performing simulations to support DESI’s observations of the role dark energy plays in the expansion of the universe.
Even with the next-generation supercomputers that will be used for DESI, the ability of the Nyx code to get the most out of the hardware will be crucial for performing simulations accurate enough to verify the results. Postdoctoral researcher Jean Sexton has spent much of the past year ensuring that Nyx will stay on the cutting edge of technology and remain ready to tackle the next round of problems.
“If you don’t have good efficiency, scalability, and physical accuracy, you won’t be able to produce the simulations needed to get an accurate representation of the data,” Lukic said. “You won’t be able to extract scientific conclusions from future sky surveys.”
Nyx is also, quite literally, expected to be at the forefront of Berkeley Lab’s newest supercomputer, Perlmutter, which will be located at the National Energy Research Scientific Computing Center (NERSC). When unveiled this year, Perlmutter will feature artwork generated from a Nyx simulation illustrating the filaments that connect large clusters of galaxies. The Nyx code will also likely do important work on Perlmutter and other next-generation supercomputers, including those at the exascale.
Ultimately, Nyx will remain a shining example of how Berkeley Lab can develop a project from its inception, through the LDRD program, into DOE funding, and finally release it to the wider scientific community. In the space of 10 years, the Nyx code has grown from a conversation between lab staff members into a mainstay of computational cosmology and a key part of the next generation of high performance computing research into how the universe works. For Almgren, who was there from the start, Nyx underlines one of the Lab’s greatest strengths.
“I think that’s one of the things the lab does well – it allows people to do collaborations that advance science much more effectively,” she said.
NERSC is a U.S. Department of Energy user facility.
Source: Shaun Nichols, Lawrence Berkeley National Laboratory