The universe… and all the rest
As part of an international consortium, astronomers from the Max Planck Institute for Astrophysics and their colleagues have used a CRAY T3E supercomputer to perform the largest simulation of its kind. They studied how matter in a large part of the observable universe - the so-called Hubble volume - evolved into the complex web of walls and filaments revealed by recent maps of the galaxy distribution. The growing speed and storage capacity of computers have made it possible to simulate the growth of cosmic structure in ever greater detail and over ever larger volumes. In recent weeks, an international consortium of astronomers used the 688-processor parallel mainframe CRAY T3E at the computing center of the Max Planck Society in Garching to model these processes.
At the epoch we observe when measuring the cosmic microwave background radiation, our cosmos was very smooth. The faint pattern seen in the intensity distribution of this radiation is consistent with the inflationary expansion of the Universe in the first moments after the Big Bang. Scientists believe that the evolution from this smooth state to today's granular structure is driven by gravity. In their view, much of the matter responsible is present in a "dark" form that has not yet been identified more closely. The currently most popular idea is that this matter consists of free elementary particles of a type never detected on Earth. Through their gravity, the particles of this so-called cold dark matter (CDM) drive the formation of galaxies - and thus, indirectly, of stars, planets and people.
Dark matter is affected only by gravity. A computer can therefore be programmed to track the movements of the matter as the universe expands and ages, growing ever clumpier. In practice, it is difficult to simulate the growth of clumps and voids within a very large volume: the computer must follow an extremely large number of dark matter particles in order to represent all the structures faithfully. For the Hubble volume calculation, the computer simulated the evolution of a billion dark matter particles from the epoch probed by the background radiation to the present day. The starting point was the near-uniform state predicted by the theory of the inflationary universe. The aim of the simulation was to test whether the present-day distribution of matter generated according to these theories agrees with the patterns recorded in the largest existing (and planned) maps of the galaxy distribution.
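To make the principle concrete: since only gravity acts on the dark matter, the heart of such a program is an N-body integrator that repeatedly computes the mutual gravitational forces and advances the particles in time. The sketch below is a deliberately tiny illustration in Python - a direct-summation scheme in ordinary, non-expanding space, with code-unit constants of my own choosing - whereas the actual Hubble volume run used a far more efficient, massively parallel algorithm to handle its billion particles.

```python
# Toy illustration of the principle behind such simulations: dark matter
# responds only to gravity, so the particles are advanced under their
# mutual attraction. All constants here are assumptions for illustration.
import numpy as np

G = 1.0           # gravitational constant in code units (assumption)
SOFTENING = 0.05  # softening length to avoid infinite forces (assumption)

def accelerations(pos, mass):
    """Direct O(N^2) summation of gravitational accelerations."""
    # pairwise separation vectors: diff[i, j] = pos[j] - pos[i]
    diff = pos[None, :, :] - pos[:, None, :]
    dist2 = (diff ** 2).sum(axis=2) + SOFTENING ** 2
    inv_r3 = dist2 ** -1.5
    np.fill_diagonal(inv_r3, 0.0)  # no self-force
    return G * (diff * (mass[None, :, None] * inv_r3[:, :, None])).sum(axis=1)

def evolve(pos, vel, mass, dt, n_steps):
    """Advance the system with a leapfrog (kick-drift-kick) integrator."""
    acc = accelerations(pos, mass)
    for _ in range(n_steps):
        vel += 0.5 * dt * acc   # half kick
        pos += dt * vel         # drift
        acc = accelerations(pos, mass)
        vel += 0.5 * dt * acc   # half kick
    return pos, vel

# Start from a nearly uniform state, as inflation theory suggests:
rng = np.random.default_rng(42)
n = 500
pos = rng.uniform(0.0, 1.0, size=(n, 3))  # near-uniform positions
vel = np.zeros((n, 3))                    # initially at rest
mass = np.full(n, 1.0 / n)
pos, vel = evolve(pos, vel, mass, dt=0.001, n_steps=200)
# Over many steps, gravity amplifies the tiny initial irregularities into
# clumps - on a toy scale, the same mechanism that forms the cosmic web
# of walls and filaments in the full simulation.
```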
The calculations used the entire storage capacity of the T3E in Garching, one of the ten most powerful mainframe computers in the world. A year of preparation was required to adapt the computer programs to a parallel machine, followed by months of work to maximize the efficiency with which it stores and manipulates data. Within just 72 hours, the computer then produced almost a terabyte of output data - roughly 100 numbers for every person on Earth, enough to fill 800 CD-ROMs. Such massive amounts of data are inherently difficult to handle, and simply manipulating the results of the Hubble volume simulation and producing images from them has become a major computing challenge in its own right.
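A rough back-of-envelope estimate suggests why the full machine was needed. Assuming - my assumption, not stated in the article - that each of the billion particles carries six single-precision phase-space coordinates:

```python
# Back-of-envelope estimate of the memory demands of a billion particles.
# Assumptions (mine, not from the article): 4-byte single-precision
# floats and six phase-space coordinates (position + velocity) each.
N_PARTICLES = 10**9     # a billion dark matter particles, as in the run
COORDS = 6              # 3 position components + 3 velocity components
BYTES_PER_FLOAT = 4

snapshot_bytes = N_PARTICLES * COORDS * BYTES_PER_FLOAT
print(f"One snapshot of raw particle data: {snapshot_bytes / 1e9:.0f} GB")
# -> about 24 GB for a single snapshot, before any workspace for the
#    force calculation; writing out many such snapshots over the course
#    of the run quickly accumulates toward a terabyte of output.
```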
Setting up, running and analyzing such a simulation requires the efforts of many people, and the UK-based Virgo Consortium (the group responsible for the computation in Garching) includes scientists from four countries. Canadian and British scientists wrote the original computer programs; these were completely restructured for the massively parallel CRAY T3E and adapted to the "Hubble volume" problem at the computing center in Garching. Scientists from the Max Planck Institute for Astrophysics were responsible for running the calculation and handling the output data, while an American scientist coordinated the overall design of the experiment. The results will be studied in all four countries as well as in France.
The analysis of the Hubble volume simulation is still at an early stage, but it has already produced some impressive results. Cross-sections through the matter distribution show patterns similar to those seen on the bottom of a sunlit swimming pool. Walls and filaments of dark matter are visible that are so large they will barely be detectable even in the largest galaxy mapping projects. The most extensive low-density regions are so vast that correspondingly huge empty patches are sure to show up in the galaxy distribution charted by the next generation of mapping projects. The most massive dark matter clusters contain several times as much matter as the largest known galaxy clusters, suggesting that the true "giant clusters" are likely still awaiting discovery.
Today's computers are so powerful that calculations such as the one described here can produce images of artificial universes at a level of detail matching a galaxy survey of the entire visible universe. Of course, only a comparison with real surveys can show whether these images - and the physical hypotheses underlying them - faithfully represent reality.
Images of cross sections of the simulated Universe can be found on the website of the Max Planck Institute for Astrophysics.