The simulated Universe


The evolution of the Universe can be simulated using supercomputers capable of performing billions of operations per second and storing millions of billions of bytes of data (as much as thousands of CD-ROMs could hold). The huge amount of data from astronomical observations can be analysed to recreate the evolution of the Universe, from the Big Bang to the present.
Technological innovations have driven progress in these studies. Astronomical observations are valuable because light travels at a large but finite speed (about 300,000 km/s), so when we observe a celestial body we are looking back in time: we see it literally as it was in the distant past. Computer simulations make it possible to link astronomical data from different epochs and so reconstruct the evolution of the Universe. The evolution is simulated by starting from the conditions assumed to have held in the early Universe and then applying the laws of physics. The assumptions are tested by comparing the simulated Universe with the real one, and modified if necessary. Because of the enormous scale involved, a complete interpretation of astronomical data requires computing power capable of running algorithms that take into account all the relevant interactions and effects.
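The idea of "starting from initial conditions and applying the laws of physics" can be illustrated with a toy gravitational N-body step. This is only a minimal sketch in arbitrary units, not a real cosmological code: the masses, softening length, and leapfrog integrator are illustrative choices.

```python
import numpy as np

G = 1.0          # gravitational constant (arbitrary units)
SOFTENING = 0.1  # softening length, avoids infinite forces at zero separation

def accelerations(pos, mass):
    """Pairwise Newtonian gravitational accelerations for N bodies.

    pos: (N, 3) array of positions; mass: (N,) array of masses.
    """
    diff = pos[np.newaxis, :, :] - pos[:, np.newaxis, :]   # diff[i, j] = pos[j] - pos[i]
    dist2 = (diff ** 2).sum(axis=-1) + SOFTENING ** 2
    inv_d3 = dist2 ** -1.5
    np.fill_diagonal(inv_d3, 0.0)                          # a body exerts no force on itself
    return G * (diff * (mass[np.newaxis, :, None] * inv_d3[:, :, None])).sum(axis=1)

def leapfrog_step(pos, vel, mass, dt):
    """Advance the system by one kick-drift-kick leapfrog step."""
    acc = accelerations(pos, mass)
    vel_half = vel + 0.5 * dt * acc
    pos_new = pos + dt * vel_half
    vel_new = vel_half + 0.5 * dt * accelerations(pos_new, mass)
    return pos_new, vel_new

# Initial conditions: two equal masses at rest, 2 units apart on the x axis.
pos = np.array([[-1.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
vel = np.zeros_like(pos)
mass = np.array([1.0, 1.0])

# "Apply the laws of physics" repeatedly: the bodies fall toward each other.
for _ in range(100):
    pos, vel = leapfrog_step(pos, vel, mass, dt=0.01)
```

Real simulations do the same thing in principle, but for millions of particles, with expansion of space and more physics included, which is what demands supercomputers.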
This type of work becomes effective only if the simulated evolution of the Universe can be compared with the real one. That requires telescopes capable of observing the Universe of billions of years ago, which means observing very faint celestial bodies. To reconstruct the structure of the Universe of billions of years ago we need to know how galaxies were distributed in space. With the most powerful telescopes, the distance of galaxies can be deduced by measuring the shift of their emitted radiation towards longer wavelengths: the greater the shift, the greater the distance and the fainter the galaxies appear (i.e., the older the light we receive from them). Combining the distance of galaxies with their position in the sky yields a three-dimensional map of their spatial distribution.
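The combination of redshift-derived distance and sky position can be sketched as follows. This is a simplification under stated assumptions: it uses the low-redshift Hubble law d ≈ c·z / H0 with an assumed H0 of 70 km/s/Mpc, whereas real surveys use a full cosmological model.

```python
import math

C_KM_S = 299_792.458   # speed of light in km/s
H0 = 70.0              # Hubble constant in km/s/Mpc (assumed illustrative value)

def distance_mpc(z):
    """Approximate distance in megaparsecs from redshift z (valid for small z)."""
    return C_KM_S * z / H0

def sky_to_cartesian(ra_deg, dec_deg, z):
    """Combine a redshift distance with sky coordinates (right ascension,
    declination, in degrees) to get Cartesian (x, y, z) coordinates in Mpc."""
    d = distance_mpc(z)
    ra, dec = math.radians(ra_deg), math.radians(dec_deg)
    return (d * math.cos(dec) * math.cos(ra),
            d * math.cos(dec) * math.sin(ra),
            d * math.sin(dec))

# A galaxy at redshift 0.023 lies roughly 100 Mpc away in this approximation.
x, y, z3d = sky_to_cartesian(0.0, 0.0, 0.023)
```

Applying this conversion to every galaxy in a survey catalogue is what turns a list of sky positions and redshifts into the three-dimensional map described above.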
Even images from telescopes such as the Hubble Space Telescope are reconstructions, obtained through complex algorithms, of the bits collected (essentially strings of zeros and ones). Only part of the large amount of incoming data contains the 'exposure', which once processed becomes an image. These images, initially in black and white, are coloured artificially according to the chemical elements detected by spectrometers. The black-and-white images are the raw material: from them, three pictures are usually formed in the primary colours (red, green, blue), which are then combined to produce the intermediate shades. This processing is rather lengthy and can take a few months. The colouring reveals hidden details. This kind of work does not require supercomputing, only ordinary computers and commercial software.
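The channel-combination step described above can be sketched in a few lines. The tiny synthetic arrays and the simple peak normalisation are illustrative assumptions, not any telescope's actual pipeline, which involves careful calibration and stretching of each exposure.

```python
import numpy as np

def combine_channels(red, green, blue):
    """Stack three grayscale 2-D exposures into one (H, W, 3) RGB image,
    normalised to the 0..1 range so it can be displayed directly."""
    rgb = np.stack([red, green, blue], axis=-1).astype(np.float64)
    peak = rgb.max()
    return rgb / peak if peak > 0 else rgb

# Tiny synthetic 2x2 "exposures" standing in for three filtered images.
r = np.array([[1.0, 0.0], [0.0, 0.0]])
g = np.array([[0.0, 2.0], [0.0, 0.0]])
b = np.array([[0.0, 0.0], [4.0, 0.0]])

img = combine_channels(r, g, b)   # shape (2, 2, 3), values in [0, 1]
```

Intermediate shades arise automatically wherever the three channels overlap with different intensities, which is how the combined picture acquires its full range of colours.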