arXiv:astro-ph/0405454
The law of evolution of energy density in the universe imposed by quantum cosmology and its consequences
V. E. Kuzmichev, V. V. Kuzmichev
Published 2004-05-24 (Version 1)
The quantum model of a homogeneous and isotropic universe filled with a uniform scalar field is considered. This model predicts an effective inverse square-law dependence of the mean total energy density <\rho> on the expectation value of the cosmological scale factor <a>, where the averaging is performed over a state with large quantum numbers. Such a law of decrease of <\rho> during the expansion of the universe makes it possible to describe the observed coordinate distances to type Ia supernovae and radio galaxies in the redshift interval z = 0.01 - 1.8. A comparison is made with phenomenological models with a cosmological constant (\Lambda CDM) and with a zero dark energy component (\Omega_{M} = 1). It is shown that the observed small deviations of the coordinate distances to some sources from the predictions of this simple quantum model can be explained by fluctuations \delta a of the scale factor about the average value <a>. These fluctuations can arise from the finite widths of quasistationary states in the early universe. During the expansion the fluctuations \delta a grow with time and manifest themselves as the observed relative increase or decrease of coordinate distances. The amplitudes of the fluctuations \delta a/<a> calculated from the observed positions of individual supernovae are in good agreement with their estimates in quantum theory. Possible consequences of these conclusions of quantum theory are discussed.
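The comparison sketched in the abstract can be reproduced qualitatively with a short numerical exercise. If <\rho> scales as <a>^{-2}, the Friedmann equation H^2 \propto \rho gives H \propto 1/a, i.e. E(z) = H(z)/H_0 = 1 + z (a coasting expansion), and the dimensionless coordinate distance in a flat universe is y(z) = (H_0/c) r(z) = \int_0^z dz'/E(z'). The Python sketch below (not taken from the paper) evaluates y(z) for this law, for a flat \Lambda CDM model, and for a matter-only \Omega_M = 1 model; the density parameters \Omega_M = 0.3, \Omega_\Lambda = 0.7, the flat-geometry assumption, and the function names are illustrative choices, not values quoted by the authors.

import numpy as np
from scipy.integrate import quad

# Dimensionless Hubble rates E(z) = H(z)/H0 for three expansion laws (flat geometry assumed).
def E_quantum(z):
    # <rho> ~ <a>^-2  =>  H^2 ~ a^-2  =>  E(z) = 1 + z (coasting expansion)
    return 1.0 + z

def E_lcdm(z, omega_m=0.3, omega_l=0.7):
    # Flat Lambda-CDM; parameter values are illustrative assumptions
    return np.sqrt(omega_m * (1.0 + z) ** 3 + omega_l)

def E_matter(z):
    # Matter-only model with Omega_M = 1
    return (1.0 + z) ** 1.5

def coordinate_distance(z, E):
    # Dimensionless coordinate distance y(z) = (H0/c) r(z) = int_0^z dz'/E(z')
    y, _ = quad(lambda zp: 1.0 / E(zp), 0.0, z)
    return y

if __name__ == "__main__":
    for z in (0.01, 0.5, 1.0, 1.8):  # redshift range quoted in the abstract
        print(f"z = {z:4.2f}:  quantum {coordinate_distance(z, E_quantum):.4f}"
              f"  LCDM {coordinate_distance(z, E_lcdm):.4f}"
              f"  Omega_M=1 {coordinate_distance(z, E_matter):.4f}")

With these illustrative parameters the coasting-law distances fall between the \Lambda CDM and \Omega_M = 1 values over the quoted redshift interval, which is the kind of intermediate behaviour against which the supernova and radio-galaxy data are compared.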