This excess also appears in modern cosmological simulations, but its magnitude there is much smaller than in the observations. The large discrepancy could mean that massive galaxy groups in the real Universe are younger than previously thought, which challenges the current cosmological model and could shed light on the Hubble tension problem.
Measurements of the cosmic microwave background show that the Hubble constant, which relates a galaxy's recession velocity to its distance, is 67.31 kilometers per second per megaparsec. At the same time, other methods, including those based on measuring the distances to Type Ia supernovae (which serve as standard candles) in galaxies, yield a different value of the Hubble constant, from 73.3 to 76.5 kilometers per second per megaparsec. This significant discrepancy, known as the Hubble tension, is considered by some scientists to be a sign of a crisis in the standard ΛCDM cosmological model.
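The disagreement described above can be made concrete with Hubble's law, v = H₀·d, which relates a galaxy's recession velocity v to its distance d. The sketch below uses the H₀ values quoted in the text; the illustrative distance of 100 megaparsecs is an assumption chosen for the example.

```python
# Hubble's law: recession velocity v = H0 * d (km/s, with d in Mpc).
H0_CMB = 67.31      # km/s/Mpc, from cosmic microwave background measurements
H0_SN_LOW = 73.3    # km/s/Mpc, lower end of the supernova-based range
H0_SN_HIGH = 76.5   # km/s/Mpc, upper end of the supernova-based range

d = 100.0  # Mpc, an illustrative distance (assumed for this example)

v_cmb = H0_CMB * d       # predicted velocity using the CMB value
v_sn = H0_SN_LOW * d     # predicted velocity using the lower supernova value

print(f"v with CMB H0:       {v_cmb:.0f} km/s")
print(f"v with supernova H0: {v_sn:.0f} km/s")
print(f"relative gap:        {(v_sn - v_cmb) / v_cmb:.1%}")
```

Even at the lower end of the supernova-based range, the two predictions for the same galaxy differ by roughly nine percent, which is far larger than the stated measurement uncertainties of either method.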