Nature and Nature’s laws lay hid in night:
God said ‘Let Newton be!’ and all was light…
Alexander Pope
…It did not last: the Devil howling ‘Ho!
Let Einstein be,’ restored the status quo.
John Collings Squire
If you catch an astrophysicist-cosmologist somewhere and ask him how fast the Universe is expanding, he will most likely say that the question is ill-posed. After all, the answer depends on how deep into space you look. The conversation will then probably turn to galaxies receding due to the expansion of space and the Hubble-Lemaître law, and from there it is not far to the Hubble tension, one of the most pressing problems in modern cosmology. Well, let’s talk about the pain of 21st-century cosmologists.
What exactly was it that exploded?
The most frequent cosmological questions asked by space enthusiasts go something like this: “Where is the center of the Universe?”, “When did the Big Bang end?”, “How fast is the Universe expanding?”, “How big is the Universe?”, and “What is beyond the Universe?”. Despite the interest in these aspects, numerous models, simulations, and speculations, some of these questions still have no answers, and it is difficult to say whether they ever will.

Source: NASA / WMAP Science Team / Art by Dana Berry
About 100 years ago, thanks to the efforts of Georges Lemaître (1927) and Edwin Hubble (1929), it was discovered that galaxies are, on average, moving away from us, or, as astronomers say, receding. And the recession velocity of a galaxy is directly proportional to the distance to it. This seemingly simple law is now known as Hubble’s law, or the Hubble-Lemaître law. The proportionality coefficient H0 that it contains is historically called the Hubble constant and is measured in kilometers per second per megaparsec (km/s/Mpc). In other words, it shows how fast a galaxy located at a distance of 1 megaparsec, or 3.26 million light years, is moving away from us.
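The law itself is simple enough to write out directly. Here is a minimal sketch in Python; the value of 73 km/s/Mpc is just one of the commonly quoted local measurements, used here for illustration:

```python
# Hubble-Lemaitre law: recession velocity is proportional to distance.
H0 = 73.0  # Hubble constant in km/s per megaparsec (a commonly quoted local value)

def recession_velocity(distance_mpc: float) -> float:
    """Recession velocity in km/s of a galaxy at the given distance in Mpc."""
    return H0 * distance_mpc

# A galaxy 100 Mpc away recedes at about 7300 km/s:
print(recession_velocity(100.0))  # 7300.0
```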
It is worth noting that the experimental discovery of galaxy recession was preceded by a theoretical model. In 1922, Alexander Alexandrovich Friedmann found a nonstationary solution to Albert Einstein’s equations of general relativity and predicted the expansion of the Universe. If we mentally turn back the clock, we can conclude that all matter was once extremely compactly packed. This was the idea expressed by Lemaître in 1931. A little later, Georgiy Antonovich Gamow, a supporter of Friedmann’s theory, further clarified that the early Universe had to be not only compact but also incredibly hot. This is how what was then called the dynamic evolving model of the Universe took shape.

As with any revolutionary idea, the theory developed by Friedmann, Lemaître, and Gamow was long regarded with great skepticism, and not all astrophysicists believed in it; many instead adhered to the hypothesis of “continuous birth of matter” in an expanding Universe. The famous British astrophysicist Fred Hoyle belonged to this “camp”. Ironically, it was thanks to him that the term “Big Bang” appeared. In a lecture on BBC radio on March 28, 1949, commenting dismissively on the new theory, he said the following: “These theories were based on the hypothesis that all the matter in the universe was created in one big bang at a particular time in the remote past.”
Although not immediately, after a couple of decades the term “Big Bang” gradually caught on, and now it is used not so much to describe the “beginning of the Universe” as to name the cosmological theory describing its extremely rapid expansion from an initial compact, hot state. We can still see the consequences of that era in the recession of galaxies.
How fast do galaxies run away?
Today, the Big Bang theory is generally accepted in astrophysics: it is held by the vast majority of cosmologists, with the exception of a small fringe. But, as is often the case, the devil is in the details. And in this case, it lies in determining the specific value of the Hubble constant, a key parameter of the evolution of the Universe.
The value of H0 calculated by Edwin Hubble was ~500 km/s/Mpc, while various measurements by modern astrophysicists give values of about 73 km/s/Mpc. So where does this dramatic difference come from?

Hubble’s main result was establishing the relationship between the velocity of galaxies and the distance to them. Determining the line-of-sight velocity is quite simple: it is calculated from spectra by measuring the redshift (analogous to the Doppler effect). But determining the distances to galaxies was a real problem both 100 years ago and now.
Despite all the sophistication of modern astronomical equipment, there are no direct, reliable methods of determining how distant a galaxy is. Therefore, at large distances, astrophysicists use so-called “standard candles”: objects whose intrinsic luminosity is well known. These are mostly type Ia supernovae (the explosion of a white dwarf that has accreted a critical amount of matter from a companion star) and Cepheids, although other stellar objects are sometimes used as well.
Edwin Hubble himself relied on Cepheids. These are bright pulsating supergiant (sometimes giant) stars that astronomers are very fond of because of the relationship between their pulsation period and luminosity. The period is easy to determine from observations, and by comparing the intrinsic luminosity it implies with the apparent brightness, one can calculate the distance.
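As a rough illustration of how this works, here is a toy calculation: the period gives the absolute magnitude via a period-luminosity relation (the coefficients below are approximate, illustrative V-band values), and the distance modulus then gives the distance. Extinction and all other real-world corrections are ignored:

```python
import math

def cepheid_absolute_magnitude(period_days: float) -> float:
    """Approximate V-band period-luminosity relation (illustrative coefficients)."""
    return -2.43 * (math.log10(period_days) - 1.0) - 4.05

def distance_parsecs(apparent_mag: float, absolute_mag: float) -> float:
    """Distance modulus: m - M = 5 * log10(d / 10 pc)."""
    return 10.0 ** ((apparent_mag - absolute_mag + 5.0) / 5.0)

# A Cepheid pulsating with a 10-day period, observed at apparent magnitude 14:
M = cepheid_absolute_magnitude(10.0)   # about -4.05
d = distance_parsecs(14.0, M)
print(f"{d / 1000:.0f} kpc")           # roughly 41 kpc
```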

The problem was that Hubble’s data contained several systematic errors. One was that it was not yet known that Cepheids come in different subtypes. Another was that Hubble did not take into account interstellar and intergalactic absorption of visible light. Yet another was that some variable sources the scientist took for isolated Cepheids turned out to be entire star clusters. All this led to the dramatic overestimate of the Hubble constant by Edwin Hubble himself. More recent studies give H0 values mostly between 70 and 74 km/s/Mpc.
However, it is not so simple. In 1998, Saul Perlmutter, Brian P. Schmidt, and Adam Riess discovered that type Ia supernovae in distant galaxies appeared less bright than expected. This led to the conclusion that the Universe used to expand more slowly, while now it is expanding at an accelerating rate. Something is counteracting gravity, “pushing” the galaxies apart. This mysterious force is now known as dark energy, and the three scientists shared the 2011 Nobel Prize in Physics for the discovery. At the same time, it turned out that the Hubble constant is not constant at all but changes over time, so it would be more correct to call it the Hubble parameter.
The seen and the unseen
In 1965, two American astrophysicists, Arno Penzias and Robert Wilson, while calibrating their brand-new horn antenna, discovered that the Universe was… noisy! More precisely, they recorded excess noise corresponding to an antenna temperature of ~3.5 kelvin. After gradually eliminating all possible sources, from various devices to pigeon droppings inside the horn, and making sure that the noise was constant and came evenly from all directions, the scientists recognized its cosmic origin. In 1978, Penzias and Wilson received the Nobel Prize “for their discovery of cosmic microwave background radiation.”
Not that the “noise” of the Universe came as a surprise. Its existence had been predicted by Gamow back in the late 1940s: it follows almost directly from the hot initial state of the Universe. When, ~380,000 years after the beginning of expansion, space finally became transparent to thermal radiation, photons began to fly freely in all directions. Their energy at that time corresponded to a temperature of about 3000 K. But since then they have “cooled down”, and according to Gamow’s calculations the background relic radiation should now have a temperature of ~3 K.
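The arithmetic behind this “cooling” is straightforward: expansion stretches photon wavelengths by a factor of (1 + z), lowering the radiation temperature in the same proportion. A quick check, taking the standard recombination redshift z ≈ 1090:

```python
# CMB photons were released at ~3000 K; expansion by a factor (1 + z)
# has cooled them proportionally. z = 1090 is the standard redshift
# of recombination in the LCDM model.
T_emission = 3000.0        # K, when the Universe became transparent
z_recombination = 1090.0

T_today = T_emission / (1.0 + z_recombination)
print(f"{T_today:.2f} K")  # 2.75 K, close to the measured 2.725 K
```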
This prediction turned out to be amazingly accurate. Satellites such as COBE, WMAP, and their “successor” Planck have measured the spectrum of the relic radiation with unprecedented accuracy. On average, it corresponds to a temperature of 2.725 K, with tiny inhomogeneities of no more than a few hundred microkelvin. These fluctuations reflect the density variations in the very young Universe, and George Smoot and John Mather received the 2006 Nobel Prize for their discovery (based on data from the COBE satellite).

What seems like noise to most people is a valuable source of information for astrophysicists and cosmologists. By studying these subtle fluctuations in the relic radiation within a theoretical cosmological model, scientists can calculate what the Hubble parameter should be today.
The modern standard cosmological model ΛCDM is described by a number of parameters. By varying their values, we can find a set that matches the relic radiation pattern. But here’s the problem: according to Planck data, it turns out that H0 = 67.4 ± 0.5 km/s/Mpc, and not 73, as determined by standard candles!
It would seem that a difference of less than 10% is not that dramatic. But it is statistically significant and forces us to think seriously: maybe we are missing some fundamental physics of the early Universe? The discrepancy between the values of the Hubble parameter determined in different ways worries cosmologists so much that the problem has been called the “Hubble tension”, or even the “Hubble crisis”, and has gained the reputation of one of the most difficult problems in modern astrophysics.
Where is the way out of this “rabbit hole”?
Over the past decade, numerous attempts have been made to resolve the Hubble tension, in particular with the help of the famous telescope of the same name. For example, in 2019 a study was released that aimed to “recalibrate” the cosmic distance scale. It was based on Hubble Space Telescope observations of 70 Cepheids in the Large Magellanic Cloud. The new value of the ill-fated parameter came out to 74 km/s/Mpc.
“Okay,” says the astrophysicist, “we just need a more accurate instrument.” Fortunately, such a tool now exists: the James Webb Space Telescope (JWST). Did the cooler telescope help solve the fundamental mystery? Unfortunately, no. In 2024, a group of scientists made unprecedentedly accurate measurements of distances to several galaxies by observing Cepheids and supernovae with the JWST, only to find that the result was fully consistent with the one obtained earlier from Hubble observations.

And about a month ago, an article was published whose authors, using mostly JWST observations, estimated the Hubble parameter separately for different types of standard candles. The resulting average value of 69.03 ± 1.75 km/s/Mpc seems to ease the Hubble tension, although it does not remove it completely.
To finally solve the problem, different groups of scientists have put forward various hypotheses. According to one of them, our Galaxy sits close to the edge of a giant cosmic void, a region in which galaxies are extremely rare. In other papers, the authors suggest that dark energy had different parameters in the early stages of the Universe’s evolution. This may well make sense, given that in March 2025 a change in the influence of dark energy over time was established fairly reliably. There are also quite extravagant hypotheses, such as a global rotation of the Universe. However, none of them provides a comprehensive answer to the main cosmological crisis.
For the astronomical community, the 20th century was an era of fundamental discoveries that turned the worldview upside down. Within a very short time, humanity discovered the theory of relativity, the fact that the Universe is not limited to our Galaxy but contains an incredible number of “star islands” and that outer space has been expanding rapidly for 13.8 billion years, and, in addition, dark matter and dark energy.
Cosmologists of the 21st century have so far mostly been refining the laws and theories established earlier. However, discrepancies between cosmological concepts and observational data are gradually accumulating. The main ones are the Hubble tension and the JWST’s detection of anomalously mature galaxies in the era of the very young Universe.
Does this mean that we are on the verge of a new revolution in cosmology, awaiting its young, brilliant astrophysicist? Or will a delicate refinement of the initial conditions of the Universe’s evolution be enough to eliminate the Hubble tension? Or perhaps a new knight of modern science will do something similar to what Einstein did with Newtonian dynamics a century ago: shatter the established cosmological theories?