Some researchers have suggested that the speed of light could have been much higher in this early universe. Now, one of this theory's originators, Professor João Magueijo from Imperial College London, working with Dr Niayesh Afshordi at the Perimeter Institute in Canada, has made a prediction that could be used to test the theory's validity.
Structures in the universe, for example galaxies, all formed from fluctuations in the early universe – tiny differences in density from one region to another. A record of these early fluctuations is imprinted on the cosmic microwave background – a map of the oldest light in the universe – in the form of a 'spectral index'.
Working with their theory that the fluctuations were influenced by a varying speed of light in the early universe, Professor Magueijo and Dr Afshordi have now used a model to put an exact figure on the spectral index. The predicted figure and the model it is based on are published in the journal Physical Review D.
Cosmologists are currently obtaining ever more precise readings of this figure, so the prediction could soon be tested – either confirming or ruling out the team's model of the early universe. Their figure is a very precise 0.96478. This is close to the current estimate from readings of the cosmic microwave background, which puts it at around 0.968, with some margin of error.
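As a quick consistency check, the predicted value can be compared with the measured one in units of their combined uncertainty. A minimal sketch in Python, assuming an illustrative ±0.006 uncertainty on the CMB estimate (the article only says "some margin of error"; the figure below is an assumption, roughly Planck-era precision):

```python
import math

# Predicted spectral index: 0.96478(64) means 0.96478 +/- 0.00064
n_s_pred, sigma_pred = 0.96478, 0.00064

# CMB estimate quoted in the article; the +/- 0.006 uncertainty is an
# assumption (roughly Planck-era precision), not a figure from the text
n_s_obs, sigma_obs = 0.968, 0.006

# Tension between prediction and measurement, in combined standard deviations
combined_sigma = math.sqrt(sigma_pred**2 + sigma_obs**2)
tension = abs(n_s_pred - n_s_obs) / combined_sigma
print(f"tension = {tension:.2f} sigma")  # well below 1 sigma, i.e. consistent
```

Under this assumed uncertainty the two numbers agree comfortably; a sharper measurement would shrink the combined sigma and make the test decisive.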
RADICAL IDEA
Professor Magueijo said: "The theory, which we first proposed in the late-1990s, has now reached a maturity point – it has produced a testable prediction. If observations in the near future do find this number to be accurate, it could lead to a modification of Einstein's theory of gravity.
"The idea that the speed of light could be variable was radical when first proposed, but with a numerical prediction, it becomes something physicists can actually test. If true, it would mean that the laws of nature were not always the same as they are today."
The testability of the varying speed of light theory sets it apart from the more mainstream rival theory: inflation. Inflation says that the early universe went through an extremely rapid expansion phase, much faster than the current rate of expansion of the universe.
These theories are necessary to overcome what physicists call the 'horizon problem'. The universe as we see it today appears to be everywhere broadly the same, for example it has a relatively homogenous density.
This could only be true if all regions of the universe were able to influence each other. However, if the speed of light has always been the same, then not enough time has passed for light to have travelled to the edge of the universe, and 'even out' the energy.
As an analogy, to heat up a room evenly, the warm air from radiators at either end has to travel across the room and mix fully. The problem for the universe is that the 'room' – the observed size of the universe – appears to be too large for this to have happened in the time since it was formed.
The varying speed of light theory suggests that the speed of light was much higher in the early universe, allowing the distant edges to be connected as the universe expanded. The speed of light would have then dropped in a predictable way as the density of the universe changed. This variability led the team to the prediction published today.
The alternative
theory is inflation, which attempts to solve this problem by saying that the very early universe evened out while incredibly small, and then suddenly expanded, with the uniformity already imprinted on it. While this means the speed of light and the other laws of physics as we know them are preserved, it requires the invention of an 'inflation field' – a set of conditions that only existed at the time.
Read 'Critical geometry of a thermal big bang' by Niayesh Afshordi and João Magueijo, published in Physical Review D, below:
We explore the space of scalar-tensor theories containing two disformally related metrics, and find a discontinuity pointing to a special “critical” cosmological solution. This solution has a simple geometrical interpretation based on the action of a probe 3-brane embedded in an EAdS2 × E3 geometry. Due to the different maximal speeds of propagation for matter and gravity, the cosmological fluctuations start off inside the horizon even without inflation, and will more naturally have a thermal origin (since there is never vacuum domination). The critical model makes an unambiguous, non-tuned prediction for the spectral index of the scalar fluctuations left outside the horizon: ns = 0.96478(64). Adding to this that no gravitational waves are produced, we have unveiled the most predictive model on offer.
We built upon previous work on thermal fluctuations in bimetric scenarios showing that a fast enough phase transition in c_s would lead to fluctuations as close to scale invariance as seen in the data.
See:
https://arxiv.org/pdf/1603.03312v1.pdf
See:
https://phys.org/news/2016-11-theory-einstein-physics.html
It was Victor Flambaum, along with John Webb and colleagues, who first seriously challenged alpha’s status as a constant in 1998. Then, after exhaustively analysing how the light from distant quasars was absorbed by intervening gas clouds, they claimed in 2001 that alpha had increased by a few parts in 10^5 in the past 12 billion years.
But then German researchers studying photons emitted by caesium and hydrogen atoms reported earlier in June that they had seen no change in alpha to within a few parts in 10^15 over the period from 1999 to 2003 (New Scientist, 26 June), though the result does not rule out that alpha was changing billions of years ago.
Throughout the debate, physicists who argued against any change in alpha have had one set of data to fall back on. It comes from the world’s only known natural nuclear reactor, found at Oklo in Gabon, Central Africa.
The Oklo reactor started up nearly two billion years ago when groundwater filtered through crevices in the rocks and mixed with uranium ore to trigger a fission reaction that was sustained for hundreds of thousands of years. Several studies that have analysed the relative concentrations of radioactive isotopes left behind at Oklo have concluded that nuclear reactions then were much the same as they are today, which implies alpha was the same too.
That is because alpha directly influences the ratio of these isotopes. In a nuclear chain reaction like the one that occurred at Oklo, the fission of each uranium-235 nucleus produces neutrons, and nearby nuclei can capture these neutrons.
For example, samarium-149 captures a neutron to become samarium-150, and since the rate of neutron capture depends on the value of alpha, the ratio of the two samarium isotopes in samples collected from Oklo can be used to calculate alpha.
A number of studies done since Oklo was discovered have found no change in alpha over time. “People started quoting the reactor [data] as firm evidence that the constants hadn’t changed,” says Steve Lamoreaux of Los Alamos National Lab (LANL) in Los Alamos, New Mexico.
Energy spectrum
Now, Lamoreaux, along with LANL colleague Justin Torgerson, has re-analysed the Oklo data using what he says are more realistic figures for the energy spectrum of the neutrons present in the reactor. The results have surprised him. Alpha, it seems, has decreased by more than 4.5 parts in 10^8 since Oklo was live (Physical Review D, vol 69, p 121701).
That translates into a very small increase in the speed of light (assuming no change in the other constants that alpha depends on), but Lamoreaux’s new analysis is so precise that he can rule out the possibility of zero change in the speed of light. “It’s pretty exciting,” he says.
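The arithmetic behind that statement is simple: alpha = e^2/(4·pi·eps0·hbar·c), so if the other constants are held fixed (as the article assumes), alpha is proportional to 1/c and a fractional decrease in alpha maps to an equal fractional increase in c. A minimal sketch:

```python
# Reported Oklo result: alpha decreased by 4.5 parts in 1e8 since the
# reactor was live (~2 billion years ago)
dalpha_over_alpha = -4.5e-8

# alpha = e**2 / (4*pi*eps0*hbar*c), so with e, eps0 and hbar held fixed
# (the article's stated assumption), alpha is proportional to 1/c and a
# fractional decrease in alpha is an equal fractional increase in c
dc_over_c = -dalpha_over_alpha

c = 299_792_458.0  # present-day speed of light, m/s
dc = dc_over_c * c
print(f"implied increase in c: {dc:.2f} m/s over ~2 billion years")
```

An increase of roughly 13 m/s out of 3×10^8 m/s: tiny, but non-zero is what makes the claim remarkable.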
So far the re-examination of the Oklo data has not drawn any fire. “The analysis is fine,” says Thibault Damour of the Institute of Advanced Scientific Studies (IHES) in Bures-sur-Yvette in France, who co-authored a 1996 Oklo study that found no change in alpha. Peter Moller of LANL, who, along with Japanese researchers, published a paper in 2000 about the Oklo reactor that also found no change in alpha, says that Lamoreaux’s assumptions are reasonable.
The analysis might be sound, and the assumptions reasonable, but some physicists are reluctant to accept the conclusions. “I can’t see a particular mistake,” says Flambaum. “However, the claim is so revolutionary there should be many independent confirmations.”
While Flambaum’s own team found that alpha was different 12 billion years ago, the new Oklo result claims that alpha was changing as late as two billion years ago. If other methods confirm the Oklo finding, it will leave physicists scrambling for new theories. “It’s like opening a gateway,” says Dmitry Budker, a colleague of Lamoreaux’s at the University of California at Berkeley.
Horizon problem
Some physicists would happily accept a variable alpha. For example, if it had been lower in the past, meaning a higher speed of light, it would solve the “horizon problem”.
Cosmologists have struggled to explain why far-flung regions of the universe are at roughly the same temperature. It implies that these regions were once close enough to exchange energy and even out the temperature, yet current models of the early universe prevent this from happening, unless they assume an ultra-fast expansion right after the big bang.
However, a higher speed of light early in the history of the universe would allow energy to pass between these areas in the form of light.
Variable “constants” would also open the door to theories that used to be off limits, such as those which break the laws of conservation of energy. And it would be a boost to versions of string theory in which extra dimensions change the constants of nature at some places in space-time.
But “there is no accepted varying-alpha theory”, warns Flambaum. Instead, there are competing theories, from those that predict a linear rate of change in alpha, to those that predict rapid oscillations. John Barrow, who has pioneered varying-alpha theories at the University of Cambridge, says that the latest Oklo result does not favour any of the current theories. “You would expect alpha to stop [changing] five to six billion years ago,” he says.
Reaction rate
Before Lamoreaux’s Oklo study can count in favour of any varying alpha theory, there are some issues to be addressed. For one, the exact conditions at Oklo are not known. Nuclear reactions run at different rates depending on the temperature of the reactor, which Lamoreaux assumed was between 227 and 527°C.
Damour says the temperature could vary far more than this. “You need to reconstruct the temperature two billion years ago deep down in the ground,” he says.
Damour also argues that the relative concentrations of samarium isotopes may not be as well determined as Lamoreaux has assumed, which would make it impossible to rule out an unchanging alpha. But Lamoreaux points out that both assumptions about the temperature of the Oklo reactor and the ratio of samarium isotopes were accepted in previous Oklo studies.
Another unknown is whether other physical constants might have varied along with, or instead of, alpha. Samarium-149’s ability to capture a neutron also depends on another constant, alpha(s), which governs the strength of the strong nuclear attraction between the nucleus and the neutron.
And in March, Flambaum claimed that the ratio of different elements left over from just after the big bang suggests that alpha(s) must have been different then compared with its value today (Physical Review D, vol 69, p 063506).
While Lamoreaux has not addressed any possible change in alpha(s) in his Oklo study, he argues that it is important to focus on possible changes in alpha because the Oklo data has become such a benchmark in the debate over whether alpha can vary. “I’ve spent my career going back and checking things that are ‘known’ and it always leads to new ideas,” he says.
See:
https://www.newscientist.com/article/dn6092-speed-of-light-may-have-changed-recently/
The speed of light in a medium is given by v = c/√(ε_r μ_r), where ε_r and μ_r are the relative permittivity and permeability of the medium respectively (these in turn affect how large the electric and magnetic forces in the medium are).
Light slows as it travels through a medium other than vacuum (such as air, glass or water). This is not because of scattering or absorption. Rather, it is because, as an electromagnetic oscillation, light itself causes other electrically charged particles, such as electrons, to oscillate. The oscillating electrons emit their own electromagnetic waves which superpose* with the original light. The resulting “combined” wave has wave packets that pass an observer at a slower rate. The light has effectively been slowed down. When light returns to a vacuum and there are no electrons nearby with which light may superpose, this slowing effect ends and its speed returns to c.
Since the permittivity and permeability of materials vary, it's not surprising that the speed of light varies as well.
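As a numerical sketch of this relation, using v = c/√(ε_r μ_r) with the relative permittivity and permeability (the value for water below is an illustrative optical-frequency figure, not from the text):

```python
import math

C = 299_792_458.0  # speed of light in vacuum, m/s

def speed_in_medium(eps_r: float, mu_r: float = 1.0) -> float:
    """Phase speed v = c / sqrt(eps_r * mu_r) from the relative
    permittivity and permeability of the medium."""
    return C / math.sqrt(eps_r * mu_r)

# Illustrative value (not from the text): at optical frequencies water has
# a relative permittivity of roughly 1.77 and mu_r of essentially 1
v_water = speed_in_medium(1.77)
n_water = C / v_water  # refractive index, n = sqrt(eps_r * mu_r)
print(f"v in water ≈ {v_water:.3e} m/s, n ≈ {n_water:.2f}")  # n ≈ 1.33
```

The recovered refractive index of about 1.33 matches the familiar textbook value for water, which is the point of the exercise.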
* Superpose or superposition: the ability of a quantum system to be in multiple states at the same time until it is measured. Because the concept is difficult to understand, this essential principle of quantum mechanics is often illustrated by an experiment carried out in 1801 by the English physicist Thomas Young. Young's double-slit experiment was intended to prove that light consists of waves. Today, the experiment is used to help people understand the way that electrons can act like waves and create interference patterns.
For this experiment, a beam of light is aimed at a barrier with two vertical slits. The light passes through the slits and the resulting pattern is recorded on a photographic plate. When one slit is covered, the pattern is what would be expected: a single line of light, aligned with whichever slit is open.
Intuitively, one would expect that if both slits are open, the pattern of light will show two lines aligned with the slits. In fact, what happens is that the photographic plate records multiple lines of lightness and darkness in varying degrees.
What is being illustrated by this result is that interference is taking place between the waves going through the slits, in what, seemingly, should be two non-crossing trajectories. Each photon not only goes through both slits; it simultaneously takes every possible trajectory en route to the photographic plate.
In order to see how this might possibly occur, other experiments have focused on tracking the paths of individual photons. Surprisingly, the measurement in some way disrupts the photons' trajectories, and the results of the experiment become what classical physics would predict: two bright lines on the photographic plate, each aligned with one of the slits in the barrier. This has led scientists to conclude that superposition cannot be directly observed; one can only observe its consequence, interference.
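The alternating bright and dark lines follow the standard two-slit intensity formula I(θ) = I₀ cos²(π d sin θ / λ), ignoring the single-slit envelope. A minimal sketch with illustrative numbers (not from the text):

```python
import math

def intensity(theta: float, d: float, lam: float, i0: float = 1.0) -> float:
    """Ideal two-slit interference (single-slit envelope ignored):
    I(theta) = I0 * cos^2(pi * d * sin(theta) / lambda)."""
    return i0 * math.cos(math.pi * d * math.sin(theta) / lam) ** 2

# Illustrative numbers (not from the text): green light, slits 0.1 mm apart,
# screen 1 m away
lam, d, L = 532e-9, 1e-4, 1.0

# Bright fringes sit where d*sin(theta) = m*lam; for small angles the fringes
# on the screen are evenly spaced by lam * L / d
fringe_spacing_mm = lam * L / d * 1e3
print(f"fringe spacing ≈ {fringe_spacing_mm:.2f} mm")

# Maximum brightness at the first bright fringe, darkness halfway between
assert abs(intensity(math.asin(lam / d), d, lam) - 1.0) < 1e-9
assert intensity(math.asin(lam / (2 * d)), d, lam) < 1e-9
```

With these numbers the fringes are a few millimetres apart, which is why the pattern is easy to see on a screen a metre away.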
In computing, the concept of superposition has important implications for the way information will be processed and stored in the future. For example, today's classical computers process information in bits of one or zero, similar to a light switch being turned on or off. The quantum computers of tomorrow, however, will process information as qubits – one, zero or a superposition of the two states.
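The qubit description above can be made concrete. A state a|0⟩ + b|1⟩ must be normalised, |a|² + |b|² = 1, and measuring it yields 0 or 1 with probabilities |a|² and |b|². A minimal classical simulation of those measurement statistics (not of interference itself):

```python
import math
import random

# A qubit state a|0> + b|1> must be normalised: |a|^2 + |b|^2 = 1.
# Measuring it gives 0 with probability |a|^2 and 1 with probability |b|^2.
a = b = 1 / math.sqrt(2)  # equal superposition of the two basis states
assert abs(a**2 + b**2 - 1.0) < 1e-12

# Simulate repeated measurements of freshly prepared copies of the state
random.seed(0)
samples = [0 if random.random() < a**2 else 1 for _ in range(10_000)]
frac_zero = sum(s == 0 for s in samples) / len(samples)
print(f"fraction measured as 0: {frac_zero:.3f}")  # close to 0.5
```

Each individual measurement still returns a definite 0 or 1; the superposition shows up only in the statistics over many trials.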
See:
https://www.techtarget.com/whatis/definition/superposition
* The world-volume theory of a three-brane of type IIB theory in the presence of a configuration of four Dirichlet seven-branes and an orientifold plane is described by an N=2 supersymmetric SU(2) gauge theory with four quark flavors in 3+1 dimensions. The BPS mass formula for N=2 supersymmetric gauge theory arises from masses of open strings stretched between the three-brane and the seven-brane along appropriate geodesics.
See:
https://arxiv.org/pdf/hep-th/9608005.pdf
Read: 'Color Superconductivity in N = 2 Supersymmetric Gauge Theories', by
Masato Arai and Nobuchika Okada,
https://arxiv.org/pdf/hep-th/0512234.pdf
Due to the different maximal speeds of propagation for matter and gravity, the cosmological fluctuations start off inside the horizon even without inflation, and will more naturally have a thermal origin. This critical model makes an unambiguous, non-tuned prediction for the spectral index of the scalar fluctuations: n_S = 0.96478. Considering also that no gravitational waves are produced, we have unveiled the most predictive model on offer. The model has a simple geometrical interpretation as a probe 3-brane* embedded in an EAdS2 × E3 geometry.
Hartmann352