Can a Single False Axiom Kill Physics?

Feb 9, 2023
The texts below imply that, if the speed of light is VARIABLE, modern physics, predicated on Einstein's 1905 constant-speed-of-light axiom, is long dead (exists in a zombie state):

"He opened by explaining how Einstein's theory of relativity is the foundation of every other theory in modern physics and that the assumption that the speed of light is constant is the foundation of that theory. Thus a constant speed of light is embedded in all of modern physics and to propose a varying speed of light (VSL) is worse than swearing! It is like proposing a language without vowels." http://www.thegreatdebate.org.uk/VSLRevPrnt.html

"If there's one thing every schoolboy knows about Einstein and his theory of relativity, it is that the speed of light in vacuum is constant. No matter what the circumstances, light in vacuum travels at the same speed...The speed of light is the very keystone of physics, the seemingly sure foundation upon which every modern cosmological theory is built, the yardstick by which everything in the universe is measured...The constancy of the speed of light has been woven into the very fabric of physics, into the way physics equations are written, even into the notation used. Nowadays, to "vary" the speed of light is not even a swear word: It is simply not present in the vocabulary of physics." https://www.amazon.com/Faster-Than-Speed-Light-Speculation/dp/0738205257

"The whole of physics is predicated on the constancy of the speed of light...So we had to find ways to change the speed of light without wrecking the whole thing too much." https://motherboard.vice.com/en_us/article/8q87gk/light-speed-slowed

The speed of light is OBVIOUSLY variable. A light source emits equidistant pulses and an observer starts moving towards the source:

View: https://youtube.com/watch?v=bg7O4rtlwEE


The speed of the light pulses relative to the stationary observer is

c = df

where d is the distance between subsequent pulses and f is the frequency at the stationary observer. The speed of the pulses relative to the moving observer is

c' = df' > c

where f' > f is the frequency at the moving observer.
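As a quick numerical sketch of the arithmetic being claimed (the pulse spacing d and observer speed v below are assumed, illustrative values): in the classical pulse-counting picture the approaching observer meets pulses at the rate f' = (c + v)/d, so df' = c + v, which is the claim c' = df' > c stated above.

```python
# Toy arithmetic for the pulse-counting argument above (illustrative, assumed values).
c = 299_792_458.0        # speed of the pulses relative to the stationary observer, m/s
v = 30_000.0             # assumed speed of the approaching observer, m/s
d = 1.0                  # assumed distance between successive pulses, m

f = c / d                # frequency at the stationary observer: c = d * f
f_prime = (c + v) / d    # classical pulse-counting rate for the approaching observer

print(f"c  = d * f  = {d * f:.0f} m/s")
print(f"c' = d * f' = {d * f_prime:.0f} m/s  (classically, c + v)")
```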
 
While I do not have the qualifications to prove anything about the speed of light, it seems entirely reasonable to stipulate that all moving objects have a speed limit; otherwise everything could travel at infinite speed.

Hence, like all physical phenomena, light has a speed limit, even in a vacuum.
 
Feb 16, 2023
Consider the Doppler effect when it comes to the speed of light: we have blueshift and redshift, but both still travel at the speed of light, so the statement that light travels at the speed of light in a vacuum holds true.
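A short sketch of that point, using the standard relativistic longitudinal Doppler formula with an assumed observer speed and source frequency: the observed frequency goes up and the observed wavelength goes down by the same factor, so their product, the measured speed of the light, is still c.

```python
import math

# Relativistic longitudinal Doppler shift for an observer approaching the source.
c = 299_792_458.0          # m/s
v = 0.1 * c                # assumed observer speed
beta = v / c

f = 5e14                   # assumed source frequency, Hz (visible light)
lam = c / f                # corresponding wavelength, m

doppler = math.sqrt((1 + beta) / (1 - beta))   # blueshift factor for approach
f_obs = f * doppler        # observed frequency increases...
lam_obs = lam / doppler    # ...observed wavelength decreases by the same factor

print(f_obs * lam_obs)     # equals c: frequency and wavelength shift, the speed does not
```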
 
Feb 9, 2023
Albert Einstein: "If the speed of light depends even in the least on the speed of the light source, then my whole theory of relativity, including the theory of gravitation, is wrong." https://einsteinpapers.press.princeton.edu/vol5-trans/376

The speed of light does depend on the speed of the light source, as posited by Newton's theory and proved by the Michelson-Morley experiment in 1887 (prior to the introduction of the length-contraction fudge factor):

"Emission theory, also called emitter theory or ballistic theory of light, was a competing theory for the special theory of relativity, explaining the results of the Michelson–Morley experiment of 1887...The name most often associated with emission theory is Isaac Newton. In his corpuscular theory Newton visualized light "corpuscles" being thrown off from hot bodies at a nominal speed of c with respect to the emitting object, and obeying the usual laws of Newtonian mechanics, and we then expect light to be moving towards us with a speed that is offset by the speed of the distant emitter (c ± v)." https://en.wikipedia.org/wiki/Emission_theory

Banesh Hoffmann, Einstein's co-author, admits that, originally ("without recourse to contracting lengths, local time, or Lorentz transformations"), the Michelson-Morley experiment was compatible with Newton's variable speed of light, c'=c±v, and incompatible with the constant speed of light, c'=c:

"Moreover, if light consists of particles, as Einstein had suggested in his paper submitted just thirteen weeks before this one, the second principle seems absurd: A stone thrown from a speeding train can do far more damage than one thrown from a train at rest; the speed of the particle is not independent of the motion of the object emitting it. And if we take light to consist of particles and assume that these particles obey Newton's laws, they will conform to Newtonian relativity and thus automatically account for the null result of the Michelson-Morley experiment without recourse to contracting lengths, local time, or Lorentz transformations. Yet, as we have seen, Einstein resisted the temptation to account for the null result in terms of particles of light and simple, familiar Newtonian ideas, and introduced as his second postulate something that was more or less obvious when thought of in terms of waves in an ether." Banesh Hoffmann, Relativity and Its Roots, p.92 https://www.amazon.com/Relativity-Its-Roots-Banesh-Hoffmann/dp/0486406768

[Attached image: Newton's variable speed of light]
 
Some researchers have suggested that the speed of light could have been much higher in the early universe. Now, one of this theory's originators, Professor João Magueijo from Imperial College London, working with Dr Niayesh Afshordi at the Perimeter Institute in Canada, has made a prediction that could be used to test the theory's validity.

Structures in the universe, for example galaxies, all formed from fluctuations in the early universe – tiny differences in density from one region to another. A record of these early fluctuations is imprinted on the cosmic microwave background – a map of the oldest light in the universe – in the form of a 'spectral index'.

Working with their theory that the fluctuations were influenced by a varying speed of light in the early universe, Professor Magueijo and Dr Afshordi have now used a model to put an exact figure on the spectral index. The predicted figure and the model it is based on are published in the journal Physical Review D.

Cosmologists are currently getting ever more precise readings of this figure, so that prediction could soon be tested – either confirming or ruling out the team's model of the early universe. Their figure is a very precise 0.96478. This is close to the current estimate of readings of the cosmic microwave background, which puts it around 0.968, with some margin of error.
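As a rough illustration of how such a comparison is made (the measurement uncertainty below is an assumed placeholder, not a figure from the article): the prediction is judged consistent with the data if the difference is within a few standard deviations of the measurement.

```python
# Rough consistency check between the predicted and measured spectral index.
# The 1-sigma uncertainty here is an assumed placeholder, not from the article.
n_s_predicted = 0.96478
n_s_measured = 0.968
sigma_assumed = 0.006

n_sigma = abs(n_s_measured - n_s_predicted) / sigma_assumed
print(f"Prediction differs from the measurement by {n_sigma:.1f} (assumed) sigma")
```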

RADICAL IDEA
Professor Magueijo said: "The theory, which we first proposed in the late-1990s, has now reached a maturity point – it has produced a testable prediction. If observations in the near future do find this number to be accurate, it could lead to a modification of Einstein's theory of gravity.

"The idea that the speed of light could be variable was radical when first proposed, but with a numerical prediction, it becomes something physicists can actually test. If true, it would mean that the laws of nature were not always the same as they are today."

The testability of the varying speed of light theory sets it apart from the more mainstream rival theory: inflation. Inflation says that the early universe went through an extremely rapid expansion phase, much faster than the current rate of expansion of the universe.

These theories are necessary to overcome what physicists call the 'horizon problem'. The universe as we see it today appears to be everywhere broadly the same, for example it has a relatively homogeneous density.

This could only be true if all regions of the universe were able to influence each other. However, if the speed of light has always been the same, then not enough time has passed for light to have travelled to the edge of the universe, and 'even out' the energy.

As an analogy, to heat up a room evenly, the warm air from radiators at either end has to travel across the room and mix fully. The problem for the universe is that the 'room' – the observed size of the universe – appears to be too large for this to have happened in the time since it was formed.

The varying speed of light theory suggests that the speed of light was much higher in the early universe, allowing the distant edges to be connected as the universe expanded. The speed of light would have then dropped in a predictable way as the density of the universe changed. This variability led the team to the prediction published today.

The alternative theory is inflation, which attempts to solve this problem by saying that the very early universe evened out while incredibly small, and then suddenly expanded, with the uniformity already imprinted on it. While this means the speed of light and the other laws of physics as we know them are preserved, it requires the invention of an 'inflation field' – a set of conditions that only existed at the time.

Read 'Critical geometry of a thermal big bang' by Niayesh Afshordi and João Magueijo, published in Physical Review D, below:

We explore the space of scalar-tensor theories containing two disformally related metrics, and find a discontinuity pointing to a special “critical” cosmological solution. This solution has a simple geometrical interpretation based on the action of a probe 3-brane embedded in an EAdS2 × E3 geometry. Due to the different maximal speeds of propagation for matter and gravity, the cosmological fluctuations start off inside the horizon even without inflation, and will more naturally have a thermal origin (since there is never vacuum domination). The critical model makes an unambiguous, non-tuned prediction for the spectral index of the scalar fluctuations left outside the horizon: ns = 0.96478(64). Adding to this that no gravitational waves are produced, we have unveiled the most predictive model on offer.

"We built upon previous work on thermal fluctuations in bimetric scenarios showing that a fast enough phase transition in c_s would lead to fluctuations as close to scale-invariance as seen in data."

See: https://arxiv.org/pdf/1603.03312v1.pdf

See: https://phys.org/news/2016-11-theory-einstein-physics.html

It was Victor Flambaum, along with John Webb and colleagues, who first seriously challenged alpha's status as a constant in 1998. Then, after exhaustively analysing how the light from distant quasars was absorbed by intervening gas clouds, they claimed in 2001 that alpha had increased by a few parts in 10^5 in the past 12 billion years.

But then German researchers studying photons emitted by caesium and hydrogen atoms reported earlier in June that they had seen no change in alpha to within a few parts in 10^15 over the period from 1999 to 2003 (New Scientist, 26 June), though the result does not rule out that alpha was changing billions of years ago.

Throughout the debate, physicists who argued against any change in alpha have had one set of data to fall back on. It comes from the world’s only known natural nuclear reactor, found at Oklo in Gabon, West Africa.

The Oklo reactor started up nearly two billion years ago when groundwater filtered through crevices in the rocks and mixed with uranium ore to trigger a fission reaction that was sustained for hundreds of thousands of years. Several studies that have analysed the relative concentrations of radioactive isotopes left behind at Oklo have concluded that nuclear reactions then were much the same as they are today, which implies alpha was the same too.

That is because alpha directly influences the ratio of these isotopes. In a nuclear chain reaction like the one that occurred at Oklo, the fission of each uranium-235 nucleus produces neutrons, and nearby nuclei can capture these neutrons.

For example, samarium-149 captures a neutron to become samarium-150, and since the rate of neutron capture depends on the value of alpha, the ratio of the two samarium isotopes in samples collected from Oklo can be used to calculate alpha.
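A deliberately simplified sketch of why that ratio is informative (a toy exposure model with assumed numbers, ignoring decay, further captures and spectrum effects): the Sm-150/Sm-149 ratio left behind depends on the product of the capture cross-section and the neutron fluence, and since that cross-section is sensitive to alpha, the measured ratio constrains alpha.

```python
import math

# Toy exposure model (assumed numbers; ignores Sm-150 capture, decay, spectrum effects).
sigma = 40_000e-24         # assumed effective Sm-149 capture cross-section, cm^2 (~40 kb)
fluence = 2.5e19           # assumed integrated neutron fluence, neutrons/cm^2

surviving_149 = math.exp(-sigma * fluence)   # fraction of Sm-149 left after irradiation
produced_150 = 1.0 - surviving_149           # fraction converted to Sm-150
ratio = produced_150 / surviving_149         # Sm-150 / Sm-149 in the sample

print(f"Sm-150/Sm-149 ratio in this toy model: {ratio:.3f}")

# Inverting the logic: a measured ratio pins down sigma * fluence, and because sigma
# depends on alpha (through a capture resonance), the ratio constrains alpha.
sigma_fluence_inferred = math.log(1.0 + ratio)
print(f"Inferred sigma * fluence: {sigma_fluence_inferred:.3f}")
```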

A number of studies done since Oklo was discovered have found no change in alpha over time. “People started quoting the reactor [data] as firm evidence that the constants hadn’t changed,” says Steve Lamoreaux of Los Alamos National Lab (LANL) in Los Alamos, New Mexico.

Energy spectrum
Now, Lamoreaux, along with LANL colleague Justin Torgerson, has re-analysed the Oklo data using what he says are more realistic figures for the energy spectrum of the neutrons present in the reactor. The results have surprised him. Alpha, it seems, has decreased by more than 4.5 parts in 10^8 since Oklo was live (Physical Review D, vol 69, p 121701).

That translates into a very small increase in the speed of light (assuming no change in the other constants that alpha depends on), but Lamoreaux’s new analysis is so precise that he can rule out the possibility of zero change in the speed of light. “It’s pretty exciting,” he says.
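The translation from a change in alpha to a change in c is simple arithmetic if, as the article assumes, the other constants entering alpha are held fixed: alpha = e^2/(4*pi*eps0*hbar*c), so c scales as 1/alpha and the fractional change in c is (to first order) minus the fractional change in alpha. A one-line sketch using the figure quoted above:

```python
# alpha = e^2 / (4*pi*eps0*hbar*c); holding e, eps0 and hbar fixed, c scales as 1/alpha.
delta_alpha_over_alpha = -4.5e-8           # alpha decreased since Oklo (figure quoted above)
delta_c_over_c = -delta_alpha_over_alpha   # first-order: c correspondingly increased
print(f"Fractional increase in c: {delta_c_over_c:.1e}")
```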

So far the re-examination of the Oklo data has not drawn any fire. “The analysis is fine,” says Thibault Damour of the Institute of Advanced Scientific Studies (IHES) in Bures-sur-Yvette in France, who co-authored a 1996 Oklo study that found no change in alpha. Peter Moller of LANL, who, along with Japanese researchers, published a paper in 2000 about the Oklo reactor that also found no change in alpha, says that Lamoreaux’s assumptions are reasonable.

The analysis might be sound, and the assumptions reasonable, but some physicists are reluctant to accept the conclusions. “I can’t see a particular mistake,” says Flambaum. “However, the claim is so revolutionary there should be many independent confirmations.”

While Flambaum’s own team found that alpha was different 12 billion years ago, the new Oklo result claims that alpha was changing as late as two billion years ago. If other methods confirm the Oklo finding, it will leave physicists scrambling for new theories. “It’s like opening a gateway,” says Dmitry Budker, a colleague of Lamoreaux’s at the University of California at Berkeley.

Horizon problem
Some physicists would happily accept a variable alpha. For example, if it had been lower in the past, meaning a higher speed of light, it would solve the “horizon problem”.

Cosmologists have struggled to explain why far-flung regions of the universe are at roughly the same temperature. It implies that these regions were once close enough to exchange energy and even out the temperature, yet current models of the early universe prevent this from happening, unless they assume an ultra-fast expansion right after the big bang.

However, a higher speed of light early in the history of the universe would allow energy to pass between these areas in the form of light.

Variable “constants” would also open the door to theories that used to be off limits, such as those which break the laws of conservation of energy. And it would be a boost to versions of string theory in which extra dimensions change the constants of nature at some places in space-time.

But “there is no accepted varying-alpha theory”, warns Flambaum. Instead, there are competing theories, from those that predict a linear rate of change in alpha, to those that predict rapid oscillations. John Barrow, who has pioneered varying-alpha theories at the University of Cambridge, says that the latest Oklo result does not favour any of the current theories. “You would expect alpha to stop [changing] five to six billion years ago,” he says.

Reaction rate
Before Lamoreaux’s Oklo study can count in favour of any varying alpha theory, there are some issues to be addressed. For one, the exact conditions at Oklo are not known. Nuclear reactions run at different rates depending on the temperature of the reactor, which Lamoreaux assumed was between 227 and 527°C.

Damour says the temperature could vary far more than this. “You need to reconstruct the temperature two billion years ago deep down in the ground,” he says.

Damour also argues that the relative concentrations of samarium isotopes may not be as well determined as Lamoreaux has assumed, which would make it impossible to rule out an unchanging alpha. But Lamoreaux points out that both assumptions about the temperature of the Oklo reactor and the ratio of samarium isotopes were accepted in previous Oklo studies.

Another unknown is whether other physical constants might have varied along with, or instead of, alpha. Samarium-149’s ability to capture a neutron also depends on another constant, alpha(s), which governs the strength of the strong nuclear attraction between the nucleus and the neutron.

And in March, Flambaum claimed that the ratio of different elements left over from just after the big bang suggests that alpha(s) must have been different then compared with its value today (Physical Review D, vol 69, p 063506).

While Lamoreaux has not addressed any possible change in alpha(s) in his Oklo study, he argues that it is important to focus on possible changes in alpha because the Oklo data has become such a benchmark in the debate over whether alpha can vary. “I’ve spent my career going back and checking things that are ‘known’ and it always leads to new ideas,” he says.

See: https://www.newscientist.com/article/dn6092-speed-of-light-may-have-changed-recently/

The speed of light in a medium is given by v = 1/√(εμ), where ε and μ are the permittivity and permeability of the medium (these in turn affect how large the electric & magnetic forces in the medium are); equivalently, v = c/√(ε_r μ_r) in terms of the relative values.

Light slows as it travels through a medium other than vacuum (such as air, glass or water). This is not because of scattering or absorption. Rather it is because, as an electromagnetic oscillation, light itself causes other electrically charged particles, such as electrons, to oscillate. The oscillating electrons emit their own electromagnetic waves which superpose* with the original light. The resulting "combined" wave has wave packets that pass an observer at a slower rate. The light has effectively been slowed down. When light returns to a vacuum and there are no electrons nearby with which light may superpose, this slowing effect ends and its speed returns to c.

Since the permittivity & permeability of materials vary, it's not surprising that the speed of light varies as well.
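A small sketch of that relation, using illustrative (assumed) relative permittivity and permeability values roughly appropriate at optical frequencies: v = c/√(ε_r μ_r), and the refractive index is n = √(ε_r μ_r).

```python
import math

c = 299_792_458.0   # speed of light in vacuum, m/s

# Illustrative (assumed) relative permittivity and permeability at optical frequencies.
materials = {
    "vacuum": (1.0, 1.0),
    "water":  (1.77, 1.0),   # roughly n^2 for visible light
    "glass":  (2.25, 1.0),   # roughly n = 1.5
}

for name, (eps_r, mu_r) in materials.items():
    n = math.sqrt(eps_r * mu_r)   # refractive index
    v = c / n                     # phase speed in the medium: 1/sqrt(eps*mu)
    print(f"{name:>6}: n = {n:.2f}, v = {v:.3e} m/s")
```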

* Superpose or superposition: here, the adding together of overlapping waves; in quantum mechanics the term also refers to the ability of a quantum system to be in multiple states at the same time until it is measured.

Because the concept is difficult to understand, this essential principle of quantum mechanics is often illustrated by an experiment carried out in 1801 by the English physicist, Thomas Young. Young's double-slit experiment was intended to prove that light consists of waves. Today, the experiment is used to help people understand the way that electrons can act like waves and create interference patterns.

For this experiment, a beam of light is aimed at a barrier with two vertical slits. The light passes through the slits and the resulting pattern is recorded on a photographic plate. When one slit is covered, the pattern is what would be expected: a single line of light, aligned with whichever slit is open.

Intuitively, one would expect that if both slits are open, the pattern of light will reflect two lines of light aligned with the slits. In fact, what happens is that the photographic plate separates into multiple lines of lightness and darkness in varying degrees.

What is being illustrated by this result is that interference is taking place between the waves going through the slits, in what, seemingly, should be two non-crossing trajectories. Each photon not only goes through both slits; it simultaneously takes every possible trajectory en route to the photographic plate.

In order to see how this might possibly occur, other experiments have focused on tracking the paths of individual photons. Surprisingly, the measurement in some way disrupts the photons' trajectories and somehow, the results of the experiment become what would be predicted by classical physics: two bright lines on the photographic plate, each aligned with the slits in the barrier. This has led scientists to conclude that superposition cannot be directly observed; one can only observe the resulting consequence, interference.

In computing, the concept of superposition has important implications for the way information will be processed and stored in the future. For example, today's classical computers process information in bits of one or zero, similar to a light switch being turned on or off. The quantum supercomputers of tomorrow, however, will process information as qubits -- one, zero or a superposition of the two states.

See: https://www.techtarget.com/whatis/definition/superposition
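To make the "one, zero or a superposition of the two" statement concrete, here is a minimal sketch (plain Python, no quantum libraries, with assumed amplitudes) representing a qubit as a normalized pair of complex amplitudes; a simulated measurement yields 0 or 1 with probabilities given by the squared magnitudes.

```python
import math
import random

# A qubit as two complex amplitudes (a, b) with |a|^2 + |b|^2 = 1.
a = 1 / math.sqrt(2)      # amplitude for |0>
b = 1 / math.sqrt(2)      # amplitude for |1>  -> an equal superposition

p0 = abs(a) ** 2          # probability of measuring 0
p1 = abs(b) ** 2          # probability of measuring 1

# Simulated measurement: the superposition yields a definite 0 or 1.
outcome = 0 if random.random() < p0 else 1
print(f"P(0) = {p0:.2f}, P(1) = {p1:.2f}, measured: {outcome}")
```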

* The world-volume theory of a three-brane of type IIB theory in the presence of a configuration of four Dirichlet seven-branes and an orientifold plane is described by an N=2 supersymmetric SU(2) gauge theory with four quark flavors in 3+1 dimensions. The BPS mass formula for N=2 supersymmetric gauge theory arises from masses of open strings stretched between the three-brane and the seven-brane along appropriate geodesics.

See: https://arxiv.org/pdf/hep-th/9608005.pdf

Read: 'Color Superconductivity in N = 2 Supersymmetric Gauge Theories', by
Masato Arai and Nobuchika Okada, https://arxiv.org/pdf/hep-th/0512234.pdf

Due to the different maximal speeds of propagation for matter and gravity, the cosmological fluctuations start off inside the horizon even without inflation, and will more naturally have a thermal origin. This critical model makes an unambiguous, non-tuned prediction for the spectral index of the scalar fluctuations: n_s = 0.96478(64). Considering also that no gravitational waves are produced, we have unveiled the most predictive model on offer. The model has a simple geometrical interpretation as a probe 3-brane* embedded in an EAdS2 × E3 geometry.

Hartmann352
 
Feb 9, 2023
"You want to go back to a notion of space-time that preceded the 20th century, and it wants to ignore the essential lessons about space-time that the 20th century has taught us." Joao Magueijo: "Yes, that's right. So it's nouveau-Newtonian." https://pirsa.org/16060116?t=3211

"Lee [Smolin] and I discussed these paradoxes at great length for many months, starting in January 2001. We would meet in cafés in South Kensington or Holland Park to mull over the problem. THE ROOT OF ALL THE EVIL WAS CLEARLY SPECIAL RELATIVITY. All these paradoxes resulted from well known effects such as length contraction, time dilation, or E=mc^2, all basic predictions of special relativity." Joao Magueijo, Faster Than the Speed of Light, p. 250 http://www.amazon.com/Faster-Than-Speed-Light-Speculation/dp/0738205257

Joao Magueijo, Niayesh Afshordi, Stephon Alexander: "So we have broken fundamentally this Lorentz invariance which equates space and time...It is the other postulate of relativity, that of constancy of c, that has to give way..."
View: https://youtu.be/kbHBBtsrU1g?t=1431


The "root of all the evil" in physics is Einstein's 1905 constant-speed-of-light falsehood. Physicists know that, start telling the truth sometimes, but then stop halfway. Modern physics is predicated on the falsehood and would collapse without it. In this regard, telling the truth is suicidal:

"...Dr. Magueijo said. "We need to drop a postulate, perhaps the constancy of the speed of light." http://www.nytimes.com/2002/12/31/science/e-and-mc2-equality-it-seems-is-relative.html

"He opened by explaining how Einstein's theory of relativity is the foundation of every other theory in modern physics and that the assumption that the speed of light is constant is the foundation of that theory. Thus a constant speed of light is embedded in all of modern physics and to propose a varying speed of light (VSL) is worse than swearing! It is like proposing a language without vowels." http://www.thegreatdebate.org.uk/VSLRevPrnt.html

"The whole of physics is predicated on the constancy of the speed of light," Joao Magueijo, a cosmologist at Imperial College London and pioneer of the theory of variable light speed, told Motherboard. "So we had to find ways to change the speed of light without wrecking the whole thing too much." https://motherboard.vice.com/en_us/article/8q87gk/light-speed-slowed
 
Feb 16, 2023
Inflation theory makes sense when you view our universe as a bubble in the quantum foam that is outside of our universe.
The laws of physics outside of our universe are bound to operate differently than those contained in our little bubble.
So it is plausible that outside forces inflated our universe.