Scientists discovered a new particle by comparing data recorded at the LHC and the Tevatron.

The odd(eron) couple
07/06/21
By Sarah Charley

In 2018, physicist Carlos Avila received a thrilling request from an old colleague.

“It was the type of call that every scientist wants to have,” says Avila, who is a professor at the Universidad de los Andes in Colombia.

The TOTEM experiment at CERN near Geneva, Switzerland, had recently announced evidence for an elusive quasi-particle that had been a missing link in physicists’ understanding of protons. But according to physicist Christophe Royon, “TOTEM data alone was not enough.” To get the complete picture, Royon, of the University of Kansas, wanted to revisit data from the DZero experiment at the Tevatron, a particle accelerator that operated between 1987 and 2011 at the US Department of Energy’s Fermi National Accelerator Laboratory.

“It was very exciting that these old measurements we had published in 2012 were still very important and could still play a role in this ongoing research,” Avila says.

Conducting a joint analysis with two experiments from different generations wasn’t easy. It required rewriting decades-old software and inventing a new way to compare different types of data. In the end, the collaboration led to the discovery of a new particle: the odderon.

The Tevatron and its two experiments—DZero and CDF—rose to fame in 1995 with the discovery of the top quark, the heaviest known fundamental particle.

“It was really a high point,” says DZero co-spokesperson Paul Grannis. “Everybody was walking on air.”

At the time of the top quark discovery, CERN was constructing a new particle accelerator, the Large Hadron Collider, designed to reach energies an order of magnitude greater than the Tevatron’s. As the name suggests, the LHC collides hadrons, a class of subatomic particles; in practice it usually collides protons. The Tevatron also used protons, but collided them with their antimatter equivalents, antiprotons.

The LHC started colliding protons in March 2010. A year and a half later, operators at Fermilab threw a big red switch and reverentially ended operations at the Tevatron. Over the next few years, Grannis watched the DZero collaboration shrink from several hundred scientists to just a handful of active researchers.

“The people move on,” Grannis says. “There is less and less memory of the details of the experiment.”

Avila and Royon were among the physicists that transitioned from DZero at the Tevatron to experiments at the LHC. Before bidding adieu, Avila worked on one last paper that compared DZero’s results with the first data from the LHC’s TOTEM experiment. Even though the energies of the two accelerators were different, many theorists expected DZero and TOTEM’s results to look similar. But they didn’t.

“The DZero paper said that—despite all possible interpretation—they did not have the same pattern as seen at the LHC,” says TOTEM spokesperson Simone Giani. “That paper was the spark that triggered us to see the possibility of working together.”

DZero and TOTEM were both looking at patterns from a type of interaction called elastic scattering, in which fast-moving hadrons meet and exchange particles without breaking apart. Grannis likens it to two hockey players passing a heavy puck.

“If Sam slides a big hockey puck to Flo, Sam is going to recoil when he throws it, and Flo will recoil when she catches it,” he says.

Like the hockey players, the hadrons drift off course after passing the “puck.” Both DZero and TOTEM have specialized detectors a few hundred meters from the interaction points to capture the deflected “Sams” and “Flos.” By measuring their momenta and how much their trajectories changed, physicists can deduce the properties of the puck that passed between them.
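
A rough sketch of the kinematics may help (an illustrative toy in Python, not the experiments’ actual reconstruction code): the key measured quantity is the squared four-momentum transfer |t|, which at such tiny deflection angles is approximately the beam momentum times the angle, squared.

    # Toy estimate of the momentum transfer in elastic scattering.
    # At very small angles, |t| ~ (p * theta)^2, with p the beam
    # momentum (GeV) and theta the outgoing hadron's deflection (rad).
    def momentum_transfer_sq(p_beam_gev, theta_rad):
        return (p_beam_gev * theta_rad) ** 2

    # Example: a 3.5 TeV LHC beam proton deflected by 100 microradians
    print(momentum_transfer_sq(3500.0, 100e-6))  # ~0.12 GeV^2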

In the elastic scattering that DZero and TOTEM study, these subatomic pucks are almost exclusively gluons: force-carrying subatomic particles that live inside hadrons. Because of quantum mechanical conservation laws, the exchanged gluons must always clump with other gluons. Scientists study these gluon-clump exchanges to learn about the structure of matter.

“Every time we turn on a new accelerator, we hope to reach a high enough energy to see the internal workings of protons,” Giani says. “There is this ambition to purely distill the effect of the gluons and not that of the quarks.”

Scattering data had already revealed that gluons can clump in even numbers and move between passing hadrons. But scientists were unsure if this same principle would apply to clumps consisting of an odd number of gluons. Theorists predicted the existence of these odd-numbered clumps, which they called odderons, 50 years ago. But odderons had never been observed experimentally.

When physicists build a new flagship accelerator, they almost always make a major leap in energy. But they also make other changes, such as what kinds of particles to use in the collider. Because of this, comparing scattering data from different generations of accelerators—such as the Tevatron and LHC—has been difficult.

“It has been impossible to disentangle if the scattering discrepancies are because of the intrinsic differences between protons and antiprotons, or because the energy of the accelerator is different every time,” Giani says.

But physicists realized that these discrepancies between the Tevatron and LHC might be a blessing and not a curse. In fact, they thought they could be essential for uncovering the odderon.

The matter or antimatter nature of the colliding hadrons would be unimportant if odderons didn’t exist and all the gluon “pucks” contained an even number of gluons. But the identities of these hadronic “Sams” and “Flos” (and specifically, whether Sam and Flo are both made from matter, or whether one of them is made from antimatter) should influence how easily they can exchange odderons.
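
A deliberately crude toy model (in Python, with invented functional forms; only the sign structure is the physical point) makes the logic concrete: even-gluon exchanges enter the proton-proton and proton-antiproton amplitudes with the same sign, odd-gluon exchanges with opposite signs, so any difference that survives between the two cross sections signals an odd-gluon contribution.

    import math

    def dsigma_dt(t, sign, odd_strength=0.1):
        # sign = +1 for proton-proton, -1 for proton-antiproton:
        # the odderon amplitude flips sign between the two processes.
        even = math.exp(-4.0 * t)                # even-gluon ("pomeron") part
        odd = odd_strength * math.exp(-2.0 * t)  # odd-gluon ("odderon") part
        return (even + sign * odd) ** 2

    for t in (0.5, 0.7, 0.9):
        pp, ppbar = dsigma_dt(t, +1), dsigma_dt(t, -1)
        print(f"|t|={t}: pp={pp:.4f}  ppbar={ppbar:.4f}  diff={pp - ppbar:+.4f}")
    # With odd_strength = 0 the two columns would be identical at every |t|.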

“The cleanest way to observe the odderon would be to look for differences between proton-proton and proton-antiproton interactions,” says Royon. “And what is the only recently available data for proton-antiproton interactions? This is the Tevatron.”

The plan for TOTEM to work with DZero solidified in 2018 over drinks at CERN’s Restaurant 1.

“When we did a rough comparison [between the Tevatron and LHC results] on a piece of paper, we already saw some differences,” Royon says. “This was the starting point.”

A few months later, Avila was remotely logging into his old Fermilab account and trying to access the approximately 20 gigabytes of Tevatron data that he and his colleagues had analyzed years earlier.

“The first time we tried to look at the data, none of the codes that we were using 10 years ago were working,” Avila says. “The software was already obsolete. We had to restore all the software and put it together with newer versions.”

Another big challenge was comparing the Tevatron data with the LHC data and compensating for the different energies of the two accelerators. “That was the tricky part,” Grannis says.

The DZero and TOTEM researchers regularly met over Zoom to check in on their progress and discuss ideas for how they could compare their data in the same energy regime.
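
One simple way to picture such a cross-energy comparison (a minimal sketch with invented numbers, not the collaborations’ published procedure) is to track a characteristic feature of the scattering pattern across the LHC’s collision energies and extrapolate it down to the Tevatron’s 1.96 TeV:

    import numpy as np

    # Stand-in numbers: position (GeV^2) of a characteristic feature of
    # the scattering pattern at several LHC collision energies (TeV).
    # The feature positions here are invented for illustration.
    sqrt_s = np.array([2.76, 7.0, 8.0, 13.0])
    feature_t = np.array([0.61, 0.53, 0.52, 0.47])

    # Fit the feature position against log(energy), then extrapolate
    # to 1.96 TeV, where DZero actually measured.
    slope, intercept = np.polyfit(np.log(sqrt_s), feature_t, 1)
    print(intercept + slope * np.log(1.96))  # predicted position at 1.96 TeV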

“The DZero people were concentrating on extracting the best possible information from DZero data, and the TOTEM people were doing the same for TOTEM,” Royon says. “My job was to unify the two communities.”

If the odderon didn’t exist, then DZero and TOTEM should have seen the same scattering patterns in their data after adjusting for the energy differences between the Tevatron and LHC. But no matter how they processed the data, the scattering patterns remained distinct.

“We did many cross checks,” Royon says. “It took one year to make sure we were correct.”

The discrepancy between the proton-proton and proton-antiproton data showed that these hadrons were passing a new kind of subatomic puck. When combined with the 2018 TOTEM analysis, they had a high enough statistical significance to claim a discovery: They had finally found the odderon.
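
For a flavor of how independent results can be pooled (a generic sketch in Python; the collaborations used their own detailed combination procedure, and the numbers below are invented), significances expressed as z-scores can be combined with Stouffer’s method:

    import math

    def stouffer(z_scores):
        # Combine independent z-scores into a single significance.
        return sum(z_scores) / math.sqrt(len(z_scores))

    # Invented example inputs -- NOT the published values:
    z = stouffer([3.4, 4.2])
    p = 0.5 * math.erfc(z / math.sqrt(2))  # one-sided p-value
    print(f"combined: {z:.2f} sigma (p = {p:.2e})")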

An international team of scientists worked on the research. The US contribution was funded by the US Department of Energy and the National Science Foundation. “This is definitely the result of hard work from hundreds of people originating from everywhere in the world,” Royon says.

For Avila, the discovery was just one of the many bonuses associated with teaming up with his old DZero colleagues on this new project. “You build strong friendships while doing research,” he says. “Even if you don’t stay in touch closely, you know these people and you know that working with them is really exciting.”

Avila also says this discovery shows the value of keeping the legacy of older experiments alive.

“We shouldn’t forget about this old data,” Avila says. “It can still bring new details about how nature behaves. It has a good scientific value no matter how many years have passed.”

* The strong force, or strong interaction, is one of the four known fundamental interactions, with the others being electromagnetism, the weak interaction, and gravitation. At the range, or distance, of 10⁻¹⁵ m, the strong force is approximately 137 times as strong as electromagnetism, 10⁶ times as strong as the weak interaction, and 10³⁸ times as strong as gravitation.
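
That factor of 137 follows from the couplings themselves: at such distances the strong coupling is of order one, while the electromagnetic fine-structure constant is roughly 1/137, so

    \frac{\alpha_s}{\alpha} \approx \frac{1}{1/137} = 137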

A force which can hold a nucleus together against the enormous forces of repulsion of the protons is strong indeed. However, it is not an inverse square force like the electromagnetic force and it has a very short range. Yukawa modeled the strong force as an exchange force in which the exchange particles are pions and other heavier particles. The range of a particle exchange force is limited by the uncertainty principle. It is the strongest of the four fundamental forces.

Since the protons and neutrons which make up the nucleus are themselves considered to be made up of quarks, and the quarks are considered to be held together by the color force, the strong force between nucleons may be considered to be a residual color force. In the standard model, therefore, the basic exchange particle is the gluon which mediates the forces between quarks. Since the individual gluons and quarks are contained within the proton or neutron, the masses attributed to them cannot be used in the range relationship to predict the range of the force. When something is viewed as emerging from a proton or neutron, then it must be at least a quark-antiquark pair, so it is then plausible that the pion as the lightest meson should serve as a predictor of the maximum range of the strong force between nucleons.
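
The range argument can be made quantitative with the uncertainty principle: a virtual pion of mass m_π can exist only for a time of order ħ/(m_π c²), and even moving at nearly the speed of light it covers at most

    R \approx \frac{\hbar}{m_\pi c} = \frac{\hbar c}{m_\pi c^2} \approx \frac{197\,\mathrm{MeV\,fm}}{140\,\mathrm{MeV}} \approx 1.4\,\mathrm{fm}

which matches the observed reach of the nuclear force between nucleons.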


The sketch, also known as a Feynman diagram**, is an attempt to show one of the many forms the gluon interaction between nucleons could take; this one involves up-antiup pair production and annihilation, producing a π⁻ bridging the nucleons.

See: https://aether.lbl.gov/elements/stellar/strong/strong.html

See: https://energyeducation.ca/encyclopedia/Strong_nuclear_force

** A Feynman diagram is a graphical method of representing the interactions of elementary particles, invented in the 1940s and ’50s by the American theoretical physicist Richard P. Feynman. Introduced during the development of the theory of quantum electrodynamics*** as an aid for visualizing and calculating the effects of electromagnetic interactions among electrons and photons, Feynman diagrams, because of their graphic simplicity, are now used to depict all types of particle interactions.

A Feynman diagram is a two-dimensional representation in which one axis, usually the horizontal axis, is chosen to represent space, while the second (vertical) axis represents time. Straight lines are used to depict fermions—fundamental particles with half-integer values of intrinsic angular momentum (spin), such as electrons (e−)—and wavy lines are used for bosons—particles with integer values of spin, such as photons (γ). On a conceptual level, fermions may be regarded as “matter” particles, which experience the effect of a force arising from the exchange of bosons, the so-called “force-carrier,” or field, particles.

At the quantum level the interactions of fermions occur through the emission and absorption of the field particles associated with the fundamental interactions, or forces, of matter, in particular the electromagnetic force, the strong force, and the weak force. The basic interaction therefore appears on a Feynman diagram as a "vertex"—i.e., a junction of three lines. In this way the path of an electron, for example, appears as two straight lines connected to a third, wavy, line where the electron emits or absorbs a photon.
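
In rough plain-text form (time running left to right, the straight lines an electron, the wavy line a photon), such a vertex can be sketched as:

    e- ----------*~~~~~~~~ γ
                  \
                   \
                    e-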

Feynman diagrams are used by physicists to make very precise calculations of the probability of any given process, such as electron-electron scattering, in quantum electrodynamics. The calculations must include terms equivalent to all the lines (representing propagating particles) and all the vertices (representing interactions) shown in the diagram. In addition, since a given process can be represented by many possible Feynman diagrams, the contributions of every possible diagram must be entered into the calculation of the total probability that a particular process will occur. Comparison of the results of these calculations with experimental measurements has revealed an extraordinary level of accuracy, with agreement to nine significant digits in some cases.

See: https://www.britannica.com/science/Feynman-diagram

*** Quantum electrodynamics (QED) is a field of physics that studies the interaction of electromagnetic radiation with electrically charged matter within the framework of relativity and quantum mechanics. More plainly put, it is a relativistic quantum field theory of electromagnetism. It basically describes how light and matter interact. More specifically it deals with the interactions between electrons, positrons and photons.

It is the fundamental theory underlying all disciplines of science concerned with electromagnetism, such as atomic physics, chemistry, biology, the theory of bulk matter, and electromagnetic radiation.

It has been called "the jewel of physics" for its extremely accurate predictions of quantities like the anomalous magnetic moment of the electron, and the Lamb shift of the energy levels of hydrogen.
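
The classic example of that accuracy is the electron’s anomalous magnetic moment: the leading QED correction, computed by Schwinger from a single one-loop diagram, is

    a_e = \frac{g-2}{2} = \frac{\alpha}{2\pi} \approx 0.00116

and adding the higher-order diagrams brings theory and experiment into agreement to many more digits.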

It is the first physical theory ever developed that has no obvious intrinsic limitation and describes physical quantities from first principles. Nature accommodates forces other than the electromagnetic force, such as those responsible for the radioactive disintegration of heavy nuclei (called the weak force) and the force that binds the nucleus together (called the strong force). A theory called the standard model has been developed that unifies the three forces and accounts for all experimental data from very low to extremely high energies. This does not mean, however, that quantum electrodynamics fails at high energies. It simply means that the real world has forces other than electromagnetism.

The word 'quantum' is Latin, meaning "how much" (neut. sing. of quantus, "how great"). The word 'electrodynamics' was coined by André-Marie Ampère in 1822. The word 'quantum', as used in physics, i.e. with reference to the notion of count, was first used by Max Planck in 1900 and reinforced by Einstein in 1905 with his use of the term light quanta.

Quantum theory began in 1900, when Max Planck assumed that energy is quantized in order to derive a formula predicting the observed frequency dependence of the energy emitted by a black body. This dependence is completely at variance with classical physics. In 1905, Einstein explained the photoelectric effect by postulating that light energy comes in quanta later called photons. In 1913, Bohr invoked quantization in his proposed explanation of the spectral lines of the hydrogen atom. In 1924, Louis de Broglie proposed a quantum theory of the wave-like nature of subatomic particles. The phrase "quantum physics" was first employed in Johnston's Planck's Universe in Light of Modern Physics. These theories, while they fit the experimental facts to some extent, were strictly phenomenological: they provided no rigorous justification for the quantization they employed.
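
The quantization at the heart of these early steps fits in one line: radiation of frequency ν is emitted and absorbed in packets of energy

    E = h\nu

which is exactly the relation Einstein applied to the photoelectric effect, where the ejected electron’s maximum kinetic energy is E_k = hν − φ, with φ the work function of the metal.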

Modern quantum mechanics was born in 1925 with Werner Heisenberg's matrix mechanics and Erwin Schrödinger's wave mechanics and the Schrödinger equation, which was a non-relativistic generalization of de Broglie's (1925) relativistic approach. Schrödinger subsequently showed that these two approaches were equivalent. In 1927, Heisenberg formulated his uncertainty principle, and the Copenhagen interpretation of quantum mechanics began to take shape. Around this time, Paul Dirac, in work culminating in his 1930 monograph, finally joined quantum mechanics and special relativity, pioneered the use of operator theory, and devised the bra-ket notation widely used since. In 1932, John von Neumann formulated the rigorous mathematical basis for quantum mechanics as the theory of linear operators on Hilbert spaces. This and other work from the founding period remains valid and widely used.

Quantum chemistry began with Walter Heitler and Fritz London's 1927 quantum account of the covalent bond of the hydrogen molecule. Linus Pauling and others contributed to the subsequent development of quantum chemistry.

The application of quantum mechanics to fields rather than single particles, resulting in what are known as quantum field theories, began in 1927. Early contributors included Dirac, Wolfgang Pauli, Weisskopf, and Jordan. This line of research culminated in the 1940s in the quantum electrodynamics (QED) of Richard Feynman, Freeman Dyson, Julian Schwinger, and Sin-Itiro Tomonaga, for which Feynman, Schwinger and Tomonaga received the 1965 Nobel Prize in Physics. QED, a quantum theory of electrons, positrons, and the electromagnetic field, was the first satisfactory quantum description of a physical field and of the creation and annihilation of quantum particles.

QED involves a covariant and gauge invariant prescription for the calculation of observable quantities. Feynman's mathematical technique, based on his diagrams, initially seemed very different from the field-theoretic, operator-based approach of Schwinger and Tomonaga, but Freeman Dyson later showed that the two approaches were equivalent. The renormalization procedure for eliminating the awkward infinite predictions of quantum field theory was first implemented in QED. Even though renormalization works very well in practice, Feynman was never entirely comfortable with its mathematical validity, even referring to renormalization as a "shell game" and "hocus pocus". (Feynman, 1985: 128)

QED has served as the model and template for all subsequent quantum field theories. One such subsequent theory is quantum chromodynamics, which began in the early 1960s and attained its present form in the 1975 work by H. David Politzer, Sidney Coleman, David Gross and Frank Wilczek. Building on the pioneering work of Schwinger, Peter Higgs, Goldstone, and others, Sheldon Glashow, Steven Weinberg and Abdus Salam independently showed how the weak nuclear force and quantum electrodynamics could be merged into a single electroweak force.

See: https://www.tokenrock.com/explain-quantum-electrodynamics-38.html

I am happily amazed that the old DZero data was combined with the 2018 TOTEM data to find odd-numbered clumps of gluons, the so-called messenger particles of the strong nuclear force*, which binds the subatomic particles known as quarks within the protons and neutrons of stable matter as well as within heavier, short-lived particles created at high energies. Quarks interact by emitting and absorbing gluons, just as electrically charged particles interact through the emission and absorption of photons.
Hartmann352
 
