How do you make the world’s most powerful neutrino beam?

The design of the experiment is elegant — produce neutrinos and measure them at Fermilab, send them straight through 1,300 kilometers of earth, then measure them again in giant liquid-argon detectors at Sanford Lab. Credit: Fermilab

NOVEMBER 14, 2019

by Lauren Biron, Fermi National Accelerator Laboratory

What do you need to make the most intense beam of neutrinos in the world? Just a few magnets and some pencil lead. But not your usual household stuff. After all, this is the world’s most intense high-energy neutrino beam, so we’re talking about jumbo-sized parts: magnets the size of park benches and ultrapure rods of graphite as tall as Danny DeVito.

Physics experiments that push the boundaries of human knowledge tend to work at the extremes: the biggest and smallest scales, the highest intensities. All three are true for the international Deep Underground Neutrino Experiment, hosted by the Department of Energy’s Fermilab. The experiment brings together more than 1,000 people from 30-plus countries to tackle questions that have kept many a person awake at night: Why is the universe full of matter and not antimatter, or no matter at all? Do protons, one of the building blocks of atoms (and of us), ever decay? How do black holes form? And did I leave the stove on?

Maybe not the last one.

To tackle the biggest questions, DUNE will look at mysterious subatomic particles called neutrinos: neutral, wispy wraiths that rarely interact with matter. Because neutrinos are so antisocial, scientists will build enormous particle detectors to catch and study them. More matter inside the DUNE detectors means more things for neutrinos to interact with, and these behemoth neutrino traps will contain a total of 70,000 tons of liquid argon. At their home 1.5 kilometers below the rock in the Sanford Underground Research Facility in South Dakota, they’ll be shielded from interfering cosmic rays—though neutrinos will have no trouble passing through that buffer and hitting their mark. The detectors can pick up neutrinos from exploding stars that might evolve into black holes and capture interactions from a deliberately aimed beam of neutrinos.

Neutrinos (and their antimatter counterparts, antineutrinos) are born as other particles decay, carrying away small amounts of energy to balance the cosmic ledger. You’ll find them coming in droves from stars like our sun, inside Earth, even the potassium in bananas. But if you want to make trillions of high-energy neutrinos every second and send them to a particle detector deep underground, you’d be hard-pressed to do it by throwing fruit toward South Dakota.

That’s where Fermilab’s particle accelerator complex comes in.

Fermilab sends particles through a series of accelerators, each adding a burst of speed and energy. Work has started for an upgrade to the complex that will include a new linear accelerator at the start of the journey: PIP-II. This is the first accelerator project in the United States with major international contributions, and it will propel particles to 84% of the speed of light as they travel about the length of two football fields. Particles then enter the Booster Ring for another … well, boost, and finally head to the Main Injector, Fermilab’s most powerful accelerator.
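As a sanity check on that 84% figure, here is a back-of-envelope special-relativity calculation using only the proton rest energy (no official PIP-II parameters are assumed):

```python
import math

# Back-of-envelope check: what kinetic energy does a proton
# moving at 84% of the speed of light carry?
beta = 0.84                  # speed as a fraction of c
m_p = 938.272                # proton rest energy, MeV

gamma = 1.0 / math.sqrt(1.0 - beta ** 2)   # Lorentz factor
kinetic_energy = (gamma - 1.0) * m_p       # relativistic kinetic energy, MeV

print(f"Lorentz factor: {gamma:.3f}")
print(f"kinetic energy: {kinetic_energy:.0f} MeV")
```

The result lands near 790 MeV, consistent with the roughly 800-MeV design energy commonly quoted for the PIP-II linac.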

The twist? Fermilab’s particle accelerators propel protons—useful particles, but not the ones that neutrino scientists want to study.

So how do researchers plan to turn Fermilab’s first megawatt beam of protons into the trillions of high-energy neutrinos they need for DUNE every second? This calls for some extra infrastructure: The Long-Baseline Neutrino Facility, or LBNF. A long baseline means that LBNF will send its neutrinos a long distance—1,300 kilometers, from Fermilab to Sanford Lab—and the neutrino facility means … let’s make some neutrinos.

The LBNF beamline will use a one-megawatt capable focusing horn to direct the charged particles that become neutrinos. Credit: Reidar Hahn, Fermilab

Step 1: Grab some protons

The first step is to siphon off particles from the Main Injector—otherwise, the circular accelerator will act more like a merry-go-round. Engineers will need to build and connect a new beamline. That’s no easy feat, considering all the utilities, other beamlines, and Main Injector magnets around.

“It’s in one of the most congested areas of the Fermilab accelerator complex,” said Elaine McCluskey, the LBNF project manager at Fermilab. Site prep work starting at Fermilab in 2019 will move some of the utilities out of the way. Later, when it’s time for the LBNF beamline construction, the accelerator complex will temporarily power down.

Crews will move some of the Main Injector magnets safely out of the way and punch into the accelerator’s enclosure. They’ll construct a new extraction area and beam enclosure, then reinstall the Main Injector magnets with a new Fermilab-built addition: kicker magnets to change the beam’s course. They’ll also build the new LBNF beamline itself, using 24 dipole and 17 quadrupole magnets, most of them built by the Bhabha Atomic Research Center in India.

Step 2: Aim

Neutrinos are tricky particles. Because they are neutral, they can’t be steered by magnetic forces in the same way that charged particles (such as protons) are. Once a neutrino is born, it keeps heading in whatever direction it was going, like a kid riding the world’s longest Slip ’N Slide. This property makes neutrinos great cosmic messengers but means an extra step for Earth-bound engineers: aiming.

As they build the LBNF beamline, crews will drape it along the curve of an 18-meter-tall hill. When the protons descend the hill, they’ll be pointed toward the DUNE detectors in South Dakota. Once the neutrinos are born, they’ll continue in that same direction, no tunnel required.
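The size of that downward tilt follows from simple geometry. A minimal sketch, assuming a spherical Earth of mean radius 6,371 km and treating the 1,300 km as the straight-line chord:

```python
import math

# Geometry of a straight chord between two points on a sphere:
# the beam must dip below the local horizontal by half the central angle.
R = 6371.0       # mean Earth radius, km
chord = 1300.0   # straight-line distance, Fermilab to Sanford Lab, km

half_angle = math.asin(chord / (2.0 * R))        # radians
dip_deg = math.degrees(half_angle)               # downward aim at Fermilab
max_depth = R * (1.0 - math.cos(half_angle))     # deepest point of the path, km

print(f"beam tilts down by about {dip_deg:.1f} degrees")
print(f"beam bottoms out roughly {max_depth:.0f} km underground")
```

The beam dips by about six degrees and, at the midpoint of its journey, passes some 30 km beneath the surface, far deeper than any tunnel, which is why no tunnel is needed.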

With all the magnets in place and everything sealed up tight, accelerator operators will be able to direct protons down the new beamline, like switching a train on a track. But instead of pulling into a station, the particles will run full speed into a target.

DUNE’s far detector will use four modules to capture interactions between argon atoms and the neutrinos sent from the LBNF beamline at Fermilab. Credit: Fermilab

Step 3: Smash things

The target is a crucial piece of engineering. Its design is still being finalized, but it’s likely to be a 1.5-meter-long rod of pure graphite—think of your pencil lead on steroids.

Together with some other equipment, it will sit inside the target hall, a sealed room filled with gaseous nitrogen. DUNE will start up with a proton beam that will run at more than 1 megawatt of power, and there are already plans to upgrade the beam to 2.4 megawatts. Almost everything being built for LBNF is designed to withstand that higher beam intensity.

Because of the record-breaking beam power, manipulating anything inside the sealed hall will likely require the help of some robot friends controlled from outside the thick walls. Engineers at KEK, the high-energy accelerator research organization in Japan, are working on prototypes for elements of the sealed LBNF target hall design.

The high-power beam of protons will enter the target hall and smash into the graphite like bowling balls hitting pins, depositing their energy and unleashing a spray of new particles—mostly pions and kaons.

“These targets have a very hard life,” said Chris Densham, group leader for high-power targets at STFC’s Rutherford Appleton Laboratory in the UK, which is responsible for the design and production of the target for the one-megawatt beam. “Each proton pulse causes the temperature to jump up by a few hundred degrees in a few microseconds.”

The LBNF target will operate around 500 degrees Celsius in a sort of Goldilocks scenario. Graphite performs well when it’s hot, but not too hot, so engineers will need to remove excess heat. But they can’t let it get too cool, either. Water, which is used in some current target designs, would provide too much cooling, so specialists at RAL are also developing a new method. The current proposed design circulates gaseous helium, which will be moving about 720 kilometers per hour—the speed of a cruising airliner—by the time it exits the system.

Step 4: Focus the debris

As protons strike the target and produce pions and kaons, devices called focusing horns take over. The pions and kaons are electrically charged, and these giant magnets direct the spray back into a focused beam. A series of three horns that will be designed and built at Fermilab will correct the particle paths and aim them at the detectors at Sanford Lab.
Credit: Fermi National Accelerator Laboratory

For the design to work, the target—a cylindrical tube—must sit inside the first horn, cantilevered into place from the upstream side. This causes some interesting engineering challenges. It boils down to a balance between what physicists want—a lengthier target that can stay in service for longer—and what engineers can build. The target is only a couple of centimeters in diameter, and every extra centimeter of length makes it more likely to droop under the barrage of protons and the pull of Earth’s gravity.

Much like a game of Operation, physicists don’t want the target to touch the sides of the horn.

To create the focusing field, the metallic horns receive a 300,000-amp electromagnetic pulse about once per second—delivering more charge than a powerful lightning bolt. If you were standing next to it, you’d want to stick your fingers in your ears to block out the noise—and you certainly wouldn’t want anything touching the horns, including graphite. Engineers could support the target from both ends, but that would make the inevitable removal and replacement much more complicated.
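For a feel for the field strengths involved, Ampère’s law gives the toroidal field between a horn’s inner and outer conductors; the 2 cm radius below is purely illustrative, not a design parameter:

```python
import math

# Rough scale of the focusing field: for a coaxial horn carrying
# current I along its inner conductor, Ampere's law gives a toroidal
# field B = mu0 * I / (2 * pi * r) between the conductors.
mu0 = 4.0 * math.pi * 1e-7   # vacuum permeability, T*m/A
current = 300_000.0          # horn pulse current, A
r = 0.02                     # distance from the horn axis, m (illustrative)

B = mu0 * current / (2.0 * math.pi * r)
print(f"field at r = {r * 100:.0f} cm: {B:.1f} tesla")
```

A few centimeters from the conductor the pulsed field reaches several tesla, comparable to the steady field inside an MRI magnet, but switched on and off once a second.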

“The simpler you can make it, the better,” Densham said. “There’s always a temptation to make something clever and complicated, but we want to make it as dumb as possible, so there’s less to go wrong.”

Step 5: Physics happens

Focused into a beam, the pions and kaons exit the target hall and travel through a 200-meter-long tunnel full of helium. As they do, they decay, giving birth to neutrinos and some particle friends. Researchers can also switch the horns to focus particles with the opposite charge, which will then decay into antineutrinos. Shielding at the end of the tunnel absorbs the extra particles, while the neutrinos or antineutrinos sail on, unperturbed, straight through dirt and rock, toward their South Dakota destiny.

“LBNF is a complex project, with a lot of pieces that have to work together,” said Jonathan Lewis, the LBNF Beamline project manager. “It’s the future of the lab, the future of the field in the United States, and an exciting and challenging project. The prospect of uncovering the properties of neutrinos is exciting science.”

DUNE scientists will examine the neutrino beam at Fermilab just after its production using a sophisticated particle detector on site, placed right in the path of the beam. Most neutrinos will pass straight through the detector, like they do with all matter. But a small fraction will collide with atoms inside the DUNE near-site detector, providing valuable information on the composition of the neutrino beam as well as high-energy neutrino interactions with matter.

Then it’s time to wave farewell to the other neutrinos. Be quick—their 1,300-kilometer journey at close to the speed of light will take four milliseconds, not even close to how long it takes to blink your eye. But for DUNE scientists, the work will be only beginning.
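The quoted travel time is easy to verify, since the neutrinos move at essentially the speed of light:

```python
# Quick check: neutrinos cover the 1,300 km at essentially light speed.
c = 299_792.458    # speed of light, km/s
distance = 1300.0  # km

t_ms = distance / c * 1000.0
print(f"travel time: {t_ms:.2f} ms")   # a blink takes on the order of 100 ms
```
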

Could the mysteries of antimatter and dark matter be linked?

Credit: CC0 Public Domain

NOVEMBER 13, 2019

by RIKEN

Could the profound mysteries of antimatter and dark matter be linked? Thinking that they might be, scientists from the international BASE collaboration, led by Stefan Ulmer of the RIKEN Cluster for Pioneering Research, have performed the first laboratory experiments to determine whether a slightly different way in which matter and antimatter interact with dark matter might be a key to solving both mysteries.

Dark matter and antimatter are both vexing problems for physicists trying to understand how our world works at a fundamental level.

The problem with antimatter is that though the Big Bang should have created equal amounts of matter and antimatter, the observable universe is made only of matter. Antimatter is created every day in experiments and by natural processes such as lightning, but it is quickly annihilated in collisions with regular matter. Predictions show that our understanding of the matter content of the universe is off by nine orders of magnitude, and no one knows why the asymmetry exists.

In the case of dark matter, it is known from astronomical observations that some unknown mass is influencing the orbits of stars in galaxies, but the exact microscopic properties of these particles remain unknown. One theory is that they are a type of hypothetical particle known as an axion, which has an important role in explaining the lack of symmetry violation in the strong interaction in the standard model of particle physics.

The BASE group collaborators wondered whether the lack of antimatter might be because it interacts differently with dark matter, and set out to test this. For the experiment, they used a specially designed device, called a Penning trap, to magnetically trap a single antiproton, preventing it from contacting ordinary matter and being annihilated. They then measured a property of the antiproton called its spin precession frequency. Normally, this should be constant in a given magnetic field, and a modulation of this frequency could be accounted for by an effect mediated by axion-like particles, which are hypothesized dark matter candidates.
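The modulation search can be illustrated with a toy analysis, far simpler than BASE’s actual statistical treatment and run here on simulated noise rather than real data: project a series of frequency measurements onto sine and cosine templates at one trial frequency and read off the best-fit modulation amplitude.

```python
import math
import random

random.seed(1)

# Toy illustration, not BASE's actual analysis: search simulated
# frequency residuals (pure noise here) for a sinusoidal modulation
# at one trial frequency via least-squares projection.
times = [i * 0.5 for i in range(200)]              # measurement times, days
data = [random.gauss(0.0, 1.0) for _ in times]     # residuals, arbitrary units
f_trial = 0.1                                      # trial frequency, cycles/day

s = [math.sin(2 * math.pi * f_trial * t) for t in times]
c = [math.cos(2 * math.pi * f_trial * t) for t in times]
a = sum(d * x for d, x in zip(data, s)) / sum(x * x for x in s)
b = sum(d * x for d, x in zip(data, c)) / sum(x * x for x in c)
amplitude = math.hypot(a, b)   # best-fit modulation amplitude

print(f"fitted modulation amplitude: {amplitude:.3f} (arbitrary units)")
```

For pure noise the fitted amplitude stays small; a genuine axion-like coupling would show up as an amplitude well above that noise floor at some frequency.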

First author of the study, Christian Smorra, says, “For the first time, we have explicitly searched for an interaction between dark matter and antimatter, and though we did not find a difference, we set a new upper limit for the potential interaction between dark matter and antimatter.”

Looking to the future, Stefan Ulmer of the RIKEN Cluster for Pioneering Research, who is spokesperson for the BASE Collaboration, says, “From now on, we plan to further improve the accuracy of our measurements of the spin precession frequency of the antiproton, allowing us to set more stringent constraints on the fundamental invariance of charge, parity and time, and to make the search for dark matter even more sensitive.”

The work was published in Nature.

What survives, thrives and dominates over a thousand generations? The answer might be even more complex than thought

Credit: CC0 Public Domain

NOVEMBER 13, 2019

by Harvard University

A team of scientists, led by Harvard researchers, has used a new method of DNA “re-barcoding” to track rapid evolution in yeast. The new approach, published in Nature, advances the field of organismic and evolutionary biology and holds promise for real-world results.

The potential impact of the work can be illustrated using the example of flu vaccines. An accurate prediction of what strains of influenza will dominate over the next year is necessary to ensure the vaccines produced are useful. Such prediction relies on tracking evolution.

“We have the sequence of all these flu strains, and we’re watching their evolution. What you should be able to do is look at how they’ve evolved in the past and be able to predict into the future what is going to win and what is going to lose. The problem is, we don’t know how to do that prediction,” explained Michael Desai, Professor of Organismic and Evolutionary Biology (OEB) and of Physics at Harvard.

Desai, in whose lab the study was conducted, said that the questions are basic: “There is this swarm of mutations that are constantly happening,” he said. “How do they battle it out, and what determines who wins?”

“We have been taught that evolution ‘is slow’ and involves the ‘survival of the fittest,’” added Alex N. Nguyen Ba, a post-doctoral fellow in Desai’s lab. “It turns out that molecular evolution doesn’t work that way. It’s actually much faster than how we’ve been taught. This makes evolution way more complex than what has been anticipated.” Nguyen Ba is one of three co-lead authors of the new study, along with Ivana Cvijović and José I. Rojas Echenique.

Such evolution has been posited mathematically over the past two decades. However, previous lab experiments have not been able to prove or disprove the theory. Rather, they have only been able to examine the process with high resolution over a short period of time, or with low resolution over a long period of time. Collectively, Desai explained, the paper’s authors—who include Katherine R. Lawrence of MIT and Harvard’s Artur Rego-Costa, along with Xianan Liu of Stanford and Sasha F. Levy of SLAC National Accelerator Laboratory—have done both of those other kinds of studies.

This new study does both.

“We can identify every single relevant beneficial mutation,” said Nguyen Ba, citing new technology that allowed the research team to follow specific genomes (or lineages) for approximately a thousand generations.

Cvijović, formerly a graduate student in Desai’s lab and now a researcher at Princeton, said the research could have gone on indefinitely: “A thousand generations is about three months of growth in our conditions. That’s enough time to see big changes happening.”

Such in-depth, long-term research was possible because of a technological advance in the methodology that allowed what Nguyen Ba called the “re-barcoding” of DNA.

Using an enzyme to place a marker, the “barcode,” at a specific DNA site, the researchers were able to follow the DNA of yeast through multiple generations. By re-tagging and re-barcoding subsequent generations to record their lineage, the team could then observe how this DNA was transmitted, noting what survived, and what thrived—or came to dominate—as generations passed.
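The bookkeeping behind barcode tracking can be sketched with a toy simulation (not the paper’s actual method): each lineage carries a barcode and a fitness, and each generation’s offspring are drawn in proportion to current frequency times fitness.

```python
import random

random.seed(0)

# Toy sketch, not the paper's method: 100 barcoded lineages with
# slightly different fitnesses compete in a fixed-size population.
# Each generation, offspring are drawn with probability proportional
# to (current count) * (fitness), i.e. a simple Wright-Fisher model.
N = 10_000
fitness = {f"bc{i:03d}": 1.0 + random.gauss(0.0, 0.02) for i in range(100)}
counts = {bc: N // len(fitness) for bc in fitness}

for _ in range(200):
    barcodes = list(fitness)
    weights = [counts[bc] * fitness[bc] for bc in barcodes]
    drawn = random.choices(barcodes, weights=weights, k=N)
    counts = dict.fromkeys(barcodes, 0)
    for bc in drawn:
        counts[bc] += 1

dominant = max(counts, key=counts.get)
print(dominant, counts[dominant] / N)
```

Even with fitness differences of only a couple of percent, one barcode typically comes to dominate within a few hundred generations, while random drift lets less-fit lineages surge temporarily along the way.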

What they discovered included a few surprises.

According to the existing theory, the “fittest” DNA would be that which showed up most frequently in subsequent generations. However, the scientists observed “fluctuations” that the theories could not account for.

“Mutations and genotypes that seem to have fallen behind can leapfrog and dominate,” said Cvijović.

What that means, she says, will be the subject of future research. However, it implies that evolution is, indeed, even more complex than previously thought.

“Our experiment suggests there may be a wide range of a large number of strongly beneficial mutations,” she said. “And their benefits are both very strong and very different from one another.”

Etalumis ‘reverses’ simulations to reveal new science

Etalumis performs Bayesian inference—a method of statistical inference in which Bayes’ theorem is used to update the probability for a hypothesis as more evidence or information becomes available—essentially inverting the simulator to predict input parameters from observations. This image provides an overview of the software framework. Credit: Wahid Bhimji, Lawrence Berkeley National Laboratory

NOVEMBER 13, 2019

by Keri Troutman, Lawrence Berkeley National Laboratory

Scientists have built simulations to help explain behavior in the real world, including modeling for disease transmission and prevention, autonomous vehicles, climate science, and in the search for the fundamental secrets of the universe. But how to interpret vast volumes of experimental data in terms of these detailed simulations remains a key challenge. Probabilistic programming offers a solution—essentially reverse-engineering the simulation—but this technique has long been limited due to the need to rewrite the simulation in custom computer languages, plus the intense computing power required.

To address this challenge, a multinational collaboration of researchers using computing resources at Lawrence Berkeley National Laboratory’s National Energy Research Scientific Computing Center (NERSC) has developed the first probabilistic programming framework capable of controlling existing simulators and running at large-scale on HPC platforms. The system, called Etalumis (“simulate” spelled backwards), was developed by a group of scientists from the University of Oxford, University of British Columbia (UBC), Intel, New York University, CERN, and NERSC as part of a Big Data Center project.

Etalumis performs Bayesian inference—a method of statistical inference in which Bayes’ theorem is used to update the probability for a hypothesis as more evidence or information becomes available—essentially inverting the simulator to predict input parameters from observations. The team deployed Etalumis for the first time for the Large Hadron Collider (LHC) at CERN, bringing a new level of interpretability to data analysis from the LHC’s high-energy physics detectors. A paper based on Etalumis has been selected as a finalist for Best Paper at SC19. The authors will speak about Etalumis at SC19 on Tuesday, November 19 at 4:30 p.m.
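At its core, the Bayes’ theorem update described above looks like this; the numbers are invented for illustration and have nothing to do with the LHC analysis:

```python
# Minimal Bayes' theorem update (the inference rule Etalumis applies
# at vastly larger scale; all numbers here are invented):
#   P(H|E) = P(E|H) * P(H) / P(E)
prior = 0.01           # P(H): prior probability the hypothesis is true
likelihood = 0.95      # P(E|H): chance of seeing the evidence if H holds
false_alarm = 0.05     # P(E|not H): chance of the evidence otherwise

evidence = likelihood * prior + false_alarm * (1 - prior)   # P(E)
posterior = likelihood * prior / evidence                   # P(H|E)

print(f"posterior P(H|E) = {posterior:.3f}")
```

The same rule, applied over the enormous space of execution paths a particle-physics simulator can take, is what makes the inversion computationally demanding.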

From Days to Minutes

Bayesian inference is used in virtually all scientific disciplines, according to Frank Wood, an Etalumis collaborator, Associate Professor of Computer Science at UBC, and one of the pioneers of probabilistic programming.

“I was particularly interested in applying Bayesian inference to an extremely complex physics problem, and high-energy physics detectors felt like the perfect proving ground for our group’s seminal research,” he says. “The Etalumis project provided a unique opportunity to combine a cutting-edge neural network based on an ‘inference compilation’ approach with a software framework (pyprob) to directly couple this inference engine to existing detailed particle physics simulators and run it on HPC-scale resources.”

Etalumis 'reverses' simulations to reveal new science
A comparison of some of the predictions from the Etalumis project’s inference compilation approach (outline histograms), which can achieve the same levels of precision as computationally intractable methods (filled histograms). Credit: Lawrence Berkeley National Laboratory

Scientists already have robust simulation software packages that model the physics and everything that occurs within the detector. Etalumis brings in probabilistic programming to couple with this existing software, essentially giving researchers the ability to say “We had this observation; how did we get there?”

“This project is exciting because it makes existing simulators across many fields of science and engineering subject to probabilistic machine learning,” says Atilim Gunes Baydin, lead developer of the Etalumis project and lead author of the SC19 paper. Gunes is currently a postdoctoral researcher in machine learning at the University of Oxford. “This means the simulator is no longer used as a black box to generate synthetic training data, but as an interpretable probabilistic generative model that the simulator’s code already specifies, in which we can perform inference.”

“We need to be able to control the program to run down every possibility, so in this project we added this capability as a software layer,” adds Wahid Bhimji, a Big Data Architect in the Data and Analytics Services team at NERSC. However, performing inference in such complex settings brings computational challenges. “Conventional methods for this kind of Bayesian inference are extremely computationally expensive,” Bhimji adds. “Etalumis allows us to do in minutes what would normally take days, using NERSC HPC resources.”

Deep Interpretability

For the LHC use case, the team trained a neural network to perform inference, learning to come up with good proposals about what detailed chain of physics processes from the simulator might have occurred. This required improvements to the PyTorch deep-learning framework to train a complex dynamic neural network on more than 1,000 nodes (32,000 CPU cores) of the Cori supercomputer at NERSC. As a result, training that would take months with the original unoptimized software on a single node can now be completed in less than 10 minutes on Cori. Scientists thus gained an opportunity to study the choices that went into producing each outcome, giving them a greater understanding of the data.

“In many cases you know there’s an uncertainty in determining the physics that occurred at an LHC collision but you don’t know the probabilities of all the processes that could have given rise to a particular observation; with Etalumis, you get a model of that,” Bhimji explains.

The deep interpretability that Etalumis brings to data analysis from the LHC could support major advances in the physics world. “Signs of new physics may well be hiding in the LHC data; revealing those signals may require a paradigm change from the classical algorithmic processing of the data to a more nuanced probabilistic approach,” says Kyle Cranmer, an NYU physicist who was part of the Etalumis project. “This approach takes us to the limit of what is knowable quantum mechanically.”

With Mars methane mystery unsolved, Curiosity serves scientists a new one: Oxygen

NOVEMBER 12, 2019

by Lonnie Shekhtman, NASA’s Goddard Space Flight Center

Credit: Melissa Trainer/Dan Gallagher/NASA Goddard

For the first time in the history of space exploration, scientists have measured the seasonal changes in the gases that fill the air directly above the surface of Gale Crater on Mars. As a result, they noticed something baffling: oxygen, the gas many Earth creatures use to breathe, behaves in a way that so far scientists cannot explain through any known chemical processes.

Over the course of three Mars years (or nearly six Earth years), an instrument in the Sample Analysis at Mars (SAM) portable chemistry lab inside the belly of NASA’s Curiosity rover inhaled the air of Gale Crater and analyzed its composition. The results SAM spit out confirmed the makeup of the Martian atmosphere at the surface: 95% by volume of carbon dioxide (CO2), 2.6% molecular nitrogen (N2), 1.9% argon (Ar), 0.16% molecular oxygen (O2), and 0.06% carbon monoxide (CO). They also revealed how the molecules in the Martian air mix and circulate with the changes in air pressure throughout the year. These changes occur when CO2 gas freezes over the poles in the winter, lowering the air pressure across the planet as the remaining air redistributes to maintain pressure equilibrium. When the CO2 sublimates back into the air in the spring and summer and mixes across Mars, it raises the air pressure.

Within this environment, scientists found that nitrogen and argon follow a predictable seasonal pattern, waxing and waning in concentration in Gale Crater throughout the year relative to how much CO2 is in the air. They expected oxygen to do the same. But it didn’t. Instead, the amount of the gas in the air rose throughout spring and summer by as much as 30%, and then dropped back to levels predicted by known chemistry in fall. This pattern repeated each spring, though the amount of oxygen added to the atmosphere varied, implying that something was producing it and then taking it away.
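The normalization idea described here, judging a gas against an inert tracer that follows the pressure cycle, can be sketched with the article’s surface mixing ratios and the reported 30% springtime rise plugged in by hand:

```python
# Toy sketch: an inert gas like argon tracks the seasonal CO2 pressure
# cycle, so dividing O2 by Ar removes the pressure effect and isolates
# any genuine O2 excess. Mixing ratios are the article's surface values;
# the 30% spring rise is the reported anomaly, inserted by hand.
baseline = {"O2": 0.0016, "Ar": 0.019}            # volume mixing ratios
spring = {"O2": 0.0016 * 1.30, "Ar": 0.019}       # unexplained O2 rise

excess = (spring["O2"] / spring["Ar"]) / (baseline["O2"] / baseline["Ar"]) - 1.0
print(f"O2/Ar excess in spring: {excess:.0%}")
```

If O2 merely followed the pressure cycle like argon does, this ratio would stay flat; the persistent excess is what known chemistry cannot yet explain.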

“The first time we saw that, it was just mind boggling,” said Sushil Atreya, professor of climate and space sciences at the University of Michigan in Ann Arbor. Atreya is a co-author of a paper on this topic published on November 12 in the Journal of Geophysical Research: Planets.

A sunset at the Viking Lander 1 site, 1976. Credit: NASA/JPL

As soon as scientists discovered the oxygen enigma, Mars experts set to work trying to explain it. They first double- and triple-checked the accuracy of the SAM instrument they used to measure the gases: the Quadrupole Mass Spectrometer. The instrument was fine. They considered the possibility that CO2 or water (H2O) molecules could have released oxygen when they broke apart in the atmosphere, leading to the short-lived rise. But it would take five times more water above Mars to produce the extra oxygen, and CO2 breaks up too slowly to generate it over such a short time. What about the oxygen decrease? Could solar radiation have broken up oxygen molecules into two atoms that blew away into space? No, scientists concluded, since it would take at least 10 years for the oxygen to disappear through this process.

“We’re struggling to explain this,” said Melissa Trainer, a planetary scientist at NASA’s Goddard Space Flight Center in Greenbelt, Maryland who led this research. “The fact that the oxygen behavior isn’t perfectly repeatable every season makes us think that it’s not an issue that has to do with atmospheric dynamics. It has to be some chemical source and sink that we can’t yet account for.”

To scientists who study Mars, the oxygen story is curiously similar to that of methane. Methane is constantly in the air inside Gale Crater in such small quantities (0.00000004% on average) that it’s barely discernible even by the most sensitive instruments on Mars. Still, it’s been measured by SAM’s Tunable Laser Spectrometer. The instrument revealed that while methane rises and falls seasonally, it increases in abundance by about 60% in summer months for inexplicable reasons. (In fact, methane also spikes randomly and dramatically. Scientists are trying to figure out why.)

With the new oxygen findings in hand, Trainer’s team is wondering if chemistry similar to what’s driving methane’s natural seasonal variations may also drive oxygen’s. At least occasionally, the two gases appear to fluctuate in tandem.

Credit: Melissa Trainer/Dan Gallagher/NASA Goddard

“We’re beginning to see this tantalizing correlation between methane and oxygen for a good part of the Mars year,” Atreya said. “I think there’s something to it. I just don’t have the answers yet. Nobody does.”

Oxygen and methane can be produced both biologically (from microbes, for instance) and abiotically (from chemistry related to water and rocks). Scientists are considering all options, although they don’t have any convincing evidence of biological activity on Mars. Curiosity doesn’t have instruments that can definitively say whether the source of the methane or oxygen on Mars is biological or geological. Scientists expect that non-biological explanations are more likely and are working diligently to fully understand them.

Trainer’s team considered Martian soil as a source of the extra springtime oxygen. After all, it’s known to be rich in the element, in the form of compounds such as hydrogen peroxide and perchlorates. One experiment on the Viking landers showed decades ago that heat and humidity could release oxygen from Martian soil. But that experiment took place in conditions quite different from the Martian spring environment, and it doesn’t explain the oxygen drop, among other problems. Other possible explanations also don’t quite add up for now. For example, high-energy radiation of the soil could produce extra O2 in the air, but it would take a million years to accumulate enough oxygen in the soil to account for the boost measured in only one spring, the researchers report in their paper.

“We have not been able to come up with one process yet that produces the amount of oxygen we need, but we think it has to be something in the surface soil that changes seasonally because there aren’t enough available oxygen atoms in the atmosphere to create the behavior we see,” said Timothy McConnochie, assistant research scientist at the University of Maryland in College Park and another co-author of the paper.


The only previous spacecraft with instruments capable of measuring the composition of the Martian air near the ground were NASA’s twin Viking landers, which arrived on the planet in 1976. The Viking experiments covered only a few Martian days, though, so they couldn’t reveal seasonal patterns of the different gases. The new SAM measurements are the first to do so. The SAM team will continue to measure atmospheric gases so scientists can gather more detailed data throughout each season. In the meantime, Trainer and her team hope that other Mars experts will work to solve the oxygen mystery.

“This is the first time where we’re seeing this interesting behavior over multiple years. We don’t totally understand it,” Trainer said. “For me, this is an open call to all the smart people out there who are interested in this: See what you can come up with.”

Massive photons in an artificial magnetic field

In research conducted at the University of Warsaw, photons trapped in an optical cavity were found to behave like mass-bearing quasiparticles.

Massive photons in an artificial magnetic field
The dependence of energy (vertical axis) on angle (horizontal axis) for polarized light reflected from a birefringent optical cavity. Credit: M. Krol, UW Physics

NOVEMBER 12, 2019

by University of Warsaw

An international research collaboration from Poland, the UK and Russia has created a two-dimensional system—a thin optical cavity filled with liquid crystal—in which they trapped photons. As the properties of the cavity were modified by an external voltage, the photons behaved like massive quasiparticles endowed with a magnetic moment, called “spin,” under the influence of an artificial magnetic field. The research has been published in Science on Friday, 8 November 2019.

The world around us has one temporal and three spatial dimensions. Physicists studying condensed matter have long been dealing with systems of lower dimensionality—two-dimensional (2-D) quantum wells, one-dimensional (1-D) quantum wires and zero-dimensional (0-D) quantum dots. 2-D systems have found the widest technical applications—it is thanks to the reduced dimensions that efficient LEDs and laser diodes, fast transistors in integrated circuits, and WiFi radio amplifiers operate. Trapped electrons in two dimensions can behave completely differently than free electrons. For example, in graphene, a two-dimensional carbon structure with honeycomb symmetry, electrons behave like massless objects, much like photons, the particles of light.

Electrons in a crystal interact with each other and with the crystal lattice, creating a complex system whose description is possible thanks to the introduction of the concept of so-called quasiparticles. Properties of these quasiparticles, including electric charge, magnetic moment and mass, depend on the symmetry of the crystal and its spatial dimension. Physicists can create materials with reduced dimensions, discovering “quasi-universes” full of exotic quasiparticles. The massless electron in two-dimensional graphene is such an example.

Massive photons in an artificial magnetic field
Tomography of circularly polarized light reflected from an optical cavity filled with liquid crystal. Credit: M. Krol, UW Physics

These discoveries inspired researchers from the University of Warsaw, the Polish Military University of Technology, the Institute of Physics of the Polish Academy of Sciences, the University of Southampton and the Skolkovo Institute near Moscow, to study light trapped in two-dimensional structures—optical cavities.

The authors of the Science paper created an optical cavity in which they trapped photons between two mirrors. The original idea was to fill the cavity with a liquid-crystal material that acts as an optical medium. Under the influence of an external voltage, molecules of this medium rotate and change the optical path length. This made it possible to create standing waves of light in the cavity whose energy (vibration frequency) differed depending on whether the electric field of the wave (its polarization) pointed across the molecules or along their axis (a phenomenon called optical anisotropy).
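The standing-wave condition can be sketched numerically. The cavity length and refractive indices below are assumptions chosen as typical liquid-crystal values, not the experiment's actual parameters.

```python
# Minimal sketch: a planar cavity supports a standing wave when an integer
# number of half-wavelengths fits between the mirrors, m * lambda / 2 = n * L.
# A birefringent filling has two refractive indices, so each mode index m
# splits into two energies, one per polarization. Illustrative values only.
H = 4.135667696e-15   # Planck constant, eV*s
C = 2.99792458e8      # speed of light, m/s

def mode_energy_eV(m, n_refr, L_m):
    """Energy of the m-th standing-wave mode in a cavity of optical index n_refr."""
    freq = m * C / (2 * n_refr * L_m)  # resonance frequency in Hz
    return H * freq

L = 2.0e-6                               # cavity length: 2 micrometres (assumed)
n_ordinary, n_extraordinary = 1.5, 1.7   # typical liquid-crystal indices (assumed)
m = 10
e_o = mode_energy_eV(m, n_ordinary, L)
e_e = mode_energy_eV(m, n_extraordinary, L)
# The splitting e_o - e_e is what voltage tuning of the molecules controls.
```

Rotating the molecules with an applied voltage effectively tunes the two indices, and with them the energy splitting between the polarization modes.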

In the research, conducted at the University of Warsaw, the photons trapped in the cavity were found to behave like mass-bearing quasiparticles. Such quasiparticles had been observed before, but they were difficult to manipulate because light does not react to electric or magnetic fields. This time, the researchers noted that as the optical anisotropy of the liquid-crystal material in the cavity was changed, the trapped photons behaved like quasiparticles endowed with a magnetic moment, or a “spin,” in an “artificial magnetic field.” The polarization of the electromagnetic wave played the role of “spin” for light in the cavity. The behavior of light in this system is easiest to explain by analogy with the behavior of electrons in condensed matter.

Massive photons in an artificial magnetic field
The scheme of the experiment – circular polarization of light (marked in red and blue) transmitted through a cavity filled with liquid crystal depending on the direction of propagation. Credit: M. Krol, UW Physics

The equations describing the motion of photons trapped in the cavity resemble the equations of motion of electrons with spin. Therefore, it was possible to build a photonic system that perfectly imitates electronic properties and leads to many surprising physical effects such as topological states of light.
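The spin-1/2 analogy can be made concrete with a toy two-level model. Everything numerical here is an assumption for illustration; the actual cavity parameters are in the Science paper.

```python
# Hedged sketch: the polarization doublet of a cavity photon can be written
# as a 2x2 Hamiltonian, H = E0*I + b.sigma, exactly like a spin-1/2 in an
# effective "magnetic field" b set by the cavity anisotropy.
import cmath

def eigenvalues_2x2(h11, h12, h21, h22):
    """Eigenvalues of a 2x2 Hermitian matrix via the quadratic formula."""
    tr, det = h11 + h22, h11 * h22 - h12 * h21
    disc = cmath.sqrt(tr * tr - 4 * det)
    return ((tr - disc) / 2).real, ((tr + disc) / 2).real

E0 = 2.0                         # mean mode energy in eV (assumed)
bx, by, bz = 0.01, 0.0, 0.02     # effective field from anisotropy, eV (assumed)

# H = [[E0 + bz, bx - i*by], [bx + i*by, E0 - bz]]
lo, hi = eigenvalues_2x2(E0 + bz, bx - 1j * by, bx + 1j * by, E0 - bz)
# The splitting hi - lo = 2*|b| is the photonic analogue of Zeeman splitting.
```

Changing the voltage on the liquid crystal plays the role of reorienting this effective field, which is what makes the trapped photons controllable in a way free photons are not.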

The discovery of new phenomena related to the trapping of light in optically anisotropic cavities may enable new optoelectronic devices, e.g. optical neural networks that perform neuromorphic calculations. Particularly promising is the prospect of creating a unique quantum state of matter, a Bose-Einstein condensate. Such a condensate could be used for quantum calculations and simulations, solving problems that are too difficult for modern computers. The studied phenomena will open up new possibilities for technical solutions and further scientific discoveries.

New spin directions in pyrite an encouraging sign for future spintronics

NOVEMBER 12, 2019

by FLEET

New spin directions in pyrite an encouraging sign for future spintronics
Crystal structure of Pyrite OsSe2/OsTe2. Credit: FLEET

A Monash University study revealing new spin textures in pyrite-type materials could unlock their potential in future spintronics devices.

The study of pyrite-type materials provides new insights and opportunities for selective spin control in topological spintronics devices.

Seeking new spin in topological materials

Topological materials have exciting potential for next-generation, ultra-low energy electronics, including thermoelectric and spintronic devices.

However, a restriction on the use of such materials in spintronics has been that all topological materials studied thus far have spin states that lie parallel to the plane of the material, while most practical spintronic devices require out-of-plane spin states.

Generating and manipulating out-of-plane spins without applying an external electric or magnetic field has been a key challenge in spintronics.

The new Monash Engineering study demonstrates for the first time that pyrite-type crystals can host unconventional energy- and direction-dependent spin textures on the surface, with both in-plane and out-of-plane spin components, in sharp contrast to spin textures in conventional topological materials.

“A number of pyrite-type materials have previously been theoretically predicted to show the desired out-of-plane spin textures,” explains lead author Dr. Yuefeng Yin, in Monash Engineering’s Computational Materials Lab.

Pyrite (colloquially known as ‘fool’s gold’) is an iron-sulfide mineral that displays multiple internal planes of electronic symmetry.

“The presence of strong local symmetry protects out-of-plane spin states,” explains Yuefeng, “so we decided to look closer at some of these crystals.”

The unconventional spin texture discovered opens new possibilities for the necessary task of injecting or detecting out-of-plane spin components in future topological spintronic devices.

The study

Selective control of surface spin current in topological pyrite-type OsX2 (X = Se, Te) crystals was published in npj Quantum Materials in August 2019.

Using first-principles calculations, the Monash team separated surface spin states by their interactions with spin states in the bulk of the material, resulting in highly anisotropic but tunable behaviour.
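The mixed in-plane/out-of-plane texture can be illustrated with a toy surface Hamiltonian. This is a generic model chosen for illustration, not the paper's first-principles OsX2 calculation; the coupling constants `alpha` and `lam` are hypothetical.

```python
# Illustrative sketch: H(k) = alpha*(kx*sy - ky*sx) + lam*kx*sz adds an
# out-of-plane (sz) component to the usual in-plane Rashba spin texture.
# The lower band's spin points opposite the effective field, <s> = -b_hat/2.
import math

def spin_expectations(kx, ky, alpha=1.0, lam=0.5):
    """(<sx>, <sy>, <sz>) for the lower band of the toy Hamiltonian (units of hbar)."""
    bx, by, bz = -alpha * ky, alpha * kx, lam * kx   # effective field components
    b = math.sqrt(bx * bx + by * by + bz * bz) or 1.0
    return (-bx / (2 * b), -by / (2 * b), -bz / (2 * b))

sx, sy, sz = spin_expectations(0.1, 0.0)
# A nonzero <sz> at finite kx is the out-of-plane component that a pure
# Rashba model (lam = 0) would forbid.
```

In this picture, symmetry determines which components of the effective field are allowed, which is why the local symmetry of pyrite-type crystals is what protects the out-of-plane states.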

As well as funding from the Australian Research Council (Centre of Excellence and ARC Laureates funding) the authors gratefully acknowledge computational support from the Monash Campus Cluster, NCI computational facility and Pawsey Supercomputing Facility.

The link between symmetry and topological materials

The presence of strong, local symmetry provides topological robustness to spin states, and symmetry is therefore a strong predictor of topological behaviour. Studying these phenomena in pyrite crystals should thus provide clues toward the discovery of many other new topological materials.

Topological insulators are novel materials that behave as electrical insulators in their interior, but can carry a current along their edges. Unlike a conventional electrical path, such topological edge paths can carry electrical current with near-zero dissipation of energy, meaning that topological transistors can switch without burning energy.

Study investigates a critical transition in water that remains liquid far below the freezing point

Study investigates a critical transition in water that remains liquid far below 0 °C
The theoretical model proposed by Brazilian researchers can be applied to any system in which two energy scales coexist. Credit: Miguel Boyayan/Revista Pesquisa FAPESP

NOVEMBER 12, 2019

by José Tadeu Arantes, FAPESP

Water can remain liquid at temperatures far below 0 degrees Celsius. This supercooled phase is a current focus for scientific research. A theoretical model developed at São Paulo State University (UNESP) in Brazil shows that in supercooled water, there is a critical point at which properties such as thermal expansion and compressibility exhibit anomalous behavior.

Led by Mariano de Souza, a professor in the Physics Department of UNESP’s Institute of Geosciences and Exact Sciences at Rio Claro, the study was supported by FAPESP. An article by Souza and collaborators describing the study has been published in Scientific Reports.

“Our study shows that this second critical point is analogous to the liquid-gas transition in water, which occurs at about 374 degrees Celsius and a pressure of some 22 megapascals,” Souza said.

Liquid and gas phases coexist in water at approximately 374 degrees Celsius. A hint of this exotic behavior can be observed, for example, in a pressure cooker, where increased pressure raises the temperature at which liquid and vapor coexist. At the critical point, water’s thermodynamic properties begin to display anomalous behavior, which is why the point is called “critical.”

In the case of supercooled water, two phases also coexist, but both are liquid: one is denser than the other. If the system is cooled further below 0 degrees Celsius, a point on the phase diagram is reached where the stability of the two phases breaks down and the water starts to crystallize. This is the second critical point, determined theoretically in the recent study.


“The study shows that this second critical point occurs in the range of 180 kelvins [approximately -93 degrees Celsius]. Above this point, liquid water can exist. It’s called supercooled water,” Souza said.

“The most interesting part is that the theoretical model we developed for water can be applied to all systems in which two energy scales coexist. For example, it applies to an iron-based superconductor system in which there is also a nematic phase [with molecules oriented in parallel lines but not arranged in well-defined planes]. This theoretical model originated in several experiments with thermal expansion at low temperatures performed in our research laboratory.”

This universal model was obtained by means of a theoretical refinement of the Grüneisen parameter, named for German physicist Eduard Grüneisen (1877-1949). Simply put, this parameter describes the effects of variations in temperature and pressure on a crystal lattice.
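For a sense of what the (unrefined) thermodynamic Grüneisen parameter looks like in practice, it can be estimated from measurable quantities as Gamma = alpha_V * V_m / (kappa_T * C_V). The sketch below uses rough room-temperature values for copper; it illustrates the textbook parameter, not the paper's refined version.

```python
# Back-of-envelope sketch of the thermodynamic Grueneisen parameter:
#   Gamma = alpha_V * V_m / (kappa_T * C_V)
# alpha_V: volumetric thermal expansion, V_m: molar volume,
# kappa_T: isothermal compressibility, C_V: molar heat capacity.
def grueneisen(alpha_v, v_molar, kappa_t, c_v):
    """Dimensionless Grueneisen parameter from bulk thermodynamic data."""
    return alpha_v * v_molar / (kappa_t * c_v)

# Rough room-temperature values for copper:
gamma_cu = grueneisen(
    alpha_v=5.0e-5,    # 1/K (3x the linear expansion coefficient)
    v_molar=7.1e-6,    # m^3/mol
    kappa_t=0.73e-11,  # 1/Pa (inverse of ~137 GPa bulk modulus)
    c_v=24.5,          # J/(mol K)
)
# Simple solids typically come out of order 1-3.
```

Anomalies in this parameter near a critical point are precisely the kind of signature the UNESP analysis exploits.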

“Our analysis of the Grüneisen and pseudo-Grüneisen parameters can be applied to an investigation of critical behavior in any system with two energy scales. It suffices to make appropriate adjustments to the critical parameters in accordance with the system of interest,” Souza said.

LHCf gears up to probe birth of cosmic-ray showers

What’s more, the experiment plans to measure forward particles emitted from collisions of protons with light ions, most likely oxygen ions. 

LHCf gears up to probe birth of cosmic-ray showers
One of the LHCf experiment’s two detectors, LHCf Arm2, seen here during installation into a particle absorber that surrounds the LHC’s beam pipe. Credit: Lorenzo Bonechi

NOVEMBER 11, 2019

by Ana Lopes, CERN

Cosmic rays are particles from outer space, typically protons, travelling at almost the speed of light. When the most energetic of these particles strike the atmosphere of our planet, they interact with atomic nuclei in the atmosphere and produce cascades of secondary particles that shower down to the Earth’s surface. These extensive air showers, as they are known, are similar to the cascades of particles that are created in collisions inside particle colliders such as CERN’s Large Hadron Collider (LHC). In the next LHC run, starting in 2021, the smallest of the LHC experiments—the LHCf experiment—is set to probe the first interaction that triggers these cosmic showers.

Observations of extensive air showers are generally interpreted using computer simulations that involve a model of how cosmic rays interact with atomic nuclei in the atmosphere. But different models exist and it’s unclear which one is the most appropriate. The LHCf experiment is in an ideal position to test these models and help shed light on cosmic-ray interactions.
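The models in question are detailed simulation codes, but the basic logic of a cascade can be captured by the classic Heitler toy model, a standard pedagogical picture (not one of the production simulation tools): at each splitting length the particle count doubles and the energy per particle halves, until it falls below a critical energy and the shower stops growing.

```python
# Heitler toy model of an electromagnetic air shower (pedagogical sketch).
# E_c ~ 85 MeV is the rough critical energy for electrons in air.
import math

def heitler_shower(e0_eV, e_crit_eV=85e6):
    """Return (number of doubling generations, particle count at shower maximum)."""
    n_gen = int(math.log2(e0_eV / e_crit_eV))  # doublings until E/particle < E_c
    return n_gen, 2 ** n_gen

gens, n_max = heitler_shower(1e17)  # a 10^17 eV primary cosmic ray
# Shower maximum scales as N_max ~ E0 / E_c: a billion-particle cascade.
```

Real showers also involve hadronic interactions, which is exactly the part of the modelling that LHCf's forward measurements constrain.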

In contrast to the main LHC experiments, which measure particles emitted at large angles from the collision line, the LHCf experiment measures particles that fly out in the “forward” direction, that is, at small angles from the collision line. These particles, which carry a large portion of the collision energy, can be used to probe the small angles and high energies at which the predictions from the different models don’t match.
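"Forward" here means remarkably close to the beam line. Particle physicists express this with pseudorapidity, eta = -ln(tan(theta/2)); inverting it shows how tiny the angles are. The eta value used below is illustrative, not a quoted LHCf specification.

```python
# Converting pseudorapidity back to a polar angle from the beam axis.
import math

def angle_mrad(eta):
    """Polar angle theta (milliradians) corresponding to pseudorapidity eta."""
    return 2 * math.atan(math.exp(-eta)) * 1e3

theta = angle_mrad(8.5)  # a very forward pseudorapidity (illustrative)
# eta = 8.5 corresponds to a fraction of a milliradian from the beam axis.
```

At such angles the detectors sit around the beam pipe itself, far downstream of the collision point.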

Using data from proton–proton LHC collisions at an energy of 13 TeV, LHCf has recently measured how the number of forward photons and neutrons varies with particle energy at previously unexplored high energies. These measurements agree better with some models than others, and they are being factored in by modellers of extensive air showers.

In the next LHC run, LHCf should extend the range of particle energies probed, due to the planned higher collision energy. In addition, and thanks to ongoing upgrade work, the experiment should also increase the number and type of particles that are detected and studied.

What’s more, the experiment plans to measure forward particles emitted from collisions of protons with light ions, most likely oxygen ions. The first interactions that trigger extensive air showers in the atmosphere involve mainly light atomic nuclei such as oxygen and nitrogen. LHCf could therefore probe such an interaction in the next run, casting new light on cosmic-ray interaction models at high energies.

A distinct spin on atomic transport

NOVEMBER 11, 2019

by ETH Zurich

A distinct spin on atomic transport
An optical beam (red) introduces an effect equivalent to applying a magnetic field inside an optically defined structure in which the atoms move (green). Atoms in the energetically lower spin state (orange) can flow while atoms in a higher spin state (blue) are blocked. Credit: ETH Zurich/D-PHYS, adapted from doi: 10.1103/PhysRevLett.123.193605

One of the more unexpected things that can be done with charge-neutral atoms is using them to emulate the fundamental behavior of electrons. Over the past few years, the group of Tilman Esslinger at the Institute of Quantum Electronics in the Department of Physics of ETH Zurich has pioneered a platform in which atoms cooled to temperatures close to absolute zero are transported through one- and two-dimensional structures, driven by a potential difference. In this way, defining phenomena occurring in mesoscopic electronic systems can be studied in great detail, including quantized conductance. In a pair of papers published today in Physical Review Letters and Physical Review A, postdoc Laura Corman, former Ph.D. student Martin Lebrat and colleagues in the Esslinger group report that they have mastered control over quantum spin in their transport experiments.

The team added a tightly focused light beam to the transport channel that induces local interactions equivalent to exposing the atoms to a strong magnetic field. As a consequence, the degeneracy of the spin states is lifted, which in turn serves as the basis for an efficient spin filter: Atoms of one spin orientation are repelled, whereas those of another orientation are free to pass (see the figure). Importantly, even though the application of an additional light field leads to the loss of atoms, these dissipative processes do not destroy the quantization of conductance. The ETH researchers replicate this experimental finding in numerical simulation and substantiate its validity through an extension of the Landauer-Büttiker model, the key formalism for quantum transport.
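The Landauer picture invoked above reduces to a very simple formula: conductance is the sum of transmission probabilities of the open channels, in units of one quantum per channel (for neutral atoms the quantum is 1/h rather than e²/h). The transmission values and bias below are hypothetical numbers for illustration.

```python
# Sketch of Landauer conductance for a two-spin-channel atomic point contact.
H = 6.62607015e-34  # Planck constant, J*s

def landauer_conductance(transmissions):
    """Total conductance in units of 1/h, one quantum per open channel."""
    return sum(transmissions)

# Spin filter: one channel nearly open, the other strongly attenuated by the
# dissipative light field (illustrative transmission probabilities).
g = landauer_conductance([0.97, 0.04])

# Particle current for a small chemical-potential bias delta_mu (joules, assumed):
delta_mu = 1.0e-30
atoms_per_second = g * delta_mu / H
```

The experimental finding is that even with losses, `g` stays pinned near integer plateaus as channels open one by one, which is what "dissipation does not destroy the quantization" means in this language.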

The efficiency of the atomic spin filter demonstrated by the Esslinger group matches that of the best equivalent elements for electronic systems. This, together with the extraordinary cleanness and controllability of the cold-atom platform, opens up exciting new perspectives for exploring the dynamics of quantum transport. In particular, as the interaction between the atoms can be tuned, the platform provides access to spin transport of strongly correlated quantum systems. This regime is difficult to study otherwise, but is of considerable fundamental and practical interest, not least for applications in spintronic devices and to explore fundamental phases of matter.