August 1, 2018

Particle physicists team up with AI to solve toughest science problems

Researchers from SLAC and around the world increasingly use machine learning to handle the big data produced in modern experiments and to study some of the most fundamental properties of the universe.

By Manuel Gnida

Experiments at the Large Hadron Collider (LHC), the world’s largest particle accelerator at the European particle physics lab CERN, produce about a million gigabytes of data every second. Even after reduction and compression, the data amassed in just one hour is similar to the data volume Facebook collects in an entire year – too much to store and analyze.

Luckily, particle physicists don’t have to sift through all of that data by themselves. They partner with a form of artificial intelligence called machine learning, in which algorithms learn how to perform complex analyses on their own.

A group of researchers, including scientists at the Department of Energy’s SLAC National Accelerator Laboratory and Fermi National Accelerator Laboratory, summarize current applications and future prospects of machine learning in particle physics in a paper published today in Nature.

“Compared to a traditional computer algorithm that we design to do a specific analysis, we design a machine learning algorithm to figure out for itself how to do various analyses, potentially saving us countless hours of design and analysis work,” says co-author Alexander Radovic from the College of William & Mary, who works on the NOvA neutrino experiment.
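
To make that contrast concrete, here is a minimal Python sketch with toy data and invented features, not drawn from any real experiment: a hand-designed selection encodes the analysis rule ourselves, while a machine learning classifier works the rule out from labeled examples.

```python
# A toy contrast between a hand-designed analysis cut and a learned classifier.
# Features, data, and the "signal" definition are invented for illustration.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 2))                      # two toy event features
y = (X[:, 0] ** 2 + X[:, 1] ** 2 < 1).astype(int)   # hidden "signal" region

# Traditional approach: we design the selection rule ourselves
def hand_designed_cut(event):
    return abs(event[0]) < 1 and abs(event[1]) < 1  # our best guess

# Machine learning approach: the algorithm learns the rule from examples
learned = DecisionTreeClassifier(max_depth=5).fit(X, y)

hand_acc = np.mean([hand_designed_cut(e) == t for e, t in zip(X, y)])
print(f"hand-designed cut: {hand_acc:.2f}, learned: {learned.score(X, y):.2f}")
```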

Sifting through big data

To handle the gigantic data volumes produced in modern experiments like the ones at the LHC, researchers apply what they call “triggers” – dedicated hardware and software that decide in real time which data to keep for analysis and which data to toss out.
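
As a rough illustration of the software side of such a trigger, the sketch below scores each incoming event with a trained classifier and keeps only events above a threshold. The features, model, and threshold are hypothetical and do not reflect any experiment’s actual trigger.

```python
# A toy software trigger: a classifier scores each event and only events
# above a threshold are kept. All names and numbers here are hypothetical.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)

# Toy training data: 3 event-level features, label 1 = event worth keeping
X_train = rng.normal(size=(1000, 3))
y_train = (X_train[:, 0] + 0.5 * X_train[:, 1] > 0).astype(int)
trigger_model = GradientBoostingClassifier().fit(X_train, y_train)

def trigger_decision(event_features, threshold=0.9):
    """Keep the event for offline analysis only if its score clears the bar."""
    score = trigger_model.predict_proba([event_features])[0, 1]
    return score >= threshold

incoming_events = rng.normal(size=(10, 3))
kept = [event for event in incoming_events if trigger_decision(event)]
print(f"kept {len(kept)} of {len(incoming_events)} events")
```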

In LHCb, an experiment that could shed light on why there is so much more matter than antimatter in the universe, machine learning algorithms make at least 70 percent of these decisions, says LHCb scientist Mike Williams from the Massachusetts Institute of Technology, one of the authors of the Nature summary. “Machine learning plays a role in almost all data aspects of the experiment, from triggers to the analysis of the remaining data,” he says.

Machine learning has proven extremely successful in the area of analysis. The gigantic ATLAS and CMS detectors at the LHC, which enabled the discovery of the Higgs boson, each have millions of sensing elements whose signals need to be put together to obtain meaningful results.

“These signals make up a complex data space,” says Michael Kagan from SLAC, who works on ATLAS and was also an author on the Nature review. “We need to understand the relationship between them to come up with conclusions, for example that a certain particle track in the detector was produced by an electron, a photon or something else.”

Neutrino experiments also benefit from machine learning. NOvA, which is managed by Fermilab, studies how neutrinos change from one type to another as they travel through the Earth. These neutrino oscillations could potentially reveal the existence of a new neutrino type that some theories predict to be a particle of dark matter. NOvA’s detectors are watching out for charged particles produced when neutrinos hit the detector material, and machine learning algorithms identify them.

From machine learning to deep learning

Recent developments in machine learning, often called “deep learning,” promise to take applications in particle physics even further. Deep learning typically refers to the use of neural networks: computer algorithms with an architecture inspired by the dense network of neurons in the human brain.

These neural nets learn on their own how to perform certain analysis tasks during a training period in which they are shown sample data, such as simulations, and told how well they performed.
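
The sketch below shows that feedback loop in miniature, assuming PyTorch and entirely toy "simulation" data: the network makes predictions on labeled samples, a loss function measures how well it performed, and the network adjusts its internal parameters accordingly.

```python
# A minimal supervised training loop: the network is shown labeled samples
# (standing in for simulations) and the loss tells it how well it performed.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy "simulated" events: 10 features each, label 1 = signal, 0 = background
X = torch.randn(2000, 10)
y = ((X[:, 0] + X[:, 1]) > 0).float().unsqueeze(1)

model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 1))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

for epoch in range(5):
    optimizer.zero_grad()
    loss = loss_fn(model(X), y)  # compare predictions to the simulation truth
    loss.backward()              # propagate the feedback through the network
    optimizer.step()             # adjust the weights
    print(f"epoch {epoch}: loss = {loss.item():.3f}")
```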

Until recently, the success of neural nets was limited because they were very hard to train, says co-author Kazuhiro Terao, a SLAC researcher working on the MicroBooNE neutrino experiment, which studies neutrino oscillations as part of Fermilab’s short-baseline neutrino program and will become a component of the future Deep Underground Neutrino Experiment (DUNE) at the Long-Baseline Neutrino Facility (LBNF). “These difficulties limited us to neural networks that were only a couple of layers deep,” he says. “Thanks to advances in algorithms and computing hardware, we now know much better how to build and train more capable networks hundreds or thousands of layers deep.”

Many of the advances in deep learning are driven by tech giants’ commercial applications and the data explosion they have generated over the past two decades. “NOvA, for example, uses a neural network inspired by the architecture of GoogLeNet,” Radovic says. “It improved the experiment in ways that otherwise could have only been achieved by collecting 30 percent more data.”

A fertile ground for innovation

Machine learning algorithms are becoming more sophisticated and finely tuned by the day, opening up unprecedented opportunities to solve particle physics problems.

Many of the new tasks they could be used for are related to computer vision, Kagan says. “It’s similar to facial recognition, except that in particle physics, image features are more abstract than ears and noses.”

Some experiments like NOvA and MicroBooNE produce data that is easily translated into actual images, and AI can be readily used to identify features in them. In LHC experiments, on the other hand, images first need to be reconstructed from a murky pool of data generated by millions of sensor elements.

“But even if the data don’t look like images, we can still use computer vision methods if we’re able to process the data in the right way,” Radovic says.

One area where this approach could be very useful is the analysis of particle jets produced in large numbers at the LHC. Jets are narrow sprays of particles whose individual tracks are extremely challenging to separate. Computer vision technology could help identify features in jets.
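
One published variant of this idea bins a jet’s energy deposits into a two-dimensional “jet image” and classifies it with a convolutional network. The sketch below shows the shape of that approach; the image size, labels, and architecture are invented placeholders, assuming PyTorch.

```python
# Toy "jet images": energy deposits binned into a 16x16 (eta, phi) grid,
# classified with a small convolutional network. Sizes are placeholders.
import torch
import torch.nn as nn

torch.manual_seed(0)

images = torch.rand(256, 1, 16, 16)  # 256 jets, 1 energy channel
# Invented labels, e.g. standing in for quark- vs. gluon-initiated jets
totals = images.sum(dim=(1, 2, 3))
labels = (totals > totals.median()).long()

cnn = nn.Sequential(
    nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
    nn.MaxPool2d(2),          # 16x16 -> 8x8
    nn.Flatten(),
    nn.Linear(8 * 8 * 8, 2),  # two jet classes
)

loss = nn.CrossEntropyLoss()(cnn(images), labels)
print(f"untrained classification loss: {loss.item():.3f}")
```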

Another emerging application of deep learning is the simulation of particle physics data, which predicts, for example, what happens in particle collisions at the LHC and can be compared to the actual data. Simulations like these are typically slow and require immense computing power. AI, on the other hand, could run simulations much faster, potentially complementing the traditional approach.

“Just a few years ago, nobody would have thought that deep neural networks can be trained to ‘hallucinate’ data from random noise,” Kagan says. “Although this is very early work, it shows a lot of promise and may help with the data challenges of the future.”
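
The “hallucination” Kagan describes is the hallmark of generative adversarial networks (GANs), in which a generator learns to turn random noise into realistic-looking samples by competing against a discriminator that tries to tell fakes from real data. A minimal, untuned sketch with toy dimensions, assuming PyTorch:

```python
# A toy GAN step: the generator maps noise to synthetic "events" while the
# discriminator learns to separate them from real samples. Sizes are toy.
import torch
import torch.nn as nn

torch.manual_seed(0)

generator = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 10))
discriminator = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 1))
g_opt = torch.optim.Adam(generator.parameters(), lr=1e-3)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

real_events = torch.randn(64, 10)  # stand-in for real (or simulated) data

for step in range(3):
    # Train the discriminator to tell real events from generated ones
    fake = generator(torch.randn(64, 8)).detach()
    d_loss = (loss_fn(discriminator(real_events), torch.ones(64, 1))
              + loss_fn(discriminator(fake), torch.zeros(64, 1)))
    d_opt.zero_grad(); d_loss.backward(); d_opt.step()

    # Train the generator to fool the discriminator
    fake = generator(torch.randn(64, 8))
    g_loss = loss_fn(discriminator(fake), torch.ones(64, 1))
    g_opt.zero_grad(); g_loss.backward(); g_opt.step()
    print(f"step {step}: d_loss={d_loss.item():.3f} g_loss={g_loss.item():.3f}")
```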

Benefiting from healthy skepticism

Despite all obvious advances, machine learning enthusiasts frequently face skepticism from their collaboration partners, in part because machine learning algorithms mostly work like “black boxes” that provide very little information about how they reached a certain conclusion.

“Skepticism is very healthy,” Williams says. “If you use machine learning for triggers that discard data, like we do in LHCb, then you want to be extremely cautious and set the bar very high.”

Therefore, establishing machine learning in particle physics requires constant efforts to better understand the inner workings of the algorithms and to do cross-checks with real data whenever possible.

“We should always try to understand what a computer algorithm does and always evaluate its outcome,” Terao says. “This is true for every algorithm, not only machine learning. So, being skeptical shouldn’t stop progress.”

Rapid progress has some researchers dreaming of what could become possible in the near future. “Today we’re using machine learning mostly to find features in our data that can help us answer some of our questions,” Terao says. “Ten years from now, machine learning algorithms may be able to ask their own questions independently and recognize when they find new physics.”

Other co-authors of the article are David Rousseau from the University of Paris-Saclay, France; Daniele Bonacorsi from the University of Bologna and the National Institute for Nuclear Physics (INFN), Italy; Alexander Himmel from Fermilab; Adam Aurisano from the University of Cincinnati; and Taritree Wongjirad from Tufts University.


Citation: A. Radovic, M. Williams, et al., Nature, August 1, 2018 (doi: 10.1038/s41586-018-0361-2)

For questions or comments, contact the SLAC Office of Communications at communications@slac.stanford.edu.


SLAC is a multi-program laboratory exploring frontier questions in photon science, astrophysics, particle physics and accelerator research. Located in Menlo Park, Calif., SLAC is operated by Stanford University for the U.S. Department of Energy's Office of Science.

SLAC National Accelerator Laboratory is supported by the Office of Science of the U.S. Department of Energy. The Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.
