Roughly 14 billion years ago, the Big Bang created the universe, the existence of time, solar systems like our own, and, eventually, life itself.
The universe has been expanding ever since, and with it, humanity’s curiosity about why we exist and how the Big Bang happened. That thirst is driving the galaxy’s leading physicists to hole up in high-tech research facilities and smash sub-atomic particles over and over. They hope that measuring the smaller bits of matter created by the collisions – the building blocks of all matter – will give them a glimpse into new truths about the universe.
Micron Technology will play a role in this effort, lending its most advanced deep learning and memory solutions to physicists fervently chasing the next breakthrough.
The collaboration could help yield results that change science forever.
The physics side of the collaboration is the European Organization for Nuclear Research, known as CERN. Founded in 1954, the laboratory hosts half of the world’s particle physicists and is best known for being home to the Large Hadron Collider (LHC), the world’s largest particle accelerator. Experiments running on the LHC confirmed the existence of the Higgs boson particle in 2012. The discovery of the “God particle” (as it’s been called in mainstream media, though our friends at CERN are less keen on the phrase) was a landmark event in the scientific community and earned the researchers who proposed its existence back in the 1960s the Nobel Prize in Physics the following year.
The chance to contribute to CERN’s research is very exciting, says Mark Hur, Micron Director of Operations for Advanced Computing Solutions. CERN’s experiments generate a prodigious volume of data to filter, process and store, and Micron is contributing hardware to several experiments that will help CERN solve its data dilemma.
“CERN is at the cutting edge of trying to apply new technologies to help understand our universe,” Hur says. “The work with CERN allows us to test out new technology in an incredibly demanding environment.”
Tiny Collisions
CERN’s LHC is itself a physics marvel. In simple terms, the LHC fires millions of hydrogen nuclei at each other through a huge magnetic tube and documents what happens when they smack into each other. These collisions create conditions similar to those just after the Big Bang. Maurizio Pierini, a CERN particle physicist, explains that it’s as if you launched 100 billion tennis balls at near the speed of light from one side of the galaxy and another 100 billion tennis balls from the other and watched what happens when (and if) they collide in the middle.
Here’s how it works. The LHC is housed in a 17-mile underground loop that crosses the France-Switzerland border. Hydrogen atoms are stripped of their electrons, leaving protons that are fed into the accelerator.
Two high-energy particle beams shoot groups of protons through tubes in opposite directions, propelled by electromagnets chilled by liquid helium to minus 456 degrees Fahrenheit. The supercold environment enables the magnets to operate in a superconducting state, conducting electricity without losing energy.
The magnets steer the protons around the 17-mile loop. Strong radio-frequency fields accelerate the protons to nearly the speed of light, sending them around the LHC’s loop about 11,000 times per second. The particle beams intersect at four collision points on the accelerator – the sites of the LHC’s four main particle detectors.
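As a quick sanity check, those rounded figures hang together; the short sketch below is purely illustrative arithmetic using the numbers quoted above.

```python
# Back-of-the-envelope check of the figures above (illustrative only).
loop_miles = 17            # approximate circumference of the LHC ring
laps_per_second = 11_000   # roughly how often a proton circles the ring

speed = loop_miles * laps_per_second   # miles per second
speed_of_light = 186_282               # miles per second

# With these rounded inputs the product slightly overshoots light speed;
# the point is simply that the quoted numbers are consistent with protons
# traveling at nearly the speed of light.
print(f"protons: about {speed:,} mi/s; light: {speed_of_light:,} mi/s")
```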
The radio-frequency fields also group the protons into bunches about 12 inches long and 1 millimeter wide. Protons are so tiny that most pass through the collision points untouched. Still, the sheer number of protons produces up to 1 billion collisions per second.
That’s 1 billion collisions producing data that must be captured and processed, every second.
Processing this data requires immense computational power. And soon scientists will up the ante, increasing the beam intensity to raise the odds of head-on collisions. Back to the tennis ball analogy: grouping the tennis balls into a denser cluster increases the chance of a head-on collision when the two groups meet.
The High-Luminosity LHC, as CERN’s upgraded accelerator is known, will ratchet up the intensity of its particle beams by a factor of five when it comes online around 2026. The technological demands will be staggering. Each collision produces 1 megabyte of data. Today, all the LHC experiments generate around 1 petabyte of data per second, or the equivalent of more than 2,000 years of music.
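The scale of that data stream follows directly from the numbers above; the sketch below is a rough, illustrative calculation using only the figures quoted in this article, not CERN’s own accounting.

```python
# Rough data-rate estimate from the article's figures (illustrative only).
collisions_per_second = 1_000_000_000   # up to 1 billion collisions per second
bytes_per_collision = 1_000_000         # roughly 1 megabyte per collision

bytes_per_second = collisions_per_second * bytes_per_collision
print(f"~{bytes_per_second / 1e15:.0f} PB/s before any filtering")

# The High-Luminosity upgrade is expected to raise beam intensity about
# fivefold; if the unfiltered stream scales roughly in proportion:
print(f"~{5 * bytes_per_second / 1e15:.0f} PB/s after the upgrade (rough estimate)")
```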
“The data will become bigger, more crowded, and more complex,” Pierini says. “Doing this real-time processing will be a huge challenge.”
Data Firehose
Scientists working at CERN are looking for leading-edge technologies that can support their experiments’ computing and data-processing requirements. Memory plays a vital role in processing the vast amounts of data generated by the experiments and in helping researchers gain valuable insights from that data.
That’s where the Micron SB-852 board comes in. The board, powered by 512 gigabytes of top-of-the-line DDR4 DRAM and 2 gigabytes of Hybrid Memory Cube, is being tested as a way to advance machine-learning capabilities at the Compact Muon Solenoid (CMS), one of the four main LHC experiments. Micron’s memory solutions, paired with neural-network capabilities, will be tested in the experiment’s data-acquisition systems.
In non-technical terms, “it’s a very geeky board,” Hur says. The SB-852 provides the bit-crunching muscle to ingest the data, identify what is important or interesting to scientists, and filter out the rest.
“The board is able to ingest a tremendous amount of data once these collisions happen, then the machine learning that runs inside will leverage the memory to call out, ‘Hey, this is something we haven’t seen before. We should focus on this,’” he says.
Deep Learning
It is not feasible to analyze every collision in the LHC. The collisions are so frequent, and produce so much data, that the recording systems would choke, Pierini says.
Most collisions produce smaller particles that are already well understood. The trick, Pierini says, is to discard the useless data. For that, CERN relies on algorithms that read particle trajectories and throw out uninteresting events.
CERN, which has likely built the world’s most accurate predictive models for particle collisions, applies neural networks to digest the data and filter out all but a fraction. Until now, CERN has relied on predictive models to anticipate how the subatomic particles it expects to find will behave, and has trained neural networks to recognize that expected behavior.
“This is like distinguishing cats from dogs in images, and neural networks are very good at it,” Pierini says.
But some of the greatest scientific revelations sprang from unexpected results, like how detecting cosmic microwave background radiation in 1964 provided crucial evidence for the Big Bang theory. CERN researchers are keen to avoid leaving something interesting on the cutting room floor.
As such, CERN researchers need more advanced artificial intelligence that can pick unusual events from the data. The Micron boards could help do just that at the CMS experiment, highlighting strange data that could yield unforeseen results.
“We are developing an algorithm to learn the Standard Model, and by inference tell the one event in a million that we would have thrown away that should actually be kept because it’s a weird event,” Pierini says.
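CERN has not spelled out the exact algorithm here, but the general idea Pierini describes – learn what ordinary events look like, then flag the ones that don’t fit – can be sketched with a deliberately simple stand-in. The example below uses a linear model (PCA reconstruction error) on made-up feature vectors; the real systems use deep neural networks trained on detector data.

```python
import numpy as np

# Illustrative stand-in for anomaly detection: learn the structure of
# "ordinary" (Standard Model-like) events, then flag events that the
# learned model describes poorly. Features here are synthetic.
rng = np.random.default_rng(0)

# Hypothetical training set: feature vectors summarizing ordinary collisions.
ordinary_events = rng.normal(size=(10_000, 20))

# "Learn" the typical structure: keep the top principal components.
mean = ordinary_events.mean(axis=0)
_, _, components = np.linalg.svd(ordinary_events - mean, full_matrices=False)
top = components[:5]  # low-dimensional summary of what "normal" looks like

def anomaly_score(events):
    """Reconstruction error: how poorly an event fits the learned model."""
    centered = events - mean
    reconstructed = centered @ top.T @ top
    return np.linalg.norm(centered - reconstructed, axis=1)

# New events: mostly ordinary, plus one deliberately strange outlier.
new_events = rng.normal(size=(1_000, 20))
new_events[123] += 8.0  # the "weird event" we would like to keep

scores = anomaly_score(new_events)
keep = np.argsort(scores)[-5:]  # keep only the most unusual events
print("Events flagged for a closer look:", sorted(keep.tolist()))
```

The design choice being illustrated is the one Pierini points to: instead of searching only for behavior a model predicts, the filter learns what is typical and keeps whatever deviates from it.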
Hunting the Ghost Particle
Another promising but enigmatic particle tantalizes particle-physics researchers: the neutrino. Similar to an electron, the neutrino has no electric charge and almost zero mass, and it rarely interacts with normal matter, making it especially hard to observe. Through its collaboration with CERN, Micron is also contributing to a separate major experiment to detect neutrinos. This experiment will be built in the United States, hosted by Fermi National Accelerator Laboratory (Fermilab).
Known as the “ghost particle,” the neutrino remains one of the smallest and most elusive known particles despite being among the most abundant. Physicists know neutrinos exist but understand little about their behavior. Breakthroughs in neutrino research might solve some of science’s greatest questions about the formation of the universe, including helping to explain how matter formed after the Big Bang.
In short, unlocking the mysteries of the neutrino might help explain why we’re here.
Neutrinos fit into what physicists call the Standard Model, which describes the basic building blocks of matter. However, they were believed to have no mass until a recent Nobel Prize-winning discovery proved otherwise, raising questions about what other mysteries a better understanding of the ghost particle could solve.
For physicists, that’s thrilling.
“Neutrinos are an elusive, exotic particle that could be hiding secrets,” Pierini says. “There is this intuition that the neutrino mass has to come from some new physics because, otherwise, it cannot be really explained.”
CERN is part of an international consortium working to create the largest ever neutrino detection project, called the Deep Underground Neutrino Experiment, or DUNE. The experiment will include two neutrino detectors in the U.S., one in Lead, South Dakota. There, using the shafts of an abandoned gold mine, crews will excavate 800,000 tons of rock to make room for a massive chamber filled with more than 40,000 tons of liquid argon.
A particle generator 800 miles away at Fermilab outside Chicago will shoot a beam of neutrinos through the earth to the detection chamber, where precision technology will map the neutrinos’ paths through the chamber. Images produced by the experiment will give insights into neutrino behavior. It’s a massive collaboration: more than 1,000 researchers from 175 research institutions across the world are working with DUNE to unlock the puzzle of the neutrino.
Tracking Neutrinos
DUNE presents a different data challenge than the LHC experiments. Whereas the particle collisions in the LHC produce so much data that it must be filtered, neutrinos rarely interact with matter, meaning detections inside the cavernous chambers of DUNE will be much rarer.
The challenge, Pierini says, becomes compressing and storing every byte of data produced by each neutrino detected by the chamber’s 3D array of sensors. The same Micron SB-852 boards are being tested in a prototype of these chambers built at CERN, as a means to provide the necessary computational power. Their neural networks will also extrapolate data to figure out what else can be gleaned and to help identify decaying neutrinos.
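Pierini’s “regional then global” description can be illustrated with a simple, hypothetical sketch: scan a 3D grid of sensor readings block by block, keep only the blocks showing real activity, and store those. The grid size, threshold and block size below are invented for the example and are not DUNE’s actual readout parameters.

```python
import numpy as np

# Illustrative sketch of "start regionally, then look globally":
# flag active regions of a 3D readout, then keep only those for storage.
rng = np.random.default_rng(1)

# Hypothetical detector readout: a coarse 3D grid of charge measurements,
# mostly noise, with one small simulated "track" of real activity.
readout = rng.normal(0.0, 1.0, size=(64, 64, 64))
readout[20:24, 30:34, 10:40] += 12.0  # simulated particle track

BLOCK = 8          # regional step: examine the grid in 8x8x8 blocks
THRESHOLD = 5.0    # blocks whose peak signal exceeds this are kept

regions = []
for x in range(0, 64, BLOCK):
    for y in range(0, 64, BLOCK):
        for z in range(0, 64, BLOCK):
            block = readout[x:x+BLOCK, y:y+BLOCK, z:z+BLOCK]
            if block.max() > THRESHOLD:
                regions.append(((x, y, z), block.copy()))

# Global step: only the flagged regions are compressed and stored, which is
# far less data than the full grid when activity is sparse.
kept = sum(b.size for _, b in regions)
print(f"kept {kept} of {readout.size} samples "
      f"({100 * kept / readout.size:.1f}%) across {len(regions)} regions")
```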
“[The goal is to] develop a fast data processing algorithm that maybe starts regionally and then looks at the event globally,” Pierini says.
Groundbreaking took place in South Dakota last year, with plans calling for the DUNE experiment to begin operating in 2026.
Perhaps after 14 billion years the quandary of the neutrino will give up the ghost.
Ideal Partnership
Teaming up with Micron is part of CERN’s proud tradition of collaboration.
CERN shares its experiment data through its Open Data portal. Today, more than 1 petabyte of information, contained in datasets, software, environments and scientific documentation, is available for researchers to access and use as they see fit. CERN shows its dedication to open data and open-source software by building its systems so that its data can proliferate as widely as possible.
With a similar ethic in mind, CERN collaborates with other top-of-their field research institutions and technology companies through a public-private platform called CERN openlab.
In addition to Micron, CERN openlab’s collaborators include 10 of the world’s foremost technology companies and nine leading research organizations.
“CERN collaborates openly with both the public and private sector, and working with technology partners like Micron helps ensure that members of the research community have access to the advanced computing technologies needed to carry out our groundbreaking work,” says Maria Girone, CTO at CERN openlab.
The collaboration with CERN reflects the increasing importance of memory within science and the greater technology community, says Hur.
“When I look back, memory was always an afterthought. The server and processor were the most important factor,” Hur says. “Now, everybody is realizing that servers are hitting the clock speed wall. Now, all of these applications are becoming memory determinant.”