Boosting computing power for the future of particle physics




A new machine learning technology, tested by an international team of scientists including MIT Assistant Professor Philip Harris and Dylan Rankin, a postdoctoral researcher at the Laboratory for Nuclear Science, can spot specific particle signatures in the ocean of data produced by the Large Hadron Collider (LHC) in the blink of an eye.

Sophisticated and fast, the new system offers a glimpse of the critical role machine learning will play in future discoveries in particle physics as data sets grow larger and more complex.

The LHC generates about 40 million collisions every second. With such enormous amounts of data to sift through, powerful computers are needed to identify the collisions that might be of interest to scientists, whether it is a hint of dark matter or a Higgs boson.
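The idea of scoring collisions and keeping only the rare, interesting ones can be sketched in miniature. This is a toy illustration, not the actual LHC trigger: the event features, the classifier, and its weights are all invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def score_events(features, weights):
    """Logistic score per event: higher means more 'interesting'."""
    z = features @ weights
    return 1.0 / (1.0 + np.exp(-z))

# Simulate a tiny slice of the ~40 million collisions per second.
# Four features per event and the weight vector are made up.
n_events = 1_000_000
features = rng.normal(size=(n_events, 4))
weights = np.array([1.2, -0.7, 0.5, 2.0])  # pretend-trained weights

scores = score_events(features, weights)
keep = scores > 0.99  # very tight cut: only rare, high-score events survive

print(f"kept {keep.sum()} of {n_events} events ({100 * keep.mean():.3f}%)")
```

The real trigger systems face the same shape of problem at vastly larger scale: a fast classifier must reject the overwhelming majority of collisions while preserving the handful that matter.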

Scientists from Fermilab, CERN, MIT, the University of Washington, and elsewhere have now tested a machine-learning system that speeds up image processing by 30 to 175 times compared with existing methods.

Existing methods currently process less than one image per second. In contrast, the new machine-learning system can examine up to 600 images per second. During its training period, the system learned to pick out one specific type of post-collision particle pattern.
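A quick back-of-the-envelope calculation shows what those throughput figures mean over an hour of running. The 600 images per second comes from the article; the one-per-second baseline is the article's stated upper bound for existing methods.

```python
# Rough scale check of the quoted throughput numbers.
ml_fps = 600       # new machine-learning system, images per second
baseline_fps = 1   # "less than one image per second" for existing methods

seconds_per_hour = 3600
print(f"ML system:       {ml_fps * seconds_per_hour:,} images per hour")
print(f"Existing method: under {baseline_fps * seconds_per_hour:,} images per hour")
```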

"The collision models we identify, the high-level quarks, are one of the fundamental particles we are probing at the Large Hadron Collider," said Harris, a member of the MIT Department of Physics. "It's very important that we analyze as much data as possible. Each data contains interesting information about the interaction of particles. "

That data will pour in like never before once the LHC's current upgrades are complete; by 2026, the 17-mile particle accelerator is expected to produce 20 times more data than it does today. To make matters more urgent, future images will also be taken at higher resolutions than today's. In all, scientists and engineers estimate the LHC will need more than 10 times the computing power it currently has.

"The challenge of future racing," says Harris, "is getting harder and harder as our calculations become more precise and we look for more and more precise effects."

The project's researchers trained their new system to identify top quarks, the most massive type of elementary particle, roughly 180 times heavier than a proton. "With the machine-learning architectures available to us, we are able to deliver high-quality, scientific-grade results comparable to the world's best top-quark identification algorithms," says Harris. "Implementing high-speed core algorithms gives us the flexibility to boost LHC computing in the critical moments when it is needed most."
