Has the Large Hadron Collider accidentally thrown away evidence of new physics?



The ATLAS particle detector of the Large Hadron Collider (LHC) at CERN, the European Organization for Nuclear Research, in Geneva, Switzerland. Built in an underground tunnel 27 km in circumference, the LHC is the largest and most powerful particle collider in the world, and the largest machine ever built. Yet it can only record a tiny fraction of the data it collects.

Inside the Large Hadron Collider, protons circulate simultaneously clockwise and counterclockwise, each moving at 99.9999991% of the speed of light. At two specific points designed to produce the greatest number of collisions, enormous particle detectors were built and installed: the CMS and ATLAS detectors. After billions upon billions of collisions at these tremendous energies, the LHC has taken us further than ever in our quest to understand the fundamental nature of the Universe and the elementary constituents of matter.

Earlier this month, the LHC celebrated 10 years of operations, with the discovery of the Higgs boson marking its crowning achievement. Yet despite those successes, no new particles, interactions, decays, or fundamental physics have been found. Worst of all, most of the data from the LHC has been lost forever.

The CMS collaboration, whose detector is shown here prior to final assembly, has released its latest, most comprehensive results. Those results show no indication of physics beyond the Standard Model. CERN / Maximilien Brice

It is one of the least understood pieces of the high-energy physics puzzle, at least among the general public: the LHC hasn't merely lost most of its data, it has lost a whopping 99.9999% of it. That's right: out of every one million collisions that occur at the LHC, only about one has all of its data written down and recorded.

This happened out of necessity, owing to the limitations imposed by the laws of nature themselves, as well as by what technology can presently do. But in making that choice, there is a tremendous fear, made all the more palpable by the fact that, other than the much-anticipated Higgs, nothing new has been discovered. The fear is this: that there is new physics waiting to be discovered, but we've missed it by throwing that data away.

A four-muon candidate event in the ATLAS detector at the Large Hadron Collider. The muon/anti-muon tracks are highlighted in red, as the long-lived muons travel farther than any other unstable particle. This is an interesting event, but for every event that gets recorded, roughly a million others are discarded. ATLAS Collaboration / CERN

We had no choice in the matter, really: something had to be thrown away. The way the LHC works is by accelerating protons as close as possible to the speed of light, in opposite directions, and smashing them together. This is how particle accelerators have worked best for generations. According to Einstein, a particle's energy is a combination of its rest mass (which you may recognize as E = mc²) and the energy of its motion, also known as kinetic energy. The faster you go – or, more accurately, the closer you get to the speed of light – the more energy per particle you can achieve.

At the LHC, we collide protons together at 299,792,455 m/s, just 3 m/s shy of the speed of light itself. By smashing them together at such speeds, moving in opposite directions, we make it possible for particles that otherwise could never exist to come into being.
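To see where numbers like "99.9999991% of the speed of light" and "3 m/s shy of light speed" come from, here is a minimal back-of-the-envelope sketch (not anything CERN actually runs). It assumes the ~7 TeV beam energy quoted later in this article and the proton's rest-mass energy of about 938 MeV, and solves Einstein's relation E = γmc² for the speed.

```python
import math

C = 299_792_458.0                    # speed of light in m/s (exact, by definition)
PROTON_REST_ENERGY_GEV = 0.938272    # proton rest-mass energy, ~938 MeV
BEAM_ENERGY_GEV = 7_000.0            # ~7 TeV per proton, as quoted in the article

# Total energy E = gamma * m * c^2, so the Lorentz factor is simply the ratio
gamma = BEAM_ENERGY_GEV / PROTON_REST_ENERGY_GEV

# Speed from the Lorentz factor: beta = v/c = sqrt(1 - 1/gamma^2)
beta = math.sqrt(1.0 - 1.0 / gamma**2)

print(f"Lorentz factor gamma ≈ {gamma:,.0f}")           # several thousand
print(f"Fraction of light speed: {beta * 100:.7f} %")    # ~99.9999991 %
print(f"Shortfall from c: {(1.0 - beta) * C:.1f} m/s")   # roughly 3 m/s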

Inside the LHC, where protons pass each other at 299,792,455 m/s, just 3 m/s shy of the speed of light. Julian Herzog / c.c.a.-by-3.0

The reason for this is that all the particles (and antiparticles) we can create have a certain amount of energy inherent to them, in the form of their rest mass. When you smash two particles together, some of that energy has to go into the individual components of those particles: their rest energy and their kinetic energy (i.e., their energy of motion).

But if you have enough energy, some of that energy can also go into the production of new particles! This is where E = mc² gets really interesting: not only do all particles with mass (m) have an energy (E) inherent to their existence, but if you have enough available energy, you can create new particles. At the LHC, humanity has achieved collisions with more energy available for creating new particles than in any other laboratory in history.
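As a concrete toy illustration of E = mc² as a recipe for making particles (and not a statement about how often they are actually produced), the snippet below converts the Higgs boson's measured rest energy of about 125 GeV into a mass in kilograms, and compares it with the roughly 13 TeV of total collision energy available at the LHC during its recent runs.

```python
# Toy illustration of E = m c^2: how much energy does one Higgs boson "cost"?
HIGGS_REST_ENERGY_GEV = 125.0      # measured Higgs rest energy, ~125 GeV
COLLISION_ENERGY_GEV = 13_000.0    # ~13 TeV total collision energy (14 TeV at design)
GEV_TO_JOULES = 1.602176634e-10    # 1 GeV expressed in joules
C = 299_792_458.0                  # speed of light, m/s

higgs_energy_joules = HIGGS_REST_ENERGY_GEV * GEV_TO_JOULES
higgs_mass_kg = higgs_energy_joules / C**2          # m = E / c^2

print(f"Higgs rest energy: {higgs_energy_joules:.2e} J")
print(f"Higgs mass:        {higgs_mass_kg:.2e} kg")  # ~2.2e-25 kg
print(f"Collision energy / Higgs rest energy ≈ "
      f"{COLLISION_ENERGY_GEV / HIGGS_REST_ENERGY_GEV:.0f}")
```

In practice only a fraction of that collision energy is available for making heavy particles, because the quarks and gluons doing the colliding each carry only part of their proton's energy; the point is simply that the energy budget comfortably covers particles as heavy as the Higgs.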

There was a wide variety of potential new-physics signatures that physicists have been seeking at the LHC, from extra dimensions to dark matter to supersymmetric particles to micro black holes. Despite all the data we've collected from these high-energy collisions, none of these scenarios have shown evidence supporting their existence. CERN / ATLAS Experiment

The energy per particle is around 7 TeV, meaning each proton achieves about 7,000 times its rest-mass energy in the form of kinetic energy. But collisions are rare, and protons are not only tiny but mostly empty space. To have a high probability of a collision, you need to put in more than a couple of protons at a time; instead, you inject your protons in bunches.

At full intensity, this means there are many small bunches of protons going clockwise and counterclockwise inside the LHC whenever it's running. The LHC tunnel is about 26 kilometers in circumference, with only about 7.5 meters (around 25 feet) separating each bunch. As these beams circulate in opposite directions, they cross and interact at the middle of each detector. Every 25 nanoseconds, there is a chance of a collision.
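The 25-nanosecond figure follows directly from the bunch spacing and the speed of the beams. Here is a quick sanity check using the round numbers quoted above (the real ring is about 26.7 km around, and only a subset of the bunch slots is actually filled with protons):

```python
C = 299_792_458.0          # speed of light, m/s
RING_LENGTH_M = 26_000.0   # ~26 km circumference, as quoted above
BUNCH_SPACING_M = 7.5      # ~7.5 m between successive bunches

crossing_interval_s = BUNCH_SPACING_M / C        # time between bunch crossings
bunch_slots = RING_LENGTH_M / BUNCH_SPACING_M    # how many bunches fit around the ring
crossings_per_second = 1.0 / crossing_interval_s

print(f"Time between bunch crossings: {crossing_interval_s * 1e9:.0f} ns")   # ~25 ns
print(f"Bunch slots around the ring:  {bunch_slots:,.0f}")                   # a few thousand
print(f"Potential crossings per second: {crossings_per_second:.2e}")         # ~40 million
```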

The CMS detector at CERN, one of the two most powerful particle detectors ever assembled. Every 25 nanoseconds, on average, a new bunch of particles collides at the center of the detector. CERN

So what do you do? Do you go for a small number of collisions and record every single one? That would be a waste of energy and of potential data.

Instead, you pump enough protons into each bunch to ensure you get a good collision every time two bunches pass through one another. And every time you get a collision, particles rip through the detector in all directions, triggering the complex electronics and circuitry that allow us to reconstruct what was created, when, and where within the detector. It's like a giant explosion, and only by measuring all the shrapnel that comes out can we reconstruct what happened (and what new things may have been created) at the point of impact.

A Higgs boson event as seen in the Compact Muon Solenoid detector at the Large Hadron Collider. This spectacular collision is 15 orders of magnitude below the Planck energy, but it's the precision measurements of the detector that allow us to reconstruct what happened back at (and near) the collision point. CERN / CMS Collaboration

The problem, then, is taking all of that data and recording it. The detectors themselves are big: 22 meters long for CMS and 46 meters long for ATLAS. At any given time, there are particles arising from three different collisions inside CMS and from six separate collisions inside ATLAS. In order to record the data, two steps must occur:

  1. The data must be moved into the detector's memory, which is limited by the speed of the electronics. Even though the electrical signals travel at nearly the speed of light, we can only "remember" about 1 collision out of every 1,000.
  2. The data in memory must be written to disk (or some other permanent storage device), which is a much slower process than storing data in memory; only about 1 in 1,000 of the collisions held in memory can be written to disk.

This is why, with these two steps combined, only about 0.0001% of the total data can be saved for analysis.
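The 0.0001% figure is simply the two selection stages multiplied together; a sketch of that arithmetic is below, using the approximate 1-in-1,000 factors quoted in the list above rather than exact trigger rates:

```python
# Two sequential selection stages, each keeping roughly 1 collision in 1,000
memory_fraction = 1 / 1_000   # stage 1: readout into the detector's memory
disk_fraction = 1 / 1_000     # stage 2: decision to write memory out to disk

saved_fraction = memory_fraction * disk_fraction

print(f"Fraction of collisions saved: {saved_fraction:.0e}")              # 1e-06
print(f"As a percentage:              {saved_fraction * 100:.4f} %")      # 0.0001 %
print(f"Collisions discarded per one saved: {1 / saved_fraction:,.0f}")   # 1,000,000
```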

A candidate Higgs event in the ATLAS detector. Note that even with the clear signatures and transverse tracks, there is a shower of other particles; this is due to the fact that protons are composite particles. This is only the case because the Higgs gives mass to the fundamental constituents that compose these particles. ATLAS Collaboration / CERN

How do we know we're saving the right data? The data most likely to contain newly created particles, to reveal the significance of new interactions, or to exhibit new physics?

When you have proton-proton collisions, most of what comes out are normal particles, in the sense that they're composed almost exclusively of up and down quarks. (That means particles like protons, neutrons, and pions.) And most collisions are glancing collisions, meaning that most of the particles wind up hitting the detector in the forward or backward direction.

Particle accelerators on Earth, like the LHC at CERN, can accelerate particles very close to, but not quite up to, the speed of light. Because protons are composite particles and they move so close to the speed of light, most particle collisions result in particles scattered in the forward or backward direction, rather than in transverse events. LHC / CERN

So, to take that first step, we look for tracks of relatively high-energy particles heading in the transverse direction, rather than forward or backward. We try to put into the detector's memory the events that we think had the most available energy (E) for creating new particles of the highest possible mass (m). Then, we quickly perform a computational analysis of what's in the detector's memory to decide whether it's worth writing to disk or not. If we choose to do so, that is the only thing the detector can write down for about the next 1/40th of a second or so.
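To make that selection logic concrete, here is a deliberately simplified, hypothetical "trigger" in a few lines of Python. It is not the ATLAS or CMS trigger menu; it only illustrates the idea described above: keep an event if any reconstructed track carries enough momentum transverse to the beam, since that is the signature of energy available for making heavy particles. The 25 GeV threshold and the Track class are made up for this sketch.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Track:
    """A toy reconstructed track: just its transverse momentum, in GeV."""
    pt_gev: float

def passes_toy_trigger(tracks: List[Track], pt_threshold_gev: float = 25.0) -> bool:
    """Keep the event if any track is energetic in the transverse direction.

    The 25 GeV threshold is an illustrative, made-up number; real trigger
    menus combine many thresholds and object types (muons, jets, photons...).
    """
    return any(track.pt_gev >= pt_threshold_gev for track in tracks)

# A "glancing" event (low transverse momentum everywhere) gets discarded;
# an event with one hard transverse track is kept for the next selection stage.
glancing_event = [Track(0.8), Track(2.1), Track(4.5)]
interesting_event = [Track(1.2), Track(60.0), Track(3.3)]

print(passes_toy_trigger(glancing_event))     # False -> discarded
print(passes_toy_trigger(interesting_event))  # True  -> kept in memory
```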

1/40th of a second might not sound like much, but it's about 25,000,000 nanoseconds: enough time for a million bunches to collide.
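Again, the arithmetic is easy to check directly, using the article's own round numbers for the crossing interval and the write time:

```python
WRITE_TIME_S = 1 / 40          # ~1/40th of a second to write one event to disk
CROSSING_INTERVAL_S = 25e-9    # ~25 ns between bunch crossings

crossings_missed_per_write = WRITE_TIME_S / CROSSING_INTERVAL_S

print(f"Write time: {WRITE_TIME_S * 1e9:,.0f} ns")                             # 25,000,000 ns
print(f"Bunch crossings during one write: {crossings_missed_per_write:,.0f}")  # 1,000,000
```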

Particle tracks arising from a high-energy collision at the LHC in 2014. Only about 1-in-1,000,000 such collisions have been written down and recorded; the majority have been lost.

We believe we're making the right choice about what we save, but we can't be sure. In 2010, the CERN Data Centre passed a major data milestone: 10 petabytes of data. By the end of 2013, they had passed 100 petabytes of data; in 2017, they passed the 200 petabyte milestone. Yet for all of that, we know we've thrown away – or failed to record – about 1,000,000 times that amount. We may have collected hundreds of petabytes, but we've discarded, and lost forever, hundreds of zettabytes: more than the total amount of internet data created in a year.
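The jump from "hundreds of petabytes kept" to "hundreds of zettabytes lost" is just that factor of a million applied to the recorded total; a quick check of the units, using the figures quoted above:

```python
# Rough scale of recorded vs. discarded data, using the figures quoted above
RECORDED_PETABYTES = 200          # ~200 PB recorded by 2017
DISCARD_FACTOR = 1_000_000        # ~1,000,000 collisions lost per collision kept

discarded_petabytes = RECORDED_PETABYTES * DISCARD_FACTOR
discarded_zettabytes = discarded_petabytes / 1_000_000   # 1 ZB = 1,000,000 PB

print(f"Discarded data: ~{discarded_petabytes:,.0f} PB "
      f"(~{discarded_zettabytes:,.0f} ZB)")               # ~200 ZB
```

For scale, estimates of global internet traffic around the time this was written were on the order of a zettabyte or two per year, which is the comparison the paragraph above is making.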

The total amount of data produced in collisions at the LHC far exceeds the total amount of data sent and received over the internet in the last 10 years. But only 0.0001% of that data has ever been written down and recorded; the rest is gone for good.

It's eminently possible that the LHC has created new particles, seen evidence of new interactions, and observed and recorded all the signs of new physics. And it's also possible that, owing to our ignorance of what we were looking for, we've thrown it all away, and will continue to do so. The nightmare scenario – of no new physics beyond the Standard Model – appears to be coming true. But the real nightmare is the very real possibility that the new physics is there, that we've built the perfect machine to find it, that we've found it, and that we'll never realize it because of the decisions and assumptions we've made. The real nightmare is that we've fooled ourselves into believing the Standard Model is correct, simply because we've only looked at one millionth of the data that's out there. Perhaps the nightmare is one of our own making.

">

The ATLAS Particle Detector of the Large Hadron Collider (LHC) of the European Center for Nuclear Research (CERN) in Geneva, Switzerland. Built in an underground tunnel 27 km in circumference, the CERN LHC is the largest and most powerful particle collider in the world and the largest machine in the world. It can only record a tiny fraction of the data collected.

Above the large hadron collider, the protons rotate simultaneously in a clockwise and anti-clockwise direction, hitting the speed of light at 99.9999991% each. On two specific points designed to have the largest number of collisions, huge particle detectors were built and installed: the CMS and ATLAS detectors. After billions of billions of collisions with these huge energies, the LHC has taken us further in our quest for the fundamental nature of the Universe and our understanding of the elemental constituents of matter.

Earlier this month, the LHC celebrated 10 years of operation, with the discovery of the Higgs boson marking its crowning glory. Yet, despite these successes, no new particle, interaction, decay or fundamental physics has been found. Worst of all, most CERN data from the LHC have been permanently lost.

The CMS collaboration, whose detector is shown before the final assembly, has released its latest most comprehensive results. There is no indication of physics beyond the standard model in the results.CERN / Maximlien Brice

It is one of the least understood pieces of the puzzle of high energy physics, at least among the general public. The LHC has not only lost most of its data: it has lost 99.9999%. That's true; over a million collisions occurring at the LHC, only one of them has all its data written and recorded.

It's something that has happened out of necessity, because of the limitations imposed by the laws of nature themselves, as well as what technology can do right now. But in making this decision, there is a tremendous fear made all the more palpable by the fact that, aside from the highly anticipated Higgs, nothing new has been discovered. The fear is this: there is a new physics to discover, but we missed that data.

A four-muon candidate event in the ATLAS detector of the Large Hadron Collider. Muon / anti-muon tracks are highlighted in red because long-lived muons travel farther than any other unstable particle. This is an interesting event, but for every event recorded, a million others are rejected.ATLAS / CERN Collaboration

We had no choice in this area, really. Something had to be thrown away. The function of the LHC is to accelerate the protons as close as possible to the speed of light in opposite directions and to break them together. This is how particle accelerators have worked best for generations. According to Einstein, the energy of a particle is a combination of its mass of rest (which you can recognize as E = mc2) and the energy of its movement, also called kinetic energy. The faster you go – or more precisely, the closer you get to the speed of light – more energy per particle can be achieved.

At the LHC, we encounter protons together at 299,792,455 m / s, at only 3 m / s of the speed of light itself. By breaking them together at such speeds, moving in opposite directions, we allow particles otherwise impossible to exist.

Inside the LHC, where the protons intersect at 299,792,455 m / s, just 3 m / s of the speed of light.Julian Herzog / c.c.a-by-3.0

The reason is that all the particles (and antiparticles) that we can create have a certain amount of inherent energy, in the form of their mass at rest. When you break two particles together, part of that energy must enter into the individual components of these particles, both their resting energy and their kinetic energy (ie their energy of motion).

But if you have enough energy, some of this energy can also enter the production of new particles! It's here that E = mc2 becomes really interesting: not only all particles with a mass (m) have energy (E) inherent to their existence, but if you have enough energy available, you can create new particles. At the LHC, humanity has reached collisions with more energy available for the creation of new particles than in any other laboratory in history.

There was a wide variety of potential new physical signatures that physicists were looking for at the LHC, from extra dimensions to dark matter to supersymmetric particles and black micro-holes. Despite all the data we collected from these high-energy collisions, none of these scenarios showed any evidence supporting their existence.CERN / ATLAS Experience

The energy per particle is about 7 TeV, which means that each proton reaches about 7000 times its resting mass energy as kinetic energy. But collisions are rare and protons are not only tiny, but mostly empty spaces. To have a high probability of collision, you must place several protons at a time. you inject your protons into the packages instead.

At full intensity, this means that there are many small clusters of protons clockwise and counterclockwise within the LHC every time it turns. The LHC tunnels are about 26 kilometers long, with only 7.5 meters (or about 25 feet) separating each group. As these beams move, they collapse as they interact in the middle of each detector. Every 25 nanoseconds, there is a risk of collision.

The CERN CMS detector, one of the two most powerful particle detectors ever assembled. Every 25 nanoseconds on average, a new group of particles collides with the center of the detector.CERN

So what are you doing? Do you have a small number of collisions and register all? This is a waste of energy and potential data.

Instead, you pump enough protons in each group to make sure you have a good collision every time two packets pass. And every time you have a collision, the particles pass through the detector in all directions, triggering the complex electronics and circuits that allow us to reconstruct what has been created, when and where in the detector. It's like a giant explosion, and it's only by measuring all the splinters that can come out that we can piece together what has happened (and what new things have been created) at the moment of ignition.

An event of the Higgs boson, as seen in the compact muon solenoid detector of the Large Hadron Collider. This spectacular collision is 15 orders of magnitude below Planck's energy, but it is the precision measurements of the detector that allow us to reconstruct what happened at the point of collision (and nearly of it).CERN / CMS Collaboration

The problem then is to take all these data and record them. The detectors themselves are big: 22 meters for CMS and 46 meters for ATLAS. At any time, there are particles from three different collisions in CMS and six separate collisions in ATLAS. To save data, two steps must be taken:

  1. The data must be moved into the detector's memory, which is limited by the speed of your electronic components. Even at the speed of light, we can only "remember" collisions out of 1,000.
  2. The data in memory must be written to the disk (or other permanent device), which is much slower than storing data in memory. Only about 1 collision per 1,000 that memories can store on disk.

This is why, with the need to follow these two steps, only 0.0001% of the total data can be saved for analysis.

A candidate Higgs event in the ATLAS detector. Note how even with clear signatures and cross tracks, there is a rain of other particles; this is because protons are composite particles. This is the case only because the Higgs gives mass to the fundamental constituents that make up these particles.The ATLAS / CERN collaboration

How do we know that we are backing up the right data? Most likely to create new particles, to see the importance of new interactions or to observe new physics?

When you have proton-proton collisions, most of it comes out of normal particles, in that they are composed almost exclusively of quarks from top to bottom. (This means that particles like protons, neutrons and pions.) And most collisions are collisions, which means that most of the particles end up hitting the detector forwards or backwards.

Accelerators of particles on Earth, such as the LHC at CERN, can accelerate particles very close, but not quite, to the speed of light. Because protons are composite particles and they move so close to the speed of light, most particle collisions result in forward or backward scattering of the particles. particles and not by transverse events.LHC / CERN

So, to do this first step, we try to look for traces of relatively high energy particles that go in the transverse direction rather than forward or backward. We try to put in the memory of the detector the events which, in our opinion, have the most energy available (E) to create new particles of the highest mass (m) possible Next, we quickly perform a computational analysis of what is in the detector's memory to see if it is worth writing on the disk or not. If we choose to do it, that's the only thing the detector will write for about 1 / 40th of a second or so.

1 / 40th of a second may seem insufficient, but it is about 25,000,000 nanoseconds: enough time for a million packets to collide.

Particle traces resulting from a high-energy collision at the LHC in 2014. Only 1 000 000 such collisions were recorded and recorded; the majority was lost.

We believe that we are making the right choice by choosing to save what we save, but we can not be sure. In 2010, the CERN Data Center achieved a milestone in data: 10 petabytes of data. By the end of 2013, they had spent 100 petabytes of data; in 2017, they passed the 200 petabytes mark. Yet we know that we have thrown – or failed to record – about 1,000,000 times that amount. We may have collected hundreds of petabytes, but we rejected and lost forever hundreds of zettabytes: more than the total amount of Internet data created over a year.

The total amount of data collected by the LHC far exceeds the total amount of data sent and received over the last 10 years. But only 0.0001% of this data was written and recorded; the rest is gone for good.

It is quite possible that the LHC has created new particles, seen evidence of new interactions and observed and recorded all signs of new physics. And it is also possible, because of our ignorance of what we were looking for, we have thrown everything away and will continue to do it. The nightmare scenario – without new physics beyond the standard model – seems to be coming true. But the real nightmare is the very real possibility that new physics is present, we have built the perfect machine to find it, we have found it and we will never achieve it because of the decisions and assumptions we have made . . The real nightmare is that we were mistaken in believing that the standard model is correct because we have only examined one millionth of the available data. Perhaps the nightmare is a nightmare that we brought ourselves.

[ad_2]
Source link