Inside the Large Hadron Collider, protons circulate simultaneously in clockwise and counterclockwise directions, each reaching 99.9999991% of the speed of light. At two specific points designed to host the largest number of collisions, huge particle detectors were built and installed: the CMS and ATLAS detectors. After billions upon billions of collisions at these enormous energies, the LHC has taken us further than ever in our quest for the fundamental nature of the Universe and our understanding of the elementary constituents of matter.
Earlier this month, the LHC celebrated 10 years of operation, with the discovery of the Higgs boson marking its crowning achievement. Yet despite these successes, no new particles, interactions, decays, or fundamental physics have been found. Worst of all, most of the data from the LHC at CERN has been lost forever.
It is one of the least understood pieces of the high-energy physics puzzle, at least among the general public. The LHC hasn't just lost most of its data: it has lost a whopping 99.9999% of it. That's right: of every one million collisions that occur at the LHC, only about one has its full data written down and recorded.
This happened out of necessity, owing to limitations imposed by the laws of nature themselves, as well as by what today's technology can do. But in making that choice, there is a tremendous fear, made all the more palpable by the fact that, aside from the long-anticipated Higgs, nothing new has been discovered. The fear is this: that there is new physics waiting to be discovered, but that we have missed it by throwing that data away.
We really had no choice in the matter; something had to be thrown away. The LHC works by accelerating protons as close as possible to the speed of light in opposite directions and smashing them together. This is how particle accelerators have worked best for generations. According to Einstein, a particle's energy is a combination of its rest-mass energy (which you may recognize as E = mc²) and the energy of its motion, also known as kinetic energy. The faster you go – or more precisely, the closer you get to the speed of light – the more energy per particle you can achieve.
At the LHC, we collide protons together at 299,792,455 m/s, just 3 m/s shy of the speed of light itself. By smashing them together at such speeds, moving in opposite directions, we make it possible to create particles that otherwise could not exist.
The reason is that every particle (and antiparticle) we can create has a certain amount of energy inherent to it, in the form of its rest mass. When you smash two particles together, some of that energy has to go into the individual components of those particles, both their rest energy and their kinetic energy (i.e., their energy of motion).
But if there's enough energy available, some of it can also go into the production of new particles! This is where E = mc² gets really interesting: not only do all particles with a mass (m) have an energy (E) inherent to their existence, but if you have enough available energy, you can create new particles. At the LHC, humanity has achieved collisions with more energy available for the creation of new particles than in any other laboratory in history.
The energy per particle is around 7 TeV, meaning each proton carries roughly 7,000 times its rest-mass energy in the form of kinetic energy. But collisions are rare, and protons are not only tiny but mostly empty space. To have a high probability of a collision, you need to put in more than one proton at a time; instead, you inject your protons in bunches.
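Those round figures can be sanity-checked with a few lines of Python. This is just a back-of-the-envelope sketch using the speed quoted above and an approximate proton rest-mass energy of 0.938 GeV; the exact beam energy differs between LHC runs:

```python
import math

# Back-of-the-envelope check of the energy claim above (illustrative only).
c = 299_792_458.0   # speed of light, m/s
v = 299_792_455.0   # proton speed quoted above, 3 m/s shy of c
m_p_GeV = 0.938     # approximate proton rest-mass energy, GeV

gamma = 1.0 / math.sqrt(1.0 - (v / c) ** 2)   # relativistic Lorentz factor
energy_TeV = gamma * m_p_GeV / 1000.0         # total energy per proton

print(f"Lorentz factor: ~{gamma:,.0f}")             # ~7,000
print(f"Energy per proton: ~{energy_TeV:.1f} TeV")  # ~6.6 TeV, in line with the ~7 TeV quoted
```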
At full intensity, this means there are many small bunches of protons circulating clockwise and counterclockwise inside the LHC whenever it runs. The LHC tunnel is about 26 kilometers around, with only about 7.5 meters (roughly 25 feet) separating each bunch. As the beams circulate, the bunches cross and interact at the center of each detector. Every 25 nanoseconds, there is a chance of a collision.
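A quick check with the numbers in this paragraph shows where the 25-nanosecond figure comes from; the slot count below is only an upper bound, since the real filling scheme leaves many slots empty:

```python
# Bunch-crossing cadence from the figures above (rough, illustrative numbers).
c = 299_792_458.0         # speed of light, m/s
ring_length_m = 26_000.0  # ~26 km circumference
bunch_spacing_m = 7.5     # separation between successive bunches

crossing_interval_ns = bunch_spacing_m / c * 1e9   # time between bunch crossings
bunch_slots = ring_length_m / bunch_spacing_m      # how many bunches could fit

print(f"Time between crossings: ~{crossing_interval_ns:.0f} ns")  # ~25 ns
print(f"Bunch slots around the ring: ~{bunch_slots:,.0f}")        # ~3,500 at most
```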
So what do you do? Run with a small number of collisions and record every single one? That would be a waste of energy and of potential data.
Instead, you pump enough protons into each bunch to make sure you get a good collision every time two bunches pass through one another. And every time there is a collision, particles fly through the detector in all directions, triggering the sophisticated electronics and circuitry that allow us to reconstruct what was created, when, and where within the detector. It's like a giant explosion, and only by measuring all the shrapnel that comes out can we piece together what happened (and what new things may have been created) at the point of impact.
The problem, then, is taking all of that data and recording it. The detectors themselves are huge: 22 meters long for CMS and 46 meters for ATLAS. At any given time, there are particles arising from three different collisions inside CMS and from six separate collisions inside ATLAS. To record the data, two steps must happen:
- The data must be moved into the detector's memory, which is limited by the speed of the electronics. Even at the speed of light, we can only "remember" about 1 out of every 1,000 collisions.
- The data in memory must be written to disk (or another permanent storage device), which is a much slower process than holding data in memory; only about 1 in every 1,000 collisions that make it into memory can be written to disk.
This is why, with both of these steps required, only about 0.0001% of the total data can be saved for analysis.
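Chaining those two 1-in-1,000 steps together is simple arithmetic. The sketch below uses this article's round numbers rather than the detectors' exact trigger rates, and shows both the surviving fraction and what it implies for the rate of events written to disk:

```python
# The two-step data reduction described above, in round numbers (illustrative).
crossing_rate_hz = 1.0 / 25e-9   # a potential collision every 25 ns -> 40 MHz
kept_in_memory = 1.0 / 1000.0    # step 1: fraction that makes it into detector memory
written_to_disk = 1.0 / 1000.0   # step 2: fraction of those that reach permanent storage

surviving_fraction = kept_in_memory * written_to_disk
events_per_second = crossing_rate_hz * surviving_fraction

print(f"Fraction saved: {surviving_fraction:.4%}")             # 0.0001%
print(f"Events written per second: ~{events_per_second:.0f}")  # ~40
```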
How do we know we are saving the right data? The data most likely to reveal the creation of new particles, to show the signatures of new interactions, or to display new physics?
When you have proton-proton collisions, most of what comes out consists of ordinary particles, in the sense that they are composed almost exclusively of up and down quarks. (That means particles like protons, neutrons, and pions.) And most collisions are glancing ones, which means most of the particles end up hitting the detector in the forward or backward direction.
So, for that first step, we look for tracks from relatively high-energy particles travelling in the transverse direction rather than forward or backward. We try to put into the detector's memory the events that, in our judgment, had the most available energy (E) to create new particles of the highest possible mass (m). Then we quickly perform a computational analysis of what's in the detector's memory to decide whether it's worth writing to disk or not. If we choose to write it out, that is the only thing the detector will write for about 1/40th of a second or so.
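To make that first selection step concrete, here is a deliberately oversimplified toy in Python. It is not the actual CMS or ATLAS trigger logic: the 20 GeV threshold, the track format, and the function names are all invented for illustration. It only captures the idea described above of keeping events that contain something energetic moving transverse to the beam.

```python
import math

# Toy "trigger": keep an event only if at least one track has large momentum
# transverse to the beam (z) axis. Purely illustrative; the real CMS/ATLAS
# trigger systems are far more sophisticated, and the 20 GeV cut is made up.

def transverse_momentum(px, py):
    """Momentum component perpendicular to the beam axis, in GeV."""
    return math.hypot(px, py)

def keep_event(tracks, pt_threshold=20.0):
    """tracks: list of (px, py, pz) in GeV. Keep if any track is transverse and energetic."""
    return any(transverse_momentum(px, py) >= pt_threshold for px, py, _pz in tracks)

# Mostly forward/backward debris gets dropped; an event with a hard transverse
# track is kept for the next, slower decision stage.
soft_event = [(0.3, 0.1, 900.0), (0.2, -0.4, 1200.0)]
hard_event = [(35.0, 12.0, 80.0), (0.1, 0.2, 600.0)]
print(keep_event(soft_event))  # False
print(keep_event(hard_event))  # True
```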
1/40th of a second may not sound like much, but it's about 25,000,000 nanoseconds: enough time for a million bunches to cross.
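The arithmetic behind that statement is easy to verify with the same round figures:

```python
# How many bunch crossings go by while one event is being written out?
write_time_s = 1.0 / 40.0     # ~1/40th of a second to write one event
crossing_interval_s = 25e-9   # a potential collision every 25 ns

print(f"One write takes ~{write_time_s * 1e9:,.0f} ns")                           # 25,000,000 ns
print(f"Crossings during that time: ~{write_time_s / crossing_interval_s:,.0f}")  # 1,000,000
```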
We believe we are making the right choices about what to save, but we can't be sure. In 2010, the CERN Data Centre reached a data milestone: 10 petabytes of data. By the end of 2013, they had passed 100 petabytes; in 2017, they passed the 200-petabyte mark. Yet we know that we have thrown away – or failed to record – about 1,000,000 times that amount. We may have collected hundreds of petabytes, but we have discarded, and lost forever, hundreds of zettabytes: more than the total amount of internet data created in a year.
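As a rough check on those magnitudes, using the figures above and glossing over exactly how one counts raw detector output:

```python
# Scale of what was never recorded, using this article's round figures (illustrative).
recorded_pb = 200.0           # ~200 petabytes on disk by 2017
discard_factor = 1_000_000    # roughly a million times more thrown away than kept
unrecorded_zb = recorded_pb * discard_factor / 1e6   # 1 zettabyte = 1,000,000 petabytes

print(f"Data never written down: ~{unrecorded_zb:.0f} zettabytes")  # ~200 ZB
```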
It is entirely possible that the LHC has created new particles, seen evidence of new interactions, and observed and recorded all the signs of new physics. And it is also possible that, out of ignorance of what we were looking for, we have thrown it all away and will continue to do so. The nightmare scenario – no new physics beyond the Standard Model – appears to be coming true. But the real nightmare is the very real possibility that new physics is out there, that we built the perfect machine to find it, that we found it, and that we will never realize it because of the decisions and assumptions we made. The real nightmare is wrongly concluding that the Standard Model is all there is, when in fact we have only examined one-millionth of the available data. Perhaps the nightmare is one we have brought upon ourselves.