The world's first black hole photo was an epic feat of data storage




On Wednesday, astronomers announced that they had captured the world's first image of a black hole – and that the internet couldn't handle it. No, we're not talking about Shrek black hole memes or snarky opinion pieces about how the image of this object 55 million light-years away was "so blurry." We're talking about how the internet literally could not handle the amount of data collected by the eight telescopes on five continents that make up the Event Horizon Telescope, which captured this image of the black hole at the center of the galaxy Messier 87.

Instead, the huge volume of data collected by the radio antennas had to be flown to central data centers, where it could be cleaned up and analyzed. So in addition to being a massive feat of ingenuity and understanding that confirmed several theories about black holes, the M87 black hole image was also a Herculean feat of data storage and management.

Over seven days in April 2017, the EHT experiment pointed its eight telescopes at M87. Synchronized by custom atomic clocks, they all began collecting incoming radio signals from the distant black hole and recording the data on high-speed recorders built for the task.

"There is no Internet that can compete with 5 petabytes of data on an airplane."

"We had 5 petabytes of recorded data," Dan Marrone, Ph.D., an associate professor of astronomy at the University of Arizona, specializes in storing data for the EHT experiment, told reporters Wednesday.

"That's more than half a ton of hard drives. Five petabytes is a lot data: this equates to 5,000 years of MP3 files. "

Here's why and how this photo required the data equivalent of 1.39 billion copies of Lil Nas X's "Old Town Road."
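Those comparisons are easy to sanity-check with back-of-the-envelope arithmetic. Here's a minimal sketch in Python; the ~256 kbps MP3 bitrate and the song's 1-minute-53-second runtime are our own assumptions, not figures from the EHT team:

```python
# Back-of-the-envelope check of the article's data comparisons.
# The ~256 kbps MP3 bitrate and 1:53 song runtime are assumptions.
PETABYTE = 10**15                              # bytes
total_bytes = 5 * PETABYTE

mp3_bytes_per_sec = 256_000 / 8                # 256 kbps -> 32,000 B/s
seconds_per_year = 365.25 * 24 * 3600
years_of_mp3 = total_bytes / (mp3_bytes_per_sec * seconds_per_year)

song_bytes = 113 * mp3_bytes_per_sec           # "Old Town Road" at 1:53
copies = total_bytes / song_bytes

print(f"~{years_of_mp3:,.0f} years of MP3 audio")  # ~4,951 years
print(f"~{copies / 1e9:.2f} billion copies")       # ~1.38 billion
```

Both figures land within a percent or so of the ones quoted, which suggests the comparisons assume a fairly high-bitrate MP3.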

The ALMA observatory in Chile is one of the eight telescopes that imaged the M87 black hole.

Eight synchronized telescopes

The EHT experiment used a technique called very long baseline interferometry, which uses the eight simultaneously recording telescopes to essentially turn the Earth into a single giant rotating dish. Each of these telescopes recorded raw incoming radio signals in the form of huge volumes of data.

In other words, it's as if eight people were shooting video of the same distant phenomenon from different angles, then combining all their footage to make one really clear video. In this scenario, the object is extremely far away, and the camera operators are spread far apart.

The advantage of these long baselines between telescopes is that, combined with the Earth's rotation, they give scientists simultaneous views of the black hole from eight different angles.
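To see why widely separated, precisely synchronized recordings are so valuable, consider the core trick of interferometry: the same wavefront reaches each station at a slightly different time, and cross-correlating the recordings recovers that delay. The sketch below is a toy illustration of that one idea, not the EHT pipeline; the signal, noise level, and 37-sample delay are all made up:

```python
import numpy as np

# Toy illustration of the delay measurement at the heart of
# interferometry (not the EHT pipeline). Two stations record the
# same signal, offset by a made-up 37-sample geometric delay.
rng = np.random.default_rng(0)
n, true_delay = 4096, 37

signal = rng.normal(size=n)                       # wavefront from the sky
station_a = signal + 0.5 * rng.normal(size=n)     # station A's noisy record
station_b = np.roll(signal, true_delay) + 0.5 * rng.normal(size=n)

# The peak of the cross-correlation marks the delay between stations,
# the geometric information that makes Earth-sized imaging possible.
xcorr = np.correlate(station_b, station_a, mode="full")
estimated_delay = xcorr.argmax() - (n - 1)
print(estimated_delay)                            # 37
```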

Data cleaning

Once the 1,000 pounds of hard drives were filled with those 5 petabytes of raw data, they were loaded onto planes and flown to two centralized "correlators" in Massachusetts and Germany.

"The fastest way to do it is not via the Internet, but to get them on planes," said Marrone. "There is no Internet that can compete with 5 petabytes of data on an airplane."

Adding to the challenge, scientists had to wait until summer to ship the hard drives from the South Pole Telescope, since the observations were made during the Antarctic winter.

The correlators then set about synchronizing all of the telescopes' data. That means the supercomputers took all the raw observations collected by the telescopes and used the atomic clock information to align them, creating a continuous record of the black hole's light wavefront as it reached Earth.
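As a toy picture of what that alignment involves, here is a sketch that places two stations' recordings onto one common time grid using their clock timestamps. The 1 MHz sample rate and the helper function are hypothetical; real correlators work on streams sampled at vastly higher rates:

```python
import numpy as np

# Toy sketch of timestamp alignment (not the EHT correlator). Each
# recorder tags its samples with atomic-clock time, so recordings that
# begin at different instants can be placed on one shared time grid.
RATE = 1_000_000  # hypothetical 1 MHz sample clock

def align(records):
    """records: list of (start_time_seconds, samples) tuples."""
    t0 = min(start for start, _ in records)
    t1 = max(start + len(s) / RATE for start, s in records)
    grid = np.full((len(records), round((t1 - t0) * RATE)), np.nan)
    for i, (start, s) in enumerate(records):
        offset = round((start - t0) * RATE)
        grid[i, offset:offset + len(s)] = s
    return grid  # rows are stations, columns are shared clock ticks

# Two stations whose recordings overlap but start 5 microseconds apart.
aligned = align([(0.000000, np.arange(10.0)),
                 (0.000005, np.arange(10.0) + 100)])
print(aligned.shape)  # (2, 15): 10 samples each, offset by 5 ticks
```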

Trading tools with Silicon Valley

Chi-Kwan Chan, Ph.D., a computational astrophysicist at the University of Arizona who worked on the computation for the M87 imaging project, tells Inverse that once the correlators had cleaned up the data, the task became much more granular.

"After this step, people usually use a workstation and do the calculations," he says. "But my contribution has been to integrate cloud computing technology into collaboration to launch many powerful virtual machines in the cloud to analyze and accelerate data."

Chan and his team developed this software, which helped the EHT team clean the data even further to create the final composite image, which weighs in at just a few hundred kilobytes. He hopes the technology industry will be able to use this software for its own network architectures in the future.

"In this sense, we are also giving back to society," he says.

Notably, the computers at the University of Arizona that Chan and his team used to run black hole simulations are built around graphics processing units, which are computationally very powerful. These are the same graphics cards that saw extreme demand as they became popular among cryptocurrency miners. So just as the black hole project created software that will live on elsewhere, it also drew inspiration from a very different corner of computing, all in the name of discovery.

Ironically, Chan's team used these powerful GPUs to simulate so many black holes before observing M87 that it already knew what to expect from the real one.

"We have created a huge library of black hole images," he says. "Because we saw a lot and we saw a lot of possibilities, we were not surprised to see the truth."
