Artificial intelligence helps improve NASA’s eyes on the sun




The top row of images shows the degradation of AIA’s 304 angstrom channel over the years since SDO’s launch. The bottom row of images is corrected for this degradation using a machine learning algorithm. Credit: Luiz dos Santos / NASA GSFC

A group of researchers is using artificial intelligence techniques to calibrate some of NASA’s images of the sun, helping to improve the data that scientists use for solar research. The new technique was published in the journal Astronomy & Astrophysics on April 13, 2021.

A solar telescope has a difficult job. Staring at the sun takes a toll, with the constant bombardment of an endless stream of solar particles and intense sunlight. Over time, the sensitive lenses and sensors of solar telescopes begin to degrade. To ensure that the data these instruments return is still accurate, scientists recalibrate them periodically to make sure they understand exactly how the instrument is changing.

Launched in 2010, NASA’s Solar Dynamics Observatory, or SDO, has provided high-resolution images of the sun for more than a decade. Its images have given scientists insight into various solar phenomena that can trigger space weather and affect astronauts and technology on Earth and in space. The Atmospheric Imaging Assembly, or AIA, is one of two imaging instruments on SDO and looks constantly at the sun, taking images across 10 wavelengths of ultraviolet light every 12 seconds. This creates an unparalleled wealth of information about the sun, but, like all sun-staring instruments, AIA degrades over time, and the data needs frequent calibration.

Seven of AIA’s ultraviolet wavelengths

This image shows seven of the ultraviolet wavelengths observed by the Atmospheric Imaging Assembly aboard NASA’s Solar Dynamics Observatory. The top row shows observations taken in May 2010 and the bottom row shows observations from 2019, without any corrections, revealing how the instrument degraded over time. Credit: Luiz dos Santos / NASA GSFC

Since SDO’s launch, scientists have used sounding rockets to calibrate AIA. Sounding rockets are smaller rockets that typically carry only a few instruments and make short flights into space, usually only about 15 minutes. Crucially, sounding rockets fly above most of Earth’s atmosphere, allowing the instruments on board to see the ultraviolet wavelengths measured by AIA. These wavelengths of light are absorbed by Earth’s atmosphere and cannot be measured from the ground. To calibrate AIA, scientists attach an ultraviolet telescope to a sounding rocket and compare its data with AIA’s measurements. They can then make adjustments to account for any changes in AIA’s data.
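The core of this rocket-based calibration can be illustrated with a small sketch. The function names and numbers below are purely illustrative, not actual AIA values or code from the mission: a freshly calibrated rocket instrument and AIA observe the same intensity, their ratio gives a degradation factor, and later AIA measurements are rescaled by that factor.

```python
# Illustrative sketch of ratio-based calibration (all names and values
# are hypothetical, not real AIA data or mission software).

def degradation_factor(rocket_intensity: float, aia_intensity: float) -> float:
    """How much the AIA channel has dimmed relative to the freshly
    calibrated rocket instrument (1.0 means no degradation)."""
    return aia_intensity / rocket_intensity

def correct(aia_intensity: float, factor: float) -> float:
    """Rescale a raw AIA measurement to undo instrument degradation."""
    return aia_intensity / factor

# Example: the rocket instrument sees 100 units where AIA sees only 80,
# so the channel is operating at 80% sensitivity.
f = degradation_factor(rocket_intensity=100.0, aia_intensity=80.0)
print(f)                  # 0.8
print(correct(72.0, f))   # 90.0: a later raw value of 72 corrects to 90
```

The drawback the article describes follows directly: the factor is only measured at each rocket flight, so between flights it must be interpolated or, as in the new work, estimated from the images themselves.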

The sounding rocket calibration method has drawbacks. Sounding rockets can only launch so often, but AIA watches the sun constantly. That means there is downtime between sounding rocket calibrations, during which the calibration drifts slightly off.

“It’s also important for deep space missions, which won’t have the option of sounding rocket calibration,” said Dr. Luiz dos Santos, a solar physicist at NASA’s Goddard Space Flight Center in Greenbelt, Maryland, and lead author on the paper. “We’re tackling two problems at once.”

Virtual calibration

With these challenges in mind, scientists decided to look at other options for calibrating the instrument, with an eye toward constant calibration. Machine learning, a technique used in artificial intelligence, seemed a perfect fit.

As the name implies, machine learning requires a computer program, or algorithm, to use examples to learn how to perform its task.

Sun AIA 2021

The sun as seen by AIA in 304 Angstrom light in 2021, before degradation correction (see the image below, with corrections from a sounding rocket calibration). Credit: NASA GSFC

First, the researchers had to train a machine learning algorithm to recognize solar structures and how they change over time, using AIA data. To do this, they give the algorithm images from sounding rocket calibration flights and tell it the correct amount of calibration those images need. After enough of these examples, they give the algorithm similar images and see whether it identifies the correct calibration needed. With enough data, the algorithm learns to determine how much calibration each image needs.
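The training loop described above is standard supervised regression: images labelled with the calibration known from rocket flights, then prediction on unlabelled images. A minimal sketch follows, using synthetic "images" whose overall brightness encodes the degradation and a plain least-squares fit in place of the study's actual neural network; everything here is an illustrative stand-in, not the paper's model or data.

```python
# Minimal supervised-calibration sketch (illustrative only; the real
# study used deep learning on actual AIA images, not this toy model).
import numpy as np

rng = np.random.default_rng(0)

def make_image(factor: float, size: int = 8) -> np.ndarray:
    """A fake solar image dimmed by the given degradation factor."""
    return factor * (1.0 + 0.01 * rng.standard_normal((size, size)))

# Training set: images labelled with the calibration factor that a
# sounding rocket flight would have measured for them.
factors = rng.uniform(0.5, 1.0, 200)
X = np.stack([make_image(f).ravel() for f in factors])

# Fit a linear model mapping image pixels -> degradation factor.
w, *_ = np.linalg.lstsq(X, factors, rcond=None)

# A new, unlabelled image: predict the calibration it needs.
true_f = 0.7
pred = float(make_image(true_f).ravel() @ w)
print(round(pred, 2))
```

With enough labelled examples, the learned model can supply a calibration estimate for every new image, which is exactly the continuous calibration the rocket flights alone cannot provide.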

Sun AIA 2021 FIXED

The sun as seen by AIA in 304 Angstrom light in 2021, with degradation corrections from a sounding rocket calibration (see the previous image above, before the degradation correction). Credit: NASA GSFC

Since AIA looks at the sun in multiple wavelengths of light, researchers can also use the algorithm to compare specific structures across wavelengths and strengthen its assessments.

To start, they taught the algorithm what a solar flare looks like by showing it solar flares across all of AIA’s wavelengths until it could recognize solar flares in all types of light. Once the program can recognize a solar flare without any degradation, the algorithm can then determine how much degradation is affecting AIA’s current images and how much calibration each needs.
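The cross-wavelength idea lends itself to a small sketch: once the same structure, such as a flare, is recognized in several channels, a channel whose brightness has fallen relative to the others reveals its own degradation. The channel ratios below are invented for illustration, not real AIA brightness values.

```python
# Illustrative cross-channel degradation estimate (invented numbers,
# not real AIA measurements).

# Relative flare brightness per wavelength channel when the instrument
# was new (the cross-channel "fingerprint" of the structure).
baseline = {"94": 1.0, "171": 2.5, "304": 4.0}

# The same kind of flare as observed today: 304 Angstrom has dimmed.
observed = {"94": 1.0, "171": 2.5, "304": 2.0}

# Each channel's shortfall relative to its baseline is its degradation,
# and dividing by it restores the original brightness.
degradation = {ch: observed[ch] / baseline[ch] for ch in baseline}
corrected = {ch: observed[ch] / degradation[ch] for ch in observed}

print(degradation["304"])  # 0.5: the channel is at half sensitivity
print(corrected["304"])    # 4.0: corrected back to baseline brightness
```

Comparing structures across channels this way, rather than within a single wavelength, is what lets the algorithm separate genuine solar variation from instrument dimming.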

“That was the big thing,” dos Santos said. “Instead of just identifying structures at the same wavelength, we are identifying structures across wavelengths.”

This means researchers can be more certain of the calibration the algorithm identifies. Indeed, when comparing their virtual calibration data with the sounding rocket calibration data, the machine learning program was spot-on.

With this new process, researchers are poised to continuously calibrate AIA’s images between rocket calibration flights, improving the accuracy of SDO’s data for researchers.

Machine learning beyond the sun

Researchers are also using machine learning to better understand conditions closer to home.

A group of researchers led by Dr. Ryan McGranaghan, principal data scientist and aerospace engineer at ASTRA LLC and NASA’s Goddard Space Flight Center, used machine learning to better understand the connection between Earth’s magnetic field and the ionosphere, the electrically charged part of Earth’s upper atmosphere. By applying data science techniques suited to large data volumes, they used machine learning to develop a more accurate model of how energetic particles from space rain down into Earth’s atmosphere, where they drive space weather.

As machine learning advances, its scientific applications will expand to more and more tasks. For the future, this may mean that deep space missions, traveling to places where calibration rocket flights aren’t possible, can still be calibrated and continue to return accurate data, even as they get farther and farther from Earth or any star.

Reference: “Multichannel autocalibration for the Atmospheric Imaging Assembly using machine learning” by Luiz F. G. dos Santos, Souvik Bose, Valentina Salvatelli, Brad Neuberg, Mark C. M. Cheung, Miho Janvier, Meng Jin, Yarin Gal, Paul Boerner and Atılım Güneş Baydin, 13 April 2021, Astronomy & Astrophysics.
DOI: 10.1051/0004-6361/202040051


