By Andy Thurai, Emerging Technology Strategist
In 2014, NASA lost a crucial instrument housed on the Solar Dynamics Observatory (SDO) satellite that measured extreme UV radiation coming from the sun. With repair options costing anywhere from hundreds of millions to a billion dollars, a team from the NASA Frontier Development Lab and IBM turned to artificial intelligence and historical data to see if a well-trained model could fill the data void.
I was very intrigued when I heard about this project recently from my good friends at NASA FDL and IBM. What if artificial intelligence could decipher more than images of dogs, cats, and stop signs? What could we learn from looking at images of the sun?
Sunshine is Good for the Soul
The sun is cast as the creator of life in mythology, mysticism, and science alike. There may be other solar systems whose star provides as much to its planets, but the sun's impact on ours is immeasurable. Any change in the sun's output, however small, whether a sunspot, a solar flare, or a coronal mass ejection, affects the earth directly. When the sun misbehaves like that, it disrupts systems such as global navigation (GPS), satellites, radio, computers, cell phones, air traffic control, and the electric power grid.
For instance, the major solar storm of July 23, 2012, among the strongest ever recorded, missed the earth by about a week. If it had hit earth directly, it could have had a "catastrophic effect," blowing out much of the world's electrical, electronic, and communications infrastructure. A study by The National Academy of Sciences estimates that a direct hit by such a storm could cause damages as high as $2 trillion. A much weaker event in March 1989 knocked out power across the entire province of Quebec for about nine hours.
Solar Dynamics Observatory is a Cool Satellite
To keep tabs on such solar shenanigans, NASA launched the Solar Dynamics Observatory (SDO) satellite in 2010 at a cost of about $850 million. The SDO satellite collects various measurements from the sun in hopes of forecasting solar storms and mitigating their effects in and around earth’s space.
The SDO satellite has three major instrument components:
Atmospheric Imaging Assembly (AIA) – captures images of the solar atmosphere in up to 10 wavelengths every 10 seconds at IMAX-like resolution (about 10 times that of HD). In other words, it measures what is happening in the sun's atmosphere.
EUV Variability Experiment (EVE) – measures the sun's extreme ultraviolet (EUV) radiation to understand its influence on the climate of earth and near-earth space.
Helioseismic and Magnetic Imager (HMI) – studies the oscillations and the magnetic field at the solar surface, or photosphere.
Together, these three instruments monitored the sun continuously, producing about 1 TB of data every day.
But the Cool Satellite Broke!
In 2014, a critical component of the EVE instrument broke, and true EUV measurements were no longer available to satellite operators.
This was bad news. First, EUV varies more than any other part of the sun's spectrum, so a constant measurement from above earth's atmosphere gives us valuable insight. Second, EUV photons emanating from the sun are absorbed in the upper atmosphere and the ionosphere, so measuring above those layers is critical. Third, extreme variations in EUV have dramatic effects on the earth's outer atmosphere: they can cause it to balloon far beyond its normal extent, increasing drag on other satellites, with costly consequences.
Fixing the issue was prohibitively expensive: the options ranged from sending a manned mission to the satellite (upward of $500 million) to launching a new satellite (around $1 billion).
As those options were not viable, the scientists and engineers from NASA FDL, IBM, and Nimbix came up with a thought: could AI provide a solution?
AI to the Rescue
The three instruments on the SDO worked well from 2010 to mid-2014. Could deep learning neural networks predict the missing EVE data based on analyzing terabytes of data from the past four years with hundreds of possible models and variations?
“Imagine that you had listened to a symphony playing music for four years,” said Graham Mackintosh at NASA FDL, “and then one of the musicians suddenly stopped playing. Would you be able to mentally fill in the missing music from the performer who had gone silent? This is what the NASA FDL team wanted to do with the symphony of data coming from NASA’s Solar Dynamics Observatory.”
Lucky for the engineers, the AIA and the EVE had produced four years of harmonious data—high-resolution images of the sun and corresponding EUV measurements—with which to create and test models. As I preach often, AI is only as good as the quality and quantity of the data used to create the model. The research team ran a "machine learning bake-off," creating thousands of models to validate the hypothesis. After trying multiple architectures (linear models, multi-layer perceptrons (MLP), convolutional neural networks (CNN), and an augmented CNN), they determined that the augmented CNN best fit their needs.
CNNs are the branch of deep learning that specializes in analyzing visual imagery. For example, when analyzing an image, it is not enough to recognize a certain gesture; you also need to understand what that gesture means in a given culture. Similarly, the scientists wanted to analyze the superior images of the sun generated by the AIA and predict the EUV radiation measurements from them.
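To make the idea concrete, here is a minimal NumPy sketch of the image-to-spectrum mapping a CNN performs: convolve the image with learned filters, apply a nonlinearity, pool, and read out one number per EUV channel. This is not the team's actual PyTorch model; the image size, filter count, and weights below are all illustrative stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv2d(image, kernels):
    """Valid 2D convolution: (H, W) image x (K, kh, kw) kernels -> (K, H', W')."""
    K, kh, kw = kernels.shape
    H, W = image.shape
    out = np.empty((K, H - kh + 1, W - kw + 1))
    for k in range(K):
        for i in range(H - kh + 1):
            for j in range(W - kw + 1):
                out[k, i, j] = np.sum(image[i:i + kh, j:j + kw] * kernels[k])
    return out

def cnn_predict_spectrum(image, kernels, weights, bias):
    """Map one solar image to a vector of EUV irradiance estimates:
    conv -> ReLU -> global average pool -> linear readout."""
    feats = np.maximum(conv2d(image, kernels), 0.0)   # ReLU nonlinearity
    pooled = feats.mean(axis=(1, 2))                  # one summary value per filter
    return pooled @ weights + bias                    # per-channel EUV estimate

# Toy sizes: a 32x32 "AIA image", 4 learned filters, 6 EUV output channels.
image   = rng.standard_normal((32, 32))
kernels = rng.standard_normal((4, 5, 5)) * 0.1
weights = rng.standard_normal((4, 6)) * 0.1
bias    = np.zeros(6)

spectrum = cnn_predict_spectrum(image, kernels, weights, bias)
print(spectrum.shape)  # one EUV estimate per channel: (6,)
```

In the real system the weights are learned by comparing predictions against the four years of true EVE measurements; here they are random, so only the shape of the computation is meaningful.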
Using MacGyver-ish AI Tools
Impressively, the engineers took on the task using only common software and hardware tools: Jupyter notebooks, PyTorch, NVIDIA GPUs, IBM Watson AI, and the Nimbix cloud to host it all. Each tool was chosen for a specific reason: Jupyter notebooks are the easiest way for engineers to collaborate, NVIDIA makes the best GPUs available today, IBM's AI tools are designed to solve enterprise AI problems, and Nimbix is the best AI cloud out there.
The team broke the dataset into four yearly parts. They trained an initial model on the first year's worth of data (roughly 1 TB per day), then, after multiple training iterations, put it to task on the second year's data. After learning from that, they retrained the model on the third year of data and finally put the final model to the test on the fourth year.
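That year-by-year procedure amounts to a chronological walk-forward split: train on all data so far, evaluate on the next year, fold it in, repeat, and hold the final year out as the real test. The sketch below illustrates the pattern with synthetic data and a least-squares fit standing in for a full training run; the feature sizes and model are assumptions, not the SDO pipeline.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-in: 4 "years" of paired data, where x is a feature vector
# extracted from an image and y is the EUV measurement to predict.
n_per_year, n_feat = 200, 8
true_w = rng.standard_normal(n_feat)
years = []
for _ in range(4):
    X = rng.standard_normal((n_per_year, n_feat))
    y = X @ true_w + 0.05 * rng.standard_normal(n_per_year)
    years.append((X, y))

def fit(X, y):
    # Least-squares solve, standing in for a full model training run.
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    return w

# Walk forward: grow the training set one year at a time and
# evaluate on the following year. The last pass (k = 3) is the
# final test on the held-out fourth year.
for k in range(1, 4):
    X_tr = np.vstack([X for X, _ in years[:k]])
    y_tr = np.concatenate([y for _, y in years[:k]])
    w = fit(X_tr, y_tr)
    X_val, y_val = years[k]
    rmse = np.sqrt(np.mean((X_val @ w - y_val) ** 2))
    print(f"trained on years 1-{k}, evaluated on year {k + 1}: RMSE {rmse:.3f}")
```

Evaluating only on years the model has never seen is what makes the final accuracy figure trustworthy: the model cannot simply memorize the data it is judged on.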
The results were 97.5% accurate. Now AI can process the high-quality images generated by the AIA and supply the missing EUV data for the period since EVE stopped working (mid-2014 to date).
If AI can figure out missing data from the sun based on current inputs, could we also predict the EUV spectrum into the future with precision? Predicting solar changes ahead of time would have dramatic benefits on earth. Moreover, this technique of using AI to fill in "data gaps" from surrounding information could be used in other applications, such as an IoT installation where a sensor malfunctions, or a portion of a customer satisfaction survey that is often, but not always, skipped by clients of a financial services company. As is often the case with AI, the sky is the limit!
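The gap-filling pattern itself is easy to sketch for the IoT case: fit a model on the period when all sensors were healthy, then reconstruct the failed sensor from its correlated neighbors, exactly as the AIA images were used to reconstruct EVE. Everything below (sensor relationships, coefficients, noise levels) is synthetic and purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy setup: three correlated sensors; sensor C fails partway through,
# and we fill its gap from A and B using a model fit on the healthy period.
n = 500
a = rng.standard_normal(n)
b = rng.standard_normal(n)
c = 2.0 * a - 0.5 * b + 0.02 * rng.standard_normal(n)

c_observed = c.copy()
c_observed[300:] = np.nan            # sensor C goes dark, like EVE in 2014

# Fit C as a function of A and B using only the healthy period.
healthy = ~np.isnan(c_observed)
X = np.column_stack([a, b, np.ones(n)])
w, *_ = np.linalg.lstsq(X[healthy], c_observed[healthy], rcond=None)

# Reconstruct the gap from the surviving sensors.
c_filled = c_observed.copy()
c_filled[~healthy] = X[~healthy] @ w

gap_rmse = np.sqrt(np.mean((c_filled[300:] - c[300:]) ** 2))
print(f"gap-fill RMSE: {gap_rmse:.4f}")
```

The SDO work is the same idea at scale: the "healthy period" is 2010 to mid-2014, the surviving sensors are the AIA images, and the model is a CNN instead of a linear fit.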
Andy Thurai is an accomplished professional with 25+ years of experience in technical, biz dev and architecture leadership positions. Learn more at his LinkedIn page.