The National Aeronautics and Space Administration (NASA) is leveraging artificial intelligence (AI) technologies to tackle critical climate and environmental challenges, with a focus on making its data more accessible and more efficient to use. 

Speaking at the IBM Think Leadership Exchange in Washington, D.C., on Nov. 20, Chelle Gentemann, the open science program officer at NASA, said that the agency has been using AI models for a range of use cases, including identifying flooding in India with large-scale satellite and geospatial data and predicting locust breeding grounds in Africa with open-source models.  

The capabilities behind these use cases, Gentemann said, are changing the way that weather, climate, and environmental monitoring is conducted.  

“We’ve now just released the weather and climate model based on data that has two billion parameters,” said Gentemann. “We’re now looking at … 40 years of weather and climate data. And within the encoder we’ve really fine tuned it for gap filling, for high resolution, but also for time. So, we can do both environmental monitoring but can also do prediction.” 

Working with partners such as IBM, NASA has been developing AI technologies such as encoders that transform raw observational data into usable representations, Gentemann said. The challenge scientists face, she added, is not a lack of data but extracting meaningful knowledge from massive datasets – a task AI is well suited to support.  

After the AI systems process the data, NASA scientists interpret the complex information involved – from orbital dynamics and infrared and microwave measurements to physical and thermodynamic models. 
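Gentemann's description of an encoder fine-tuned for gap filling corresponds to the masked-reconstruction training commonly used for this kind of foundation model: patches of a gridded field are hidden, and the model learns to fill them back in. The sketch below is purely illustrative – the patch sizes, layer counts, and function names are assumptions, not NASA or IBM code.

```python
# Illustrative sketch of masked-reconstruction ("gap filling") training on a gridded
# geophysical field. All shapes and hyperparameters here are assumptions.
import torch
import torch.nn as nn

PATCH = 16          # assumed patch size (pixels per side)
D_MODEL = 128       # assumed embedding width
N_PATCHES = 64      # e.g. a 128x128 field split into 16x16 patches

encoder = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model=D_MODEL, nhead=8, batch_first=True),
    num_layers=4,
)
embed = nn.Linear(PATCH * PATCH, D_MODEL)   # patch -> token embedding
decode = nn.Linear(D_MODEL, PATCH * PATCH)  # token -> reconstructed patch

def gap_fill_step(patches: torch.Tensor, mask_ratio: float = 0.3) -> torch.Tensor:
    """patches: (batch, N_PATCHES, PATCH*PATCH) gridded observations.
    Randomly hide a fraction of patches and train to reconstruct them."""
    batch, n, _ = patches.shape
    mask = torch.rand(batch, n) < mask_ratio          # True = hidden ("gap")
    tokens = embed(patches.masked_fill(mask.unsqueeze(-1), 0.0))
    recon = decode(encoder(tokens))
    # Loss is computed only where data was hidden, so the model learns to fill gaps.
    return ((recon - patches) ** 2)[mask].mean()

# One training step on random data standing in for satellite observations.
loss = gap_fill_step(torch.randn(8, N_PATCHES, PATCH * PATCH))
loss.backward()
```

A model trained this way can also be pointed at future time steps, which is how the same encoder can support both environmental monitoring and prediction, as Gentemann noted.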

“Bringing the scientific expertise together with the AI expertise, so that as we’re developing encoders [and] we know where our strengths are, what might be hallucinated – we can provide value and guidance to the next user, so that …[they] know what can be done accurately with these models,” said Gentemann. 

Other challenges NASA faces involve the sustainability of AI models and infrastructure. One way to mitigate the high cost of powering AI systems is to turn to hardware innovations, said Priya Nagpurkar, the vice president of hybrid cloud and AI platform at IBM Research.  

“There’s now increasingly an awareness of energy efficiency [of AI systems],” said Nagpurkar. “It’s not just cost per millions of tokens, it’s also cost per millions of tokens per watt. And can we improve on that – especially working on climate science, this seems to be an equally important concern. So, we can see here that hardware innovations are going to be essential.” 
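Nagpurkar's framing – cost per million tokens alongside energy per million tokens – can be made concrete with a quick back-of-the-envelope calculation. The figures below are placeholders chosen for illustration, not measurements from any NASA or IBM system.

```python
# Back-of-the-envelope illustration of the efficiency metric described above.
# All numbers are assumed placeholders, not real benchmark results.
tokens_generated = 5_000_000        # tokens produced in a benchmark run (assumed)
run_cost_usd = 40.0                 # compute cost of the run (assumed)
avg_power_watts = 700.0             # average accelerator power draw (assumed)
run_seconds = 3_600                 # wall-clock duration of the run (assumed)

energy_kwh = avg_power_watts * run_seconds / 3.6e6   # watts * seconds -> kWh
per_million = 1_000_000 / tokens_generated

print(f"cost per million tokens:   ${run_cost_usd * per_million:.2f}")    # $8.00
print(f"energy per million tokens: {energy_kwh * per_million:.3f} kWh")   # 0.140 kWh
```

Tracking the energy figure alongside the dollar figure is what makes hardware efficiency gains visible, which is the case Nagpurkar makes for hardware innovation.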

Keeping models open source remains key to helping people access and interpret data, Gentemann added, making that data more usable and more beneficial as a public resource. 

“Federal agencies, we are interested in the public good,” said Gentemann. “We know that openness is going to reap them a benefit, it’ll be the most efficient and it will have the most benefit for the public good. And with a lot of different partners, you have to look beyond … what people say and what they actually do, and I think that’s one of the reasons that this [IBM] partnership works so well – is this deep held belief in the value of openness for public good.” 

Weslan Hansen is a MeriTalk Staff Reporter covering the intersection of government and technology.