Climate Model Code Is So Outdated, MIT Starts From Scratch

Faced with climate models coded in Fortran in the 1960s and 70s, MIT decided there wasn't any more cobbling together left for the ancient code and opted to toss it out and start fresh.

It's an ambitious project for MIT professors Raffaele Ferrari and Noelle Eckley Selin, who submitted their Bringing Computation to the Climate Challenge proposal as part of MIT's Climate Grand Challenges (CGC). Out of 100 submissions, MIT picked five projects to fund and support, one of which is Ferrari and Selin's. 

"The goal of this grand challenge is to provide accurate and actionable scientific information to decision-makers to inform the most effective mitigation and adaptation strategies," the proposal said. 

Students can't read model code

Ferrari was part of a group called the Climate Modelling Alliance (CLiMA), which formed at Caltech with the Naval Postgraduate School and NASA's JPL in 2018 to modernize climate models. Much of the early work that formed the basis of the CGC project began at that time. CLiMA's code is available on GitHub.

CLiMA determined that old climate models, many of which were built 50 years ago and coded in Fortran, had to go if there was going to be any progress toward better climate models. Now that Ferrari is working at MIT on the CGC project, he has realized that "traditional climate models are in a language [MIT] students can't even read."

The language that CLiMA chose, and the one being used for the MIT project, is the scientific computing language Julia, which another CLiMA researcher described as a serious challenge "because Julia hadn't been used on such a big science project before."
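To give a sense of what that looks like in practice, here is a toy sketch, not CLiMA's actual code, of the kind of compact, maths-like numerical Julia the team gets to work with; the function and variable names are purely illustrative.

```julia
# Toy illustration (not CLiMA code): one explicit time step of 1-D heat diffusion,
# written the way the underlying maths reads.
function diffuse!(T, α, Δx, Δt)
    Tnew = copy(T)
    for i in 2:length(T)-1
        # explicit finite-difference update of ∂T/∂t = α ∂²T/∂x²
        Tnew[i] = T[i] + α * Δt / Δx^2 * (T[i+1] - 2 * T[i] + T[i-1])
    end
    T .= Tnew
    return T
end

# warm bump in the middle of a 100-point grid, stepped forward 500 times
T = zeros(100)
T[45:55] .= 1.0
for _ in 1:500
    diffuse!(T, 1e-4, 0.01, 0.1)
end
```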

Ferrari said that the Julia gamble paid off: the team has built what it describes as a digital twin of the Earth that can simulate global climate conditions. Current models have low resolution: the smallest scale they can operate on is 100-200 kilometers, which means small-scale weather processes, like cloud cover, rainfall and sea ice, simply can't be accurately predicted.
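A rough back-of-envelope calculation, using figures that are not from the article, shows why resolution is so expensive: halving the grid spacing roughly quadruples the number of horizontal grid columns, and the time step has to shrink along with the cell size, so the total cost grows even faster.

```julia
# Back-of-envelope only: horizontal grid columns needed to tile Earth's
# ≈5.1e8 km² surface at different grid spacings.
surface_km2 = 5.1e8
for Δx in (200, 100, 10, 1)                  # grid spacing in km
    columns = surface_km2 / Δx^2             # cells needed at that spacing
    println("Δx = $Δx km  →  ≈ $(round(columns, sigdigits = 2)) columns")
end
```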

CLiMA's model is able to include small-scale climate elements, which CLiMA scientists say are fundamental to understanding larger processes. Interactions between small- and large-scale climate processes happen constantly, Ferrari said, and excluding them means far less precision.

Precision on a global scale isn't the end goal, though: "We want to take this large-scale model and create what we call an 'emulator' that is only predicting a set of variables of interest, but it's been trained on the large-scale model," Ferrari said. 

Ferrari's emulators would be limited to a small portion of the planet, but because the so-called "digital cousins" are trained on the global model, they can understand the ways that large and small climate elements interact.
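As a rough sketch of the general emulator idea, and explicitly not the MIT/CLiMA method, the snippet below fits a cheap quadratic surrogate to a handful of runs of a stand-in "expensive model"; every name and number in it is made up for illustration.

```julia
using LinearAlgebra

# stand-in for an expensive global model: forcing parameter → regional-mean temperature
expensive_model(forcing) = 14.0 + 3.2 * log1p(forcing) + 0.1 * forcing^2

forcings = collect(range(0.0, 4.0; length = 20))   # a small set of training runs
targets  = expensive_model.(forcings)

# least-squares fit of a quadratic emulator to the training runs
X = hcat(ones(length(forcings)), forcings, forcings .^ 2)
coeffs = X \ targets

emulate(f) = coeffs[1] + coeffs[2] * f + coeffs[3] * f^2

# the emulator now answers new queries almost for free
println(emulate(2.5), "  vs  ", expensive_model(2.5))
```

The training runs are the expensive part; once the surrogate is fitted, querying it costs next to nothing, which is what would make region-scale "digital cousins" practical on modest hardware.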

Climate modeling in your pocket?

The current way that climate models are run is inefficient, Selin points out, because "if you wanted to use output from a global climate model, you usually would have to use output that's designed for general use."

Part of Selin and Ferrari's project would be to democratize access to climate models through the use of emulators and by taking end-user needs into account from the very beginning. Ultimately, the team hopes digital cousins could be run on devices as small as a smartphone, though they admit that's beyond the scope of the current project.

Digital cousins of small Earth regions, the team said, would save local institutions the time and money of building their own climate models, and their accuracy would let forecasters model their region in real time across a wider range of scenarios.

Ultimately, Ferrari wants the project to create climate models that can predict future events for which data doesn't exist. That opens up another challenge researchers may need to spend some time on: "A new way of doing machine learning that learns from the data continually coming in and also takes into account the laws of physics and thermodynamics," MIT said. ®
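The article doesn't say how that machine learning would work, but one common way to "take the laws of physics into account" is to penalize predictions that violate a conservation law; the toy Julia sketch below assumes that approach, with entirely made-up model and variable names, and is not the team's published method.

```julia
# Toy physics-constrained objective: fit to data, but penalize predictions
# that break a simple energy-balance constraint (in ≈ out at steady state).
struct ToyModel
    w::Vector{Float64}
end

predict(m::ToyModel, x) = (energy_in = m.w[1] * x, energy_out = m.w[2] * x)

function loss(m::ToyModel, x, obs_in, obs_out)
    p = predict(m, x)
    data_term    = (p.energy_in - obs_in)^2 + (p.energy_out - obs_out)^2
    physics_term = (p.energy_in - p.energy_out)^2     # conservation penalty
    return data_term + 10.0 * physics_term            # weight on the physics constraint
end

m = ToyModel([1.1, 0.9])
println(loss(m, 1.0, 1.05, 1.02))
```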
