
New LLM Offers Accurate Weather Prediction

Scientists at the US Department of Energy’s (DOE) Argonne National Laboratory, in close collaboration with researchers Aditya Grover and Tung Nguyen at the University of California, Los Angeles, have begun developing large artificial intelligence (AI) models for weather forecasting, known as foundation models.

Illustration of a six-day forecast of 10-meter wind speed (color-fill) and mean sea level pressure (contours) using a high-resolution version of Stormer (HR-Stormer) run at 30-kilometer horizontal resolution. Image Credit: Troy Arcomano/Argonne National Laboratory

The ability to build reliable weather models for forecasting is critical for all sectors of the American economy, from aviation to shipping. Weather models have traditionally relied on equations relating to thermodynamics and fluid dynamics in the atmosphere. These models are extremely computationally intensive and usually performed on large supercomputers.

Researchers at private sector companies such as Nvidia and Google have also begun developing huge AI foundation models for weather prediction.

These models could offer more accurate forecasts than numerical weather prediction models at a lower computational cost.

Some of these models already outperform conventional models at forecasts beyond seven days, giving scientists an extra window into the weather.

Foundation models rely on “tokens,” small bits of information that AI algorithms use to learn the physics behind weather. Many foundation models are built for natural language processing, which involves manipulating words and phrases.

For these large language models, tokens are words or chunks of language that the model predicts in sequence. Instead of text, this new weather-forecasting algorithm uses visual tokens: patches of charts that indicate humidity, temperature, and wind speed at various layers of the atmosphere.

Instead of being interested in a text sequence, you’re looking at spatial-temporal data, which is represented in images. When using these patches of images in the model, you have some notion of their relative positions and how they interact because of how they are tokenized.

Sandeep Madireddy, Computer Scientist, Argonne National Laboratory
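The patch-based tokenization Madireddy describes can be sketched in a few lines. The following is a minimal illustration of how a gridded atmospheric field might be cut into flattened patch tokens in the style of a vision transformer; the variable count, grid dimensions, and patch size here are illustrative assumptions, not the actual configuration of Stormer.

```python
import numpy as np

def patchify(field: np.ndarray, patch: int) -> np.ndarray:
    """Split a (channels, lat, lon) grid into flattened patch tokens.

    Each token keeps a spatial neighborhood together, so the model
    retains a notion of where patches sit relative to one another.
    """
    c, h, w = field.shape
    assert h % patch == 0 and w % patch == 0, "grid must divide evenly into patches"
    tokens = (
        field.reshape(c, h // patch, patch, w // patch, patch)
             .transpose(1, 3, 0, 2, 4)        # (patch rows, patch cols, c, p, p)
             .reshape(-1, c * patch * patch)  # one flattened row per patch token
    )
    return tokens

# Example: 3 variables (e.g. temperature, humidity, wind speed) on a
# 32 x 64 latitude-longitude grid, cut into 8 x 8 patches:
# (32/8) * (64/8) = 32 tokens, each of length 3 * 8 * 8 = 192.
grid = np.random.rand(3, 32, 64)
tokens = patchify(grid, 8)
print(tokens.shape)  # (32, 192)
```

In a real model, each of these token vectors would then be linearly projected into an embedding space and combined with a positional encoding, which is what gives the model the sense of relative patch position mentioned above.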

Rao Kotamarthi, an Argonne atmospheric scientist, stated that the scientific team can make accurate predictions with relatively low-resolution data.

The philosophy of weather forecasting has for years been to get to higher resolutions for better forecasts. This is because you are able to resolve the physics more precisely, but of course, this comes at great computational cost. But we are finding now that we are actually able to get comparable results to existing high-resolution models even at coarse resolution with the method we are using.

Rao Kotamarthi, Senior Scientist, Environmental Science Division, Argonne National Laboratory

While credible weather forecasting appears to be a near-term achievable objective for AI, applying the same method to climate modeling, which involves evaluating weather over long time spans, presents a new challenge.

Kotamarthi added, “In theory, foundation models could also be used for climate modeling. However, there are more incentives for the private sector to pursue new approaches for weather forecasting than there are for climate modeling. Work on foundation models for climate modeling will likely continue to be the purview of the national labs and universities dedicated to pursuing solutions in the general public interest.”

According to Troy Arcomano, an Argonne environmental scientist, one of the reasons climate modeling is so challenging is that the climate is constantly changing.

With the climate, we have gone from what had been a largely stationary state to a non-stationary state. This means that all of our statistics of the climate are changing with time due to the additional carbon in the atmosphere. That carbon is also changing the Earth’s energy budget. It is complicated to figure out numerically and we’re still looking for ways to use AI.

Troy Arcomano, Postdoctoral Fellow, Argonne National Laboratory

With the launch of Aurora, Argonne’s new exascale supercomputer, scientists will be able to train a massive AI-based model that can operate at extremely high resolutions.

“We need an exascale machine to really be able to capture a fine-grained model with AI,” Kotamarthi added.

The model was performed on Polaris, a supercomputer at the Argonne Leadership Computing Facility, a DOE Office of Science user facility, and the study was supported by Argonne’s Leadership-Directed Research and Development Program.

Research based on the study won the Best Paper Award at the “Tackling Climate Change with Machine Learning” workshop, held on May 10th, 2024, in Vienna, Austria, in connection with the International Conference on Learning Representations (ICLR) 2024.
