Climate metrics prediction tools


Summary

Climate metrics prediction tools use advanced technologies, including artificial intelligence and satellite data, to forecast important climate variables like rainfall, temperature, and drought risk. These tools help scientists, organizations, and policymakers turn massive amounts of climate and weather data into actionable insights and make better decisions for planning and environmental management.

  • Embrace open access: Take advantage of freely available AI models and climate datasets to support weather forecasting, risk analysis, and environmental planning.
  • Simplify data analysis: Use tools that convert complex climate data into easy-to-use formats, allowing you to join different datasets and identify climate risks quickly.
  • Improve prediction accuracy: Adopt AI-driven methods for bias correction and high-resolution climate forecasting to support reliable decision-making and policy development.
Summarized by AI based on LinkedIn member posts
  • You might have seen recent news from our Google DeepMind colleagues on GenCast, which is changing the game of weather forecasting by building state-of-the-art weather models with AI. Some of our teams started to wonder: can we apply similar techniques to the notoriously compute-intensive challenge of climate modeling? General circulation models (GCMs) are a critical part of climate modeling, focused on the physical aspects of the climate system, such as temperature, pressure, wind, and ocean currents. Traditional GCMs, while powerful, can struggle with precipitation, and our teams wanted to see if AI could help.

    Our team released a paper and data on our AI-based GCM, building on our Nature paper from last year, now predicting precipitation with greater accuracy than the prior state of the art. The new paper on NeuralGCM introduces models that learn from satellite data to produce more realistic rain predictions. Kudos to Janni Yuval, Ian Langmore, Dmitrii Kochkov, and Stephan Hoyer! Here's why this is a big deal:

    Less Bias, More Accuracy: These new models have less bias, meaning they align more closely with actual observations, both for forecasts up to 15 days and for 20-year projections (in which sea surface temperatures and sea ice were fixed at historical values, since we don't yet have an ocean model). NeuralGCM forecasts are especially strong around extremes, which matter most for understanding climate anomalies, and can predict rain patterns throughout the day with better precision.

    Combining AI, Satellite Imagery, and Physics: The model pairs a learned physics model with a differentiable dynamical core to leverage both physics and AI methods, and is trained directly on satellite-based precipitation observations.

    Open Access for Everyone! This is perhaps the most exciting news: the team has made their pre-trained NeuralGCM model checkpoints (including their awesome new precipitation models) available under a CC BY-SA 4.0 license. Anyone can use and build upon this cutting-edge technology! https://lnkd.in/gfmAx_Ju

    Why This Matters: Accurate precipitation predictions are crucial for everything from water resource management and flood mitigation to understanding the impacts of climate change on agriculture and ecosystems. Check out the paper to learn more: https://lnkd.in/geqaNTRP
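As an aside on what "less bias" means in practice, here is a minimal, self-contained sketch with synthetic numbers. This is not the NeuralGCM evaluation pipeline (which works on gridded, area-weighted fields); it only illustrates the two headline quantities, signed bias and RMSE:

```python
# Mean bias: average signed difference between forecast and observation.
# A bias near zero means the model neither systematically over- nor
# under-predicts precipitation; RMSE measures overall error magnitude.

def mean_bias(forecast, observed):
    return sum(f - o for f, o in zip(forecast, observed)) / len(forecast)

def rmse(forecast, observed):
    n = len(forecast)
    return (sum((f - o) ** 2 for f, o in zip(forecast, observed)) / n) ** 0.5

# Synthetic daily precipitation (mm): this toy forecast runs slightly wet.
obs = [0.0, 2.0, 5.0, 1.0, 0.0, 3.0]
fcst = [0.5, 2.5, 5.5, 1.5, 0.5, 3.5]

print(mean_bias(fcst, obs))  # 0.5 (a +0.5 mm/day wet bias)
print(rmse(fcst, obs))       # 0.5
```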

  • View profile for Benny Istanto, GISP

    Exploring #climate with #GIS and #datascience, solving old problems in new ways.

    2,734 followers

    To support regional economic monitoring and risk assessments at work, I regularly process global climate datasets (#CHIRPS, #TerraClimate, #ERA5Land, #IMERG) to track extreme #dry and #wet periods. For years, I relied on existing tools to produce these indices, but as our scale grew, I often hit bottlenecks in error handling and processing efficiency. I needed a solution that was minimal, operationally ready, and capable of handling global-scale data without requiring a supercomputer.

    So, I built precip-index. It's a specialized #Python implementation of #SPI (Standardized Precipitation Index) and #SPEI (Standardized Precipitation Evapotranspiration Index) designed for production workflows. Key features for the geospatial community:

    - Bidirectional Analysis: Monitors both #drought and wet (#flood) conditions using a unified framework.
    - Operational Mode: Calibrate once, save parameters, and apply them to new data instantly, perfect for periodic reporting.
    - Scalable: Benchmarked on CHIRPS v3 global data (17M+ grid cells) with memory-efficient tiling.
    - Multi-Distribution: Supports Gamma, Pearson III, and Log-Logistic fitting.

    This code stands on the shoulders of giants: it is built upon the foundation of the `climate-indices` library by James Adams, with a focus on optimizing it for operational speed, memory efficiency, and specific run-theory analysis. Built with a heavy dose of #Claude #VibeCoding, enabling a climate geographer like me to build robust, operational tools. I hope this implementation proves useful to others working on climate resilience and data analysis. Check out the documentation (built using #Quarto) and code: https://lnkd.in/gAkwE4fR
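To make the SPI idea concrete: a standardized index ranks each precipitation accumulation against a calibration record and maps its cumulative probability through the inverse standard normal. The toy sketch below uses an empirical CDF as a stand-in for the Gamma/Pearson III/Log-Logistic fits the library supports, so it is an illustration of the transform, not the precip-index implementation:

```python
from statistics import NormalDist

def standardized_index(values):
    # Rank each value in the calibration record (values assumed distinct;
    # ties are ignored for brevity), convert rank to a cumulative
    # probability with the Weibull plotting position P = rank / (n + 1),
    # then map P through the inverse standard normal CDF.
    n = len(values)
    norm = NormalDist()
    ranks = {v: r for r, v in enumerate(sorted(values), start=1)}
    return [norm.inv_cdf(ranks[v] / (n + 1)) for v in values]

# 3-month rainfall accumulations (mm) for one grid cell over ten years.
precip = [120, 95, 130, 60, 150, 80, 110, 40, 125, 100]
spi_like = standardized_index(precip)

# The driest year (40 mm) maps to the most negative index value.
print(min(spi_like))
```

Negative index values flag dry anomalies and positive values wet ones, which is what makes the same framework work for both drought and flood monitoring.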

  • View profile for Yunsoo Choi

    Professor at UH (Air Quality/Weather/Climate Forecasting, Deep (Machine) Learning, Digital Twin)

    4,588 followers

    Deveshwar Singh has successfully defended his dissertation and is the 21st doctoral student to earn a Ph.D. under my supervision at the University of Houston. His dissertation, titled “Utilization of Deep-Learning Algorithms for Bias Correction and Forecasting of Weather, Air Quality, and Climate Parameters over Regional Scales,” develops and applies deep learning frameworks to enhance prediction and bias correction of key environmental variables across regional scales.

    In the first study, he addresses systematic biases in Indian Summer Monsoon Rainfall projections from CORDEX-SA regional climate models using two super-resolution methods, an Autoencoder-Decoder and a Residual Neural Network (ResNet), which ingest eight meteorological variables plus elevation at 0.25° resolution and generate bias-corrected precipitation at 0.05° resolution; the ResNet model achieves a five-fold increase in spatial resolution and reduces extreme rainfall bias from 21.18 mm to -7.86 mm.

    The second study proposes the Deep-BCSI framework, which combines CNN-based bias correction with Partial CNN-based spatial imputation to improve 72-hour PM₂.₅ forecasts over South Korea, increasing the Index of Agreement from 0.65–0.68 (CMAQ) to 0.71–0.80 and lowering RMSE by 25%–41% in metropolitan regions, with SHAP analysis confirming behavior consistent with atmospheric chemistry and meteorological processes.

    The third study introduces the Spatial-Temporal Attention ResNet Transformer (START), which integrates 17 meteorological variables with spatial and temporal encodings to outperform the NASA GEOS baseline over the contiguous United States, reducing RMSE for temperature by 30.3% and MAE for relative humidity by 32.2%, while SHAP-based interpretation and Monte Carlo Dropout with calibrated uncertainty provide reliable, well-explained predictions that support environmental management and policy decisions.
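For readers new to bias correction: the deep-learning methods above learn rich, spatially varying corrections, but the core idea can be shown with the classical linear-scaling baseline, in which model output is rescaled so its calibration-period mean matches observations. This is a deliberate simplification for illustration, not the dissertation's method:

```python
# Linear-scaling bias correction: compute one multiplicative factor from a
# calibration period, then apply it to new model output. Deep-learning
# approaches replace this single factor with learned, spatially varying maps.

def linear_scaling_factor(model_cal, obs_cal):
    # Factor that makes the calibration-period model mean match observations.
    return sum(obs_cal) / sum(model_cal)

def correct(model_values, factor):
    return [v * factor for v in model_values]

obs_cal = [10.0, 20.0, 30.0]  # observed monthly rainfall (mm)
mod_cal = [20.0, 40.0, 60.0]  # model output, same months: a 2x wet bias
factor = linear_scaling_factor(mod_cal, obs_cal)

print(factor)                          # 0.5
print(correct([50.0, 10.0], factor))   # [25.0, 5.0]
```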

  • View profile for Vikram Gundeti

    CTO - Foursquare, Founding Engineer - Amazon Alexa

    7,484 followers

    What if you could just chat with satellite data to uncover climate risks? Now you can with FSQ Spatial Agent.

    The most critical climate datasets (heatwave projections, precipitation models, land surface temperature, drought indices) live as raster data: dense grids of pixel values derived from satellite sensors and climate models. Turning them into actionable intelligence requires specialized GIS tooling, resampling pipelines, CRS transformations, and significant engineering overhead before joining them with contextual data like population density or land use. This friction is why climate risk analysis has historically been slow, expensive, and inaccessible outside specialist teams.

    FSQ H3 Hub changes that equation entirely. FSQ Spatial Agent puts it to work. FSQ H3 Hub's proprietary indexing pipeline converts raw raster datasets into H3 hexagonal cells, making satellite-derived climate data available in clean, tabular form at a consistent spatial resolution. Every dataset shares the same H3 grid, so joining a Copernicus heatwave projection with a CHELSA precipitation model, a wildfire risk layer, and population density becomes a simple SQL join on a cell ID. No resampling. No CRS headaches. No bespoke ETL. FSQ Spatial Agent, built on this foundation, lets you converse with that unified data layer to surface climate insights at scale.

    So we put it to the test: "Do a temporal climate risk analysis for Europe — pick an area with the most interesting future climate impacts." The agent selected Andalusia, Southern Spain, citing Mediterranean climate sensitivity, agricultural economy, water scarcity, and dense coastal populations. It tessellated the region into 113,437 H3 cells at resolution 8 (~460 m), drawing on five datasets spanning climate projections (Copernicus RCP 8.5, CHELSA SSP370), environmental risk (Drivers of Forest Loss), population exposure (Global Population 2020), and land use (MODIS Land Cover).

    What the analysis revealed: By late century, Andalusia faces a compounding climate trajectory: +23.7 heatwave days/year in extreme risk zones (32% of the region); +2.13°C average warming by 2070–2100; 53% projected "Extremely Drier" with over 600 mm precipitation loss; 5,519 high-risk cells with significant population exposure; and a wildfire-climate feedback loop accelerating vegetation loss and further warming.

    Raster data at the speed of insight. The world's most detailed climate record is largely trapped in formats accessible only to specialists. FSQ H3 Hub and FSQ Spatial Agent change that, delivering climate risk intelligence that scales as fast as the questions you can ask. Download Foursquare Spatial Desktop to get started.
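The "simple SQL join on a cell ID" claim is easy to illustrate in miniature: once every layer is keyed by the same H3 cell, combining layers is an ordinary key join. The cell ids and values below are invented for illustration (real H3 indexes come from the h3 library, and the real join runs in SQL over millions of cells):

```python
# Two "layers" indexed by the same (made-up) H3 cell ids: a heatwave
# projection and a population layer. Sharing the grid makes the join trivial.
heatwave_days = {"8839a6d61bfffff": 24, "8839a6d619fffff": 12}
population    = {"8839a6d61bfffff": 5400, "8839a6d619fffff": 300}

# "SELECT ... JOIN ... ON cell_id" expressed as a dict comprehension:
exposure = {
    cell: {"heatwave_days": hw, "population": population[cell]}
    for cell, hw in heatwave_days.items()
    if cell in population
}

# Flag cells where many heatwave days coincide with a large population.
high_risk = [c for c, v in exposure.items()
             if v["heatwave_days"] > 20 and v["population"] > 1000]
print(high_risk)  # ['8839a6d61bfffff']
```

Every additional layer (drought index, land cover, wildfire risk) joins the same way, which is the whole point of indexing everything to one grid.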

  • View profile for Andreas Horn

    Head of AIOps @ IBM || Speaker | Lecturer | Advisor

    242,216 followers

    AI for GOOD: NASA and IBM launch open-source AI foundation model for more efficient weather and climate forecasting! 🌍 (This is what should get more spotlight, please, and NOT the next ChatGPT wrapper!)

    In collaboration with NASA, IBM just launched Prithvi WxC, an open-source, general-purpose AI model for weather and climate applications. And the truly remarkable part is that this model can run on a desktop computer.

    Here's what you need to know: ⬇️
    → The 2.3-billion-parameter Prithvi WxC model can create six-hour-ahead forecasts "zero-shot", meaning it requires no tuning and runs on readily available data.
    → The model is designed to be customized for a variety of weather applications, from predicting local rainfall to tracking hurricanes or improving global climate simulations.
    → The model was trained on 40 years of NASA's MERRA-2 data and can now be quickly tuned for specific use cases. And unlike traditional climate models that require massive supercomputers, this one operates on a desktop. Its uniqueness lies in the ability to generalize from a small, high-quality sample of weather data to entire global forecasts.
    → The model outperforms traditional numerical weather prediction methods in both accuracy and speed, producing global forecasts up to 10 days in advance within minutes instead of hours.
    → The model has immense potential for various applications, from downscaling high-resolution climate data to improving hurricane forecasts and capturing gravity waves. It could also help estimate the extent of past floods and infer the intensity of past wildfires from burn scars.

    It will be exciting to see what downstream apps, use cases, and potential applications emerge. What's clear is that this AI foundation model joins a growing family of open-source tools designed to make NASA's vast collection of satellite, geospatial, and Earth observation data faster and easier to analyze. With decades of observations, NASA holds a wealth of data, but its accessibility has been limited until recently. This model is a big step toward democratizing data and making it more accessible to all.

    And this is yet more proof that the future of AI is open, decentralized, and running at the edge. 🌍

    🔗 Resources:
    Download the models from the Hugging Face repository: https://lnkd.in/gp2zmkSq
    Blog post: https://ibm.co/3TDul9a
    Research paper: https://ibm.co/3TAILXG

    #AI #ClimateScience #WeatherForecasting #OpenSource #NASA #IBMResearch
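On how a six-hour-ahead model yields 10-day forecasts: each forecast is fed back in as the next input, an autoregressive rollout. The sketch below shows only that loop structure; `step` is a toy stand-in with made-up dynamics, not the Prithvi WxC API:

```python
# Autoregressive rollout: a model that predicts 6 hours ahead is applied
# repeatedly, each output becoming the next input, to reach long horizons.
# `step` here is a toy placeholder, not the real model call.

def step(state):
    # Toy dynamics: a temperature anomaly relaxing toward zero each step.
    return {"t2m_anomaly": state["t2m_anomaly"] * 0.9}

def rollout(initial_state, hours, step_hours=6):
    states = [initial_state]
    state = initial_state
    for _ in range(hours // step_hours):
        state = step(state)
        states.append(state)
    return states

# A 10-day forecast = 40 six-hour steps (41 states including the analysis).
traj = rollout({"t2m_anomaly": 2.0}, hours=240)
print(len(traj))  # 41
```

In practice rollout error compounds with lead time, which is why zero-shot skill at each step matters so much.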

  • View profile for Mark Trexler
    Mark Trexler is an Influencer

    Nearly 40 Years in Climate Risk | Founder of First U.S. Business Climate Risk Consultancy | Now Building the Climate Knowledge Tools We Need | Creator of ClimateNotebooks.net - For Anyone Taking Climate Change Seriously

    15,667 followers

    The new Copernicus Climate Atlas is an amazing tool for do-it-yourself climate forecasting, far more powerful than anything most people have had free access to before. I recently spent quite a bit of time with the Climate Atlas and even found a significant bug in the software (what were the odds! 😂). Happily, the Copernicus team moved quickly to fix the problem. Because it is such a powerful tool that anyone can quickly learn to use, I've added a "Copernicus Climate Atlas Mini-Course" to our Climate Change Toolbox, walking through how to start using the Atlas (which I know can seem a bit intimidating at first). I've exported that Mini-Course as the PDF attached below. Feedback and suggestions welcome!

  • View profile for Debbie W.
    Debbie W. is an Influencer

    President of Google in Europe, the Middle East, and Africa. Helping people across EMEA achieve their ambitions, big and small, through high impact technology.

    56,802 followers

    We know the Earth is getting warmer, but not what it means specifically for different regions. To figure this out, scientists do climate modelling. 🔎 🌍 Google Research has published groundbreaking advancements in climate prediction using the power of #AI!

    Typically, researchers use climate modelling to understand the regional impacts of climate change, but current approaches have large uncertainty. Introducing NeuralGCM: a new atmospheric model that outperforms existing models by combining AI with physics-based modelling for improved accuracy and efficiency. Here's why it stands out:

    ✅ More Accurate Simulations: When predicting global temperatures and humidity for 2020, NeuralGCM had 15-50% less error than the state-of-the-art model X-SHiELD.
    ✅ Faster Results: NeuralGCM is 3,500 times quicker than X-SHiELD. Simulating a year of the Earth's atmosphere takes X-SHiELD 20 days, whereas NeuralGCM achieves this in just 8 minutes.
    ✅ Greater Accessibility: Google Research has made NeuralGCM openly available on GitHub for non-commercial use, allowing researchers to explore, test ideas, and improve the model's functionality.

    The research showcases AI's ability to help deliver more accurate, efficient, and accessible climate predictions, which is critical to navigating a changing global climate. Read more about the team's groundbreaking research in Nature Portfolio's latest article! → https://lnkd.in/e-Etb_x4

    #AIforClimateAction #Sustainability #AI

  • View profile for Hannes Matt

    Climate & nature-related risk manager | Climate & nature tech startup advisor

    23,369 followers

    ⛈️ Climate Risk Methodology Based on Open-Access Tools 🗺️

    Over the past months, I shared lists of open-access climate and nature risk assessment tools. They sparked quite some interest. Here's how I thought I might provide additional value:

    ➡️ A practical Excel methodology for assessing climate risk based on open-access geospatial tools. For every risk category required by the EU Taxonomy, the Excel links to the best assessment tool.

    🔥🌡️ This initial release focuses on temperature-related physical risks like heat stress and wildfires. Updates on additional risk categories are forthcoming.

    What's inside:
    🗺️ Open-access geospatial tools for assessing each temperature-related risk
    📊 A conclusive methodology to assess company sites and supply chains
    📝 Additional guidance for smooth assessment and reporting in line with EU Taxonomy and CSRD, including descriptions and instructions for each tool
    📈 The latest climate models and data from organizations like the IPCC

    I hope this will save ESG teams substantial time and money in their search for adequate data and methods. Interested in the resource? Comment below, and I'll send it your way. (Please connect so I can message you directly.)

  • View profile for Paris Perdikaris

    Associate Professor, University of Pennsylvania

    4,043 followers

    Excited to announce the public code release of Aurora, a foundation model for atmospheric forecasting! 🌍 ⛅

    Code: https://lnkd.in/dsaGx_hr
    Docs: https://lnkd.in/d7TQ-HTN
    Paper: https://lnkd.in/dBvgfPDG

    Aurora sets a new state of the art in global weather and air quality prediction, outperforming traditional numerical models while being orders of magnitude faster. Key features:
    • Pretrained on diverse atmospheric data.
    • Fine-tuned versions for weather and air quality.
    • 0.1° resolution global forecasts.
    • Outperforms IFS-HRES and GraphCast on most metrics.

    The repo currently includes:
    • Pretrained model weights.
    • Fine-tuned weights for high-res weather forecasting.
    • Easy-to-use Python API.
    • Detailed documentation and examples.

    Get started with a simple example that runs Aurora on ERA5: https://lnkd.in/dnV5rR_V

    We hope this accelerates research into foundation models for Earth system prediction. Read the full paper here: https://lnkd.in/dBvgfPDG. Amazing effort by Cristian Bodnar, Wessel B., Ana Lucic and Megan Stanley at Microsoft Research AI for Science. #MachineLearning #WeatherForecasting
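Headline comparisons like "outperforms IFS-HRES and GraphCast" are typically reported as latitude-weighted RMSE on gridded fields, since grid cells shrink toward the poles. The sketch below shows that standard weighting convention on toy numbers; it is not code from the Aurora repo:

```python
import math

# Latitude-weighted RMSE: each latitude row is weighted by cos(latitude)
# so polar rows, which cover less area, count for less in the average.

def lat_weighted_rmse(forecast, observed, lats_deg):
    weights = [math.cos(math.radians(lat)) for lat in lats_deg]
    weighted_sq_err = sum(w * (f - o) ** 2
                          for w, f, o in zip(weights, forecast, observed))
    return math.sqrt(weighted_sq_err / sum(weights))

# One value per latitude row (a real grid would also average over longitude).
lats = [60.0, 30.0, 0.0]
fcst = [274.0, 290.0, 300.0]  # forecast 2 m temperature (K)
obs  = [273.0, 291.0, 300.0]  # verifying analysis (K)

print(round(lat_weighted_rmse(fcst, obs, lats), 3))  # ≈ 0.76
```

Note that the 1 K error at 60° latitude contributes only half the weight of an equatorial error, which is why the result sits below the unweighted RMSE of ~0.816.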

  • View profile for Asif Razzaq

    Founder @ Marktechpost (AI Dev News Platform) | 1 Million+ Monthly Readers

    35,056 followers

    NVIDIA Revolutionizes Climate Tech with 'Earth-2': The World's First Fully Open Accelerated AI Weather Stack

    In a move that democratizes climate science, NVIDIA unveiled three groundbreaking models powered by the novel Atlas, StormScope, and HealDA architectures. These tools promise to accelerate forecasting by orders of magnitude while delivering accuracy that rivals or exceeds traditional methods. The suite:

    Earth-2 Medium Range: High-accuracy 15-day forecasts across 70+ variables.
    Earth-2 Nowcasting: Generative AI that delivers kilometer-scale storm predictions in minutes.
    Earth-2 Global Data Assimilation: Real-time snapshots of global atmospheric conditions.

    Full analysis: https://lnkd.in/gt_BugDZ
    Model weights: https://lnkd.in/gkUVqH5E
    Paper [Earth-2 Medium Range]: https://lnkd.in/gTf-f_Gd
    Paper [Earth-2 Nowcasting]: https://lnkd.in/gQf7muqz
    Paper [Earth-2 Global Data Assimilation]: https://lnkd.in/gu_-eZsn
    Technical details: https://lnkd.in/gPQ66Me2

    NVIDIA AI
