Multi-model simulations for climate prediction


  • View profile for Gopal Erinjippurath

    AI for capital markets 🌎 | Founder and CTO | Angel Investor

    8,403 followers

    Climate models have long struggled with coarse resolution, limiting precise climate risk insights. But AI-driven methods are now changing this, unlocking more detailed intelligence than traditional physics-based approaches.

    I recently spoke with a research scientist at Google Research who highlighted a promising new hybrid approach. This method combines physics-based General Circulation Models (GCMs) with AI refinement, significantly improving resolution. The process starts with Regional Climate Models (RCMs) anchoring physical consistency at ~45 km resolution. Then it uses a diffusion model, R2-D2, to enhance output resolution to 9 km, making estimates more suitable for projecting extreme climate events.

    🔥 About R2-D2
    R2-D2 (Regional Residual Diffusion-based Downscaling) is a diffusion model trained on the residuals between RCM outputs and high-resolution targets. Conditioned on physical inputs like coarse climate fields and terrain, it rapidly generates high-res climate maps (~800 fields/hour on GPUs), complete with uncertainty estimates.

    ✅ Why this matters
    - Offers detailed projections of extreme climate events for precise risk quantification.
    - Delivers probabilistic forecasts, improving risk modeling and scenario planning.
    - Provides another high-resolution modeling approach, enriching ensemble strategies for climate risk projections.

    👉 Read the full paper: https://lnkd.in/gU6qmZTR
    👉 An excellent explainer blog: https://lnkd.in/gAEJFEV2

    If your work involves climate risk assessment, adaptation planning, or quantitative modeling, how are you leveraging high-resolution risk projections?
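The residual-learning idea described above can be sketched in a few lines. This is a toy illustration only, not the paper's implementation: the grids, scale factor, and noising step are placeholder assumptions, and the actual diffusion network is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins: a coarse RCM field (think ~45 km) and a high-res target (~9 km).
coarse = rng.normal(size=(8, 8))
target = np.kron(coarse, np.ones((5, 5))) + 0.1 * rng.normal(size=(40, 40))

# Step 1: upsample the coarse field to the target grid (nearest-neighbour here).
upsampled = np.kron(coarse, np.ones((5, 5)))

# Step 2: the diffusion model is trained on the *residual*, not the full field,
# so it only has to learn the fine-scale detail the coarse model cannot resolve.
residual = target - upsampled

# Step 3 (training-objective sketch): corrupt the residual at a sampled noise
# level and have a network predict the noise, conditioned on the upsampled
# field and static terrain (network and terrain omitted in this toy).
t = 0.5
noise = rng.normal(size=residual.shape)
noisy_residual = np.sqrt(1 - t) * residual + np.sqrt(t) * noise
# loss = || model(noisy_residual, t, cond=upsampled) - noise ||^2

# At inference, a sampled residual is added back onto the upsampled field
# (perfect-model limit shown for illustration):
downscaled = upsampled + residual
print(np.allclose(downscaled, target))  # True
```

The point of the residual formulation is that the generative model never has to reproduce the large-scale state the RCM already gets right; it only fills in what the coarse grid misses.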

  • View profile for Mike Pritchard

    Director of Climate Simulation Research at NVIDIA and Professor at UC Irvine

    3,305 followers

    Excited to share new NVIDIA Earth-2 research on km-scale global climate foundation modeling with generative AI. "Climate In a Bottle" avoids autoregression and leverages conditional diffusion for flexibility, multi-modal generation, and cascaded super-resolution.

    Preprint: https://lnkd.in/ggYWnse9
    Code: https://lnkd.in/grCEMrxb

    I find it frankly mind-boggling to be able to synthesize a dozen ~13M-pixel atmospheric channels at planetary scales from only 200 kB of climate-controlling input boundary conditions. And mixing information from ERA5 reanalysis vs. ICON simulation modalities is very fun.

    Acknowledging a fantastic team: Noah Brenowitz, Tao Ge, Akshay Subramaniam, Aayush Gupta, David Hall, Morteza Mardani, Arash Vahdat and Karthik Kashinath.
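The cascade structure mentioned above (generate coarse, then repeatedly super-resolve) can be sketched with stand-in functions; this is a hypothetical toy, with `upsample2x` and `refine` as placeholders for the actual learned diffusion stages and conditioning.

```python
import numpy as np

rng = np.random.default_rng(1)

def upsample2x(field):
    """Nearest-neighbour 2x upsampling (placeholder for a learned stage)."""
    return np.kron(field, np.ones((2, 2)))

def refine(field, scale):
    """Stand-in for a conditional diffusion stage: add fine-scale detail
    conditioned on the coarser field (here just small random detail)."""
    return field + 0.1 / scale * rng.normal(size=field.shape)

# Stage 0: a small conditioning input produces a coarse global state
# (random here, in place of the base generative model).
field = rng.normal(size=(16, 16))

# Cascade: each stage doubles resolution, conditioned on the previous output.
for scale in (1, 2, 4):
    field = upsample2x(field)
    field = refine(field, scale)

print(field.shape)  # (128, 128): 16 -> 32 -> 64 -> 128
```

The design point is that no single model has to generate the full-resolution field at once; each stage only adds one octave of detail, conditioned on the stage below it.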

  • View profile for Yossi Matias

    Vice President, Google. Head of Google Research.

    54,276 followers

    Precipitation is one of the most challenging variables to simulate accurately in global climate models, as it depends on small-scale physical processes. In our latest research published in Science Advances, we describe an advancement in our hybrid atmospheric model, NeuralGCM, which now leverages AI trained directly on NASA satellite observations to improve global precipitation simulations.

    Key results of this work:
    👉 Physics-AI Integration: the model combines a traditional fluid dynamics solver for large-scale processes with AI neural networks that learn to account for the effects of small-scale physics, specifically precipitation.
    👉 Improved Extremes: NeuralGCM demonstrates significant improvements in capturing the intensity of the top 0.1% of extreme rainfall events, representing heavy precipitation better than many traditional models.
    👉 Long-Term Accuracy: in multi-year simulations, the model achieved a 40% average error reduction over land compared to leading atmospheric models used in the latest Intergovernmental Panel on Climate Change (IPCC) report.
    👉 Daily Patterns: it more accurately reproduces the timing of peak daily precipitation, which is critical for hydrology and agricultural planning.

    We are already seeing the value of this approach in the field. A partnership between the University of Chicago and the Indian Ministry of Agriculture recently used NeuralGCM in a pilot program to help predict the onset of the monsoon season.

    NeuralGCM is part of our Earth AI program to better understand the physical Earth in ways that benefit society. We have made the code and model checkpoints openly available to the community.

    Read the full details on the Google Research blog by Janni Yuval: goo.gle/4qH63sU
    Paper: https://lnkd.in/d7E4US4W
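The hybrid physics-AI split described in the first bullet can be sketched as a time-stepping loop; this is a minimal conceptual toy, not NeuralGCM itself. The damping "solver", the tiny network, and all sizes are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

def physics_tendency(state):
    """Resolved large-scale dynamics from a traditional solver
    (toy stand-in: simple linear damping)."""
    return -0.1 * state

def learned_tendency(state, params):
    """Learned correction for unresolved small-scale physics, e.g.
    precipitation effects (toy stand-in: a one-layer network)."""
    return np.tanh(state @ params["w"]) @ params["w_out"]

params = {
    "w": 0.01 * rng.normal(size=(4, 8)),
    "w_out": 0.01 * rng.normal(size=(8, 4)),
}

state = rng.normal(size=(4,))
dt = 0.1
for _ in range(100):
    # Hybrid step: the physics core handles resolved scales, the network
    # adds a learned subgrid correction. Both see the same state.
    state = state + dt * (physics_tendency(state) + learned_tendency(state, params))

print(np.all(np.isfinite(state)))  # stable multi-step rollout
```

The key property this structure buys is that long-term stability and conservation come mostly from the physics core, while the network only corrects what the solver cannot resolve.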

  • View profile for Jorge Bravo Abad

    AI/ML for Science & DeepTech | Prof. of Physics at UAM | Author of “IA y Física” & “Ciencia 5.0”

    28,996 followers

    AI finds a missing equation for simulating atmospheric and oceanic turbulence

    Climate models and weather forecasts simulate turbulent flows spanning scales from thousands of kilometers down to meters. No computer can resolve all of them, so modelers approximate the effect of unresolved small scales on the large-scale dynamics. This approximation is known as a subgrid-scale closure: a model that "closes" the governing equations by filling in what the coarse grid cannot see. Getting closures right matters enormously: their shortcomings are a leading source of uncertainty in climate projections and extreme weather forecasts.

    For decades, the field has faced a trade-off. One family of closures faithfully reconstructs the small-scale stress patterns but makes simulations blow up. The other keeps simulations stable but oversimplifies the physics: removing too much energy, ignoring backscatter from small to large scales, and underestimating extreme events.

    Karan Jakhar, Yifei Guan, and Pedram Hassanzadeh break this impasse by changing what equation discovery optimizes for. Previous sparse-regression searches consistently landed on the same second-order approximation known since the 1970s, which is accurate but unstable. The key insight: if you also require the discovered equation to reproduce how energy flows between scales, not just match local stress patterns, the algorithm finds something different.

    Searching 930 candidate terms with this physics-informed dual criterion, Bayesian sparse regression robustly identifies an additional fourth-order term in the Taylor expansion of the subgrid stress (NGM4). NGM4 achieves ~0.99 pattern correlation with reference data, produces stable simulations across four diverse 2D turbulence setups mimicking atmospheric and oceanic dynamics, and accurately captures both bulk statistics and rare extremes. Its coefficients depend only on grid resolution (no tuning for flow regime or Reynolds number), and it needs just 100 training snapshots.

    The most striking aspect: NGM4 could have been derived analytically decades ago, but because the source of the second-order instability was unclear, higher-order terms were never explored. It took sparse regression guided by the right physics to reveal that the missing piece had been hiding in plain sight.

    One takeaway that extends well beyond turbulence: the criterion you optimize for determines what you discover. Embedding the right physics into equation discovery can uncover interpretable, generalizable equations that purely data-driven approaches systematically miss.

    Paper: https://lnkd.in/exGQGaGc

    #MachineLearning #Turbulence #ClimateModeling #EquationDiscovery #AIforScience #LargeEddySimulation #GeophysicalFluidDynamics #SparseRegression #PhysicsInformedAI #SubgridModeling #DeepLearning #ComputationalPhysics #ExtremeEvents #WeatherPrediction #AIforClimate
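The sparse-regression machinery behind this kind of equation discovery can be illustrated with a toy. Note the hedge: the paper uses Bayesian sparse regression with an additional inter-scale energy-transfer criterion; the sketch below uses plain sequentially thresholded least squares (STLSQ) on synthetic data, with an invented 10-term library standing in for the 930 candidate terms.

```python
import numpy as np

rng = np.random.default_rng(3)

# Candidate library: each column is one candidate closure term evaluated on
# the data (in the paper, Taylor-expansion terms of the subgrid stress).
n_samples, n_terms = 500, 10
library = rng.normal(size=(n_samples, n_terms))

# Synthetic "truth": the closure uses only terms 2 and 7 (sparse).
true_coef = np.zeros(n_terms)
true_coef[[2, 7]] = [1.5, -0.8]
stress = library @ true_coef + 0.01 * rng.normal(size=n_samples)

# STLSQ: alternate least-squares fits with hard thresholding of small
# coefficients until only the dominant terms survive.
coef = np.linalg.lstsq(library, stress, rcond=None)[0]
for _ in range(5):
    small = np.abs(coef) < 0.1           # prune negligible terms
    coef[small] = 0.0
    active = ~small
    coef[active] = np.linalg.lstsq(library[:, active], stress, rcond=None)[0]

print(np.flatnonzero(coef))  # [2 7]: the sparse closure terms are recovered
```

The post's takeaway maps directly onto this loop: whatever residual the least-squares step minimizes is what gets discovered, which is why adding an energy-transfer constraint to the objective changes which terms survive.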

  • View profile for Jozef Pecho

    Climate/NWP Model & Data Analyst at Floodar (Meratch), GOSPACE LABS | Predicting floods, protecting lives

    3,097 followers

    🌍 Climate scientists often face a trade-off: Global Climate Models (GCMs) are essential for long-term climate projections, but they operate at coarse spatial resolution, making them too crude for regional or local decision-making. To get fine-scale data, researchers use Regional Climate Models (RCMs). These add crucial spatial detail, but come at a very high computational cost, often requiring supercomputers to run for months.

    ➡️ A new paper introduces EnScale, a machine learning framework that offers an efficient and accurate alternative to running full RCM simulations. Instead of solving the complex physics from scratch, EnScale "learns" the relationship between GCMs and RCMs by training on existing paired datasets. It then generates high-resolution, realistic, and diverse regional climate fields directly from GCM inputs.

    What makes EnScale stand out?
    ✅ It uses a generative ML model trained with a statistically principled loss (the energy score), enabling probabilistic outputs that reflect natural variability and uncertainty.
    ✅ It is multivariate: it learns to generate temperature, precipitation, radiation, and wind jointly, preserving spatial and cross-variable coherence.
    ✅ It is computationally lightweight: training and inference are up to 10–20× faster than state-of-the-art generative approaches.
    ✅ It includes an extension (EnScale-t) for generating temporally consistent time series, a must for studying events like heatwaves or prolonged droughts.

    This approach opens the door to faster, more flexible generation of regional climate scenarios, essential for risk assessment, infrastructure planning, and climate adaptation, especially where computational resources are limited.

    📄 Read the full paper: EnScale: Temporally-consistent multivariate generative downscaling via proper scoring rules ---> https://lnkd.in/dQr5rmWU (code: https://lnkd.in/dQk_Jv8g)

    👏 Congrats to the authors: a strong step forward for ML-based climate modeling!
#climateAI #downscaling #generativeAI #machinelearning #climatescience #EnScale #RCM #GCM #ETHZurich #climatescenarios
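The energy score mentioned above is a standard proper scoring rule for multivariate ensembles, and computing it is short enough to show. This is a generic sketch of the score itself, not EnScale's training code; the ensembles below are invented toy data.

```python
import numpy as np

rng = np.random.default_rng(4)

def energy_score(ensemble, obs):
    """Energy score for an ensemble forecast of a multivariate quantity.
    Lower is better; it is minimized in expectation by the true
    distribution, which is what makes it usable as a generative loss."""
    # Term 1: mean distance from ensemble members to the observation.
    term1 = np.mean(np.linalg.norm(ensemble - obs, axis=1))
    # Term 2: mean pairwise distance between members (rewards spread,
    # so the model cannot cheat by collapsing to a single deterministic map).
    diffs = ensemble[:, None, :] - ensemble[None, :, :]
    term2 = np.mean(np.linalg.norm(diffs, axis=-1))
    return term1 - 0.5 * term2

obs = rng.normal(size=3)                        # one multivariate observation
calibrated = obs + 0.1 * rng.normal(size=(50, 3))  # ensemble centred on truth
biased = obs + 2.0 + 0.1 * rng.normal(size=(50, 3))

# The well-calibrated ensemble scores lower (better) than the biased one.
print(energy_score(calibrated, obs) < energy_score(biased, obs))  # True
```

Because the second term penalizes over-confident, collapsed ensembles, minimizing this score pushes a generator toward outputs that reflect genuine variability, which is exactly the probabilistic behavior the post highlights.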
