Science Forecasting Models
Explore top LinkedIn content from expert professionals.
-
Further progress in AI+climate modeling: "Applying the ACE2 Emulator to SST Green's Functions for the E3SMv3 Global Atmosphere Model". Building on the ACE2 model, which uses our spherical Fourier neural operator (SFNO) architecture, this work shows that ACE2 can replicate climate-model responses to sea surface temperature perturbations with high fidelity at a fraction of the cost. This accelerates climate sensitivity research and helps us better understand radiative feedbacks in the Earth system. Background: the SFNO architecture was first used to train the FourCastNet weather model, whose latest version (v3) has state-of-the-art probabilistic calibration. AI+Science is not just about blindly applying the standard transformer/CNN "hammer". It is about carefully designing neural architectures that incorporate domain constraints such as geometry and multiple scales, while remaining expressive and easy to train. SFNO accomplishes both: it incorporates multiple scales and it respects the spherical geometry, which is critical for success in climate modeling. Unlike short-term weather prediction, which requires only a few autoregressive steps per rollout, climate modeling requires long rollouts of thousands of time steps or more. Other AI-based models, including Pangu and GraphCast, ignore the spherical geometry and fail at long-term climate modeling: because they treat the domain as a rectangle, distortions build up at the poles and lead to catastrophic failures. Structure matters in AI+Science!
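To make the geometry point concrete, here is a minimal sketch of an SFNO-style spectral block: fields are moved into spherical-harmonic space, mixed with learnable per-mode weights, and transformed back. It assumes the interface of NVIDIA's open-source torch_harmonics package (RealSHT/InverseRealSHT); shapes, attribute names, and initialization are illustrative, and this is not the ACE2 or FourCastNet implementation.
```python
# Minimal SFNO-style spectral block (illustrative sketch, not ACE2 code).
# Assumes NVIDIA's torch_harmonics package; exact signatures/attributes
# may differ across versions.
import torch
import torch.nn as nn
import torch_harmonics as th

class SphericalSpectralBlock(nn.Module):
    def __init__(self, nlat: int, nlon: int, channels: int):
        super().__init__()
        # Forward/inverse spherical harmonic transforms replace the FFTs a
        # flat-grid model would use, so the sphere is never treated as a
        # rectangle and no distortion accumulates at the poles.
        self.sht = th.RealSHT(nlat, nlon, grid="equiangular")
        self.isht = th.InverseRealSHT(nlat, nlon, grid="equiangular")
        # One learnable complex weight per channel and spectral mode,
        # i.e. a global convolution acting on all scales at once.
        self.weight = nn.Parameter(
            0.02 * torch.randn(channels, self.sht.lmax, self.sht.mmax,
                               dtype=torch.cfloat))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, nlat, nlon) fields on the lat-lon grid
        coeffs = self.sht(x)            # to spherical-harmonic space
        coeffs = coeffs * self.weight   # pointwise spectral mixing
        return self.isht(coeffs)        # back to the grid
```
Because the transform is defined on the sphere itself, errors have no rectangular grid on which to pile up at the poles, which is what makes thousands-of-steps autoregressive rollouts viable.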
-
There is a massive archive of flash flood data hidden inside the news. 📰
Unlike earthquakes, which are tracked by a global network of sensors, the world is living in a “data desert” when it comes to smaller, fast-moving events like flash floods. This data gap is a major hurdle for climate resilience. Without a historical baseline, it’s nearly impossible to predict future hazards at a local level.
To solve this, my colleagues at Google Research are introducing a new approach called Groundsource. It uses Gemini to turn 25 years of news articles, government reports, and local bulletins into a historical flash flooding map. The team created a massive global archive of 2.6 million flood events across 150 countries. Using this data, they can provide near-global urban flash flood forecasts up to 24 hours before an event. These forecasts are now being rolled out in Google’s Flood Hub, our free and public platform that provides AI-driven flood forecasts to help communities stay safe.
For me, the most exciting part is seeing how AI can transform the world’s “unstructured memory” into a robust scientific baseline. Manually extracting this information at scale would be impossible, but using AI to organize relevant historical data lets us build a more resilient future where communities everywhere can better prepare for flash floods.
Learn more about how we’re turning the news into life-saving data. 👇 goo.gle/4ryKfzH
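The underlying mechanic described here, an LLM reading a news article and emitting a structured event record, can be sketched in a few lines. Everything below (the prompt, the JSON schema, the model name) is an illustrative assumption of mine, not Google's actual Groundsource pipeline.
```python
# Hedged sketch of LLM-based event extraction from news text. The prompt,
# schema, and model name are illustrative assumptions, not Groundsource.
import json
import google.generativeai as genai

genai.configure(api_key="YOUR_API_KEY")
model = genai.GenerativeModel("gemini-1.5-flash")

PROMPT = (
    "From the news article below, extract any flash-flood event as JSON "
    'with keys "date", "location", "country", and "description". '
    'Reply with {"event": null} if no flash flood is described.\n\nArticle:\n'
)

def extract_flood_event(article_text: str) -> dict:
    response = model.generate_content(PROMPT + article_text)
    # A real pipeline would validate the JSON and deduplicate events
    # reported by multiple outlets before adding them to an archive.
    return json.loads(response.text)
```
Run over millions of articles, records like these can be aggregated into the kind of historical event archive the post describes.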
-
New paper – A foundation model for the Earth system
Abstract: “Reliable forecasting of the Earth system is essential for mitigating natural disasters and supporting human progress. Traditional numerical models, although powerful, are extremely computationally expensive. Recent advances in artificial intelligence (#AI) have shown promise in improving both predictive performance and efficiency, yet their potential remains underexplored in many Earth system domains. Here we introduce Aurora, a large-scale foundation model trained on more than one million hours of diverse geophysical data. Aurora outperforms operational forecasts in predicting air quality, ocean waves, tropical cyclone tracks and high-resolution #weather, all at orders of magnitude lower computational cost. With the ability to be fine-tuned for diverse applications at modest expense, Aurora represents a notable step towards democratizing accurate and efficient Earth system predictions. These results highlight the transformative potential of AI in environmental forecasting and pave the way for broader accessibility to high-quality #climate and #weather information.”
Bodnar, C., Bruinsma, W.P., Lucic, A. et al. A foundation model for the Earth system. Nature 641, 1180–1187 (2025). https://lnkd.in/eh8wQ2wx
-
🌍✨ Revolutionizing Paleoclimate Studies with Physics-Informed Neural Networks (PINNs) ✨🌍
Excited to share insights from a groundbreaking study by Constanza A. Molina Catricheo, Fabrice Lambert, Julien Salomon, and Elwin van ’t Wout! Their work leverages Physics-Informed Neural Networks (PINNs) to reconstruct global maps of atmospheric dust deposition during the Holocene and Last Glacial Maximum (LGM). This innovative approach combines machine learning with physical modeling to address challenges such as sparse and uneven paleoclimate data. Unlike traditional methods such as kriging, which can produce physically unrealistic results, PINNs incorporate physical laws (like advection-diffusion equations) into the learning process (a minimal sketch follows this post).
Here’s what stood out:
✅ Improved Physical Realism: PINNs captured dust plumes flowing with prevailing winds, avoiding unphysical upwind flows.
✅ Data-Sparse Regions: Achieved smoother, more accurate reconstructions in areas with limited measurements, such as the southern oceans.
✅ Efficiency: Delivered high-quality results at a fraction of the computational cost of coupled climate simulations.
💡 Why it matters: Atmospheric dust plays a critical role in ecosystems and climate, influencing biogeochemical cycles and surface temperature. Enhancing our ability to model its distribution provides key insights into Earth's past and future climate.
📖 Curious to learn more? Check out their research: https://lnkd.in/dueWMJHy
Let’s discuss: how do you see PINNs shaping the future of data-driven geosciences? 🚀
#PhysicsInformedNeuralNetworks #MachineLearning #Paleoclimate #Geosciences #ClimateModeling #Innovation
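For readers new to PINNs, the core trick is compact: automatic differentiation lets you penalize a PDE residual at arbitrary collocation points alongside the data misfit at sparse measurement sites. A minimal PyTorch sketch for a steady 2D advection-diffusion equation follows; the wind components, diffusivity, and network size are placeholders, not the paper's configuration.
```python
# Minimal PINN sketch for steady 2D advection-diffusion (placeholders
# throughout; not the configuration used in the paper).
import torch
import torch.nn as nn

net = nn.Sequential(nn.Linear(2, 64), nn.Tanh(),
                    nn.Linear(64, 64), nn.Tanh(),
                    nn.Linear(64, 1))

def pde_residual(xy, u_wind, v_wind, kappa):
    """Residual of u_wind*du/dx + v_wind*du/dy - kappa*(u_xx + u_yy) = 0."""
    xy = xy.clone().requires_grad_(True)
    u = net(xy)
    du = torch.autograd.grad(u.sum(), xy, create_graph=True)[0]
    u_x, u_y = du[:, 0:1], du[:, 1:2]
    u_xx = torch.autograd.grad(u_x.sum(), xy, create_graph=True)[0][:, 0:1]
    u_yy = torch.autograd.grad(u_y.sum(), xy, create_graph=True)[0][:, 1:2]
    return u_wind * u_x + v_wind * u_y - kappa * (u_xx + u_yy)

def pinn_loss(xy_obs, dust_obs, xy_colloc, u_wind, v_wind, kappa=0.1):
    misfit = ((net(xy_obs) - dust_obs) ** 2).mean()   # sparse sediment cores
    physics = (pde_residual(xy_colloc, u_wind, v_wind, kappa) ** 2).mean()
    return misfit + physics   # in practice the two terms are weighted
```
The physics term is what forbids the unphysical upwind plumes that purely statistical interpolators such as kriging can produce.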
-
🌧️ Rainfall data analysis as a fundamental input for advanced hydrological modelling.
Rainfall data is the governing variable in hydrological studies, as it directly affects the estimation of surface runoff, the hydrological response of basins, and the accuracy of mathematical model outputs used in flood risk assessment and water infrastructure design.
📊 The hydrological importance of rainfall analysis
Accurate analysis of rainfall data aims to:
• Describe the statistical characteristics of rainfall (frequency, intensity, variability)
• Represent the temporal and spatial distribution of precipitation
• Identify design storms
• Reduce uncertainty in hydrological models
🧠 Advanced statistical analysis of rainfall
The choice of statistical method depends on the nature of the data and the length of the time series. The most prominent methods are:
🔹 Frequency Analysis: application of probability distributions such as Gumbel Extreme Value Type I, Log-Pearson Type III, and the Generalised Extreme Value (GEV) distribution, with goodness-of-fit testing via Kolmogorov–Smirnov, Chi-Square, or Anderson–Darling tests (a worked example follows this post).
🔹 Intensity–Duration–Frequency (IDF) Curves: derivation of mathematical relationships between intensity (I), duration (D), and frequency (T); these form the basis for the design of stormwater drainage networks and urban infrastructure.
⏱️ Temporal Analysis: time series analysis to detect long-term trends and the impact of climate change on precipitation patterns, using tests such as Mann–Kendall and Sen's Slope Estimator.
🌍 Spatial Rainfall Analysis: because precipitation is heterogeneous, rainfall is spatially represented using Thiessen Polygons, Inverse Distance Weighting (IDW), or Kriging (geostatistical methods). Integration with geographic information systems (GIS) is an essential step in improving rainfall representation at the catchment level.
💧 Linking rainfall and hydrological models
Rainfall analysis results feed directly into:
• The Rational Method (for small basins with rapid response)
• The SCS Curve Number method for estimating losses and surface runoff
• Rainfall–runoff models such as HEC-HMS, WMS, and SWMM
⚠️ Technical challenges
• Incomplete or irregular rainfall records
• High spatial variability of storms
• The impact of climate change on the stability of statistical assumptions (stationarity)
Any hydrological model, regardless of its computational accuracy, remains dependent on the quality of the rainfall data analysis fed into it. Rainfall analysis is not a preliminary step, but the essence of the entire hydrological process.
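As a concrete example of the frequency-analysis step referenced above, here is a minimal GEV fit and return-level calculation with scipy. The annual-maximum series is synthetic, and a real study would compare several candidate distributions before choosing one.
```python
# Minimal GEV frequency-analysis sketch using scipy (data are synthetic;
# a real study would also compare distributions and test each fit).
import numpy as np
from scipy import stats

annual_max_mm = np.array([62.0, 81.5, 55.3, 97.2, 70.1, 88.4,
                          64.9, 110.6, 75.8, 59.2, 92.7, 68.3])

shape, loc, scale = stats.genextreme.fit(annual_max_mm)

for T in (2, 5, 10, 25, 50, 100):        # return periods in years
    # Return level: the depth exceeded with probability 1/T in any year.
    depth = stats.genextreme.ppf(1 - 1 / T, shape, loc=loc, scale=scale)
    print(f"{T:>3}-year rainfall: {depth:6.1f} mm")

# Goodness of fit (Kolmogorov-Smirnov against the fitted distribution):
ks = stats.kstest(annual_max_mm, "genextreme", args=(shape, loc, scale))
print("K-S p-value:", ks.pvalue)
```
The return levels produced this way are exactly the quantities that feed IDF curves and design-storm selection.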
-
Precipitation is one of the most challenging variables to simulate accurately in global climate models because it depends on small-scale physical processes. In our latest research published in 𝘚𝘤𝘪𝘦𝘯𝘤𝘦 𝘈𝘥𝘷𝘢𝘯𝘤𝘦𝘴, we describe an advancement in our hybrid atmospheric model, NeuralGCM, which now leverages AI trained directly on NASA satellite observations to improve global precipitation simulations.
Key results of this work:
👉 Physics-AI Integration: The model combines a traditional fluid dynamics solver for large-scale processes with neural networks that learn to account for the effects of small-scale physics, specifically precipitation (see the sketch after this post).
👉 Improved Extremes: NeuralGCM demonstrates significant improvements in capturing the intensity of the top 0.1% of extreme rainfall events, representing heavy precipitation better than many traditional models.
👉 Long-Term Accuracy: In multi-year simulations, the model achieved a 40% average error reduction over land compared to leading atmospheric models used in the latest Intergovernmental Panel on Climate Change (IPCC) report.
👉 Daily Patterns: It more accurately reproduces the timing of peak daily precipitation, which is critical for hydrology and agricultural planning.
We are already seeing the value of this approach in the field. A partnership between the University of Chicago and the Indian Ministry of Agriculture recently used NeuralGCM in a pilot program to help predict the onset of the monsoon season.
NeuralGCM is part of our Earth AI program to better understand the physical Earth in ways that benefit society. We have made the code and model checkpoints openly available to the community.
Read the full details on the Google Research blog by Janni Yuval: goo.gle/4qH63sU
Paper: https://lnkd.in/d7E4US4W
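The hybrid structure is easy to state in sketch form. The function names below are hypothetical placeholders for exposition, not NeuralGCM's actual API; the real code and checkpoints are linked in the post.
```python
# Conceptual sketch of a hybrid physics+AI step (function names are
# hypothetical placeholders, not NeuralGCM's actual API).
def hybrid_step(state, dt, dynamical_core, learned_physics):
    # 1. Resolved dynamics: a traditional fluid solver advances the
    #    large-scale state (winds, temperature, humidity).
    state = dynamical_core(state, dt)
    # 2. Learned physics: a neural network maps the coarse state to
    #    tendencies from unresolved processes such as precipitation,
    #    trained against satellite observations rather than hand-tuned.
    tendencies = learned_physics(state)
    return state + dt * tendencies

def rollout(state, n_steps, dt, dynamical_core, learned_physics):
    # Multi-year climate simulations apply this step autoregressively,
    # so errors in the learned component must stay bounded over time.
    for _ in range(n_steps):
        state = hybrid_step(state, dt, dynamical_core, learned_physics)
    return state
```
The design choice is the key point: the solver guarantees large-scale conservation and dynamics, while the network only fills in what the grid cannot resolve.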
-
AI finds a missing equation for simulating atmospheric and oceanic turbulence
Climate models and weather forecasts simulate turbulent flows spanning scales from thousands of kilometers down to meters. No computer can resolve all of them, so modelers approximate the effect of unresolved small scales on the large-scale dynamics. This approximation is known as a subgrid-scale closure—a model that "closes" the governing equations by filling in what the coarse grid cannot see. Getting closures right matters enormously: their shortcomings are a leading source of uncertainty in climate projections and extreme weather forecasts.
For decades, the field has faced a trade-off. One family of closures faithfully reconstructs the small-scale stress patterns but makes simulations blow up. The other keeps simulations stable but oversimplifies the physics—removing too much energy, ignoring backscatter from small to large scales, and underestimating extreme events.
Karan Jakhar, Yifei Guan, and Pedram Hassanzadeh break this impasse by changing what equation discovery optimizes for. Previous sparse regression searches consistently landed on the same second-order approximation known since the 1970s, which is accurate but unstable. The key insight: if you also require the discovered equation to reproduce how energy flows between scales—not just match local stress patterns—the algorithm finds something different.
Searching 930 candidate terms with this physics-informed dual criterion, Bayesian sparse regression robustly identifies an additional fourth-order term in the Taylor expansion of the subgrid stress (NGM4); a toy illustration of library-based sparse regression follows this post. NGM4 achieves ~0.99 pattern correlation with reference data, produces stable simulations across four diverse 2D turbulence setups mimicking atmospheric and oceanic dynamics, and accurately captures both bulk statistics and rare extremes. Its coefficients depend only on grid resolution—no tuning for flow regime or Reynolds number—and it needs just 100 training snapshots.
The most striking aspect: NGM4 could have been derived analytically decades ago, but because the source of the second-order instability was unclear, higher-order terms were never explored. It took sparse regression guided by the right physics to reveal that the missing piece had been hiding in plain sight.
One takeaway that extends well beyond turbulence: the criterion you optimize for determines what you discover. Embedding the right physics into equation discovery can uncover interpretable, generalizable equations that purely data-driven approaches systematically miss.
Paper: https://lnkd.in/exGQGaGc
#MachineLearning #Turbulence #ClimateModeling #EquationDiscovery #AIforScience #LargeEddySimulation #GeophysicalFluidDynamics #SparseRegression #PhysicsInformedAI #SubgridModeling #DeepLearning #ComputationalPhysics #ExtremeEvents #WeatherPrediction #AIforClimate
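To illustrate the generic mechanics (not the paper's method), a toy sequentially thresholded least-squares loop over a candidate library looks like the sketch below. The actual study uses Bayesian sparse regression over 930 candidate terms with an additional inter-scale energy-transfer criterion, which this simplified stand-in omits.
```python
# Toy stand-in for sparse equation discovery (SINDy-style sequentially
# thresholded least squares). The paper uses Bayesian sparse regression
# plus an energy-transfer criterion; this shows only the generic
# library-regression mechanics.
import numpy as np

def discover(library, target, threshold=0.1, iters=10):
    """library: (n_samples, n_terms) candidate terms evaluated on data;
    target: (n_samples,) filtered subgrid stress to be explained."""
    coefs = np.linalg.lstsq(library, target, rcond=None)[0]
    for _ in range(iters):
        small = np.abs(coefs) < threshold      # prune weak terms
        coefs[small] = 0.0
        active = ~small
        if not active.any():
            break
        coefs[active] = np.linalg.lstsq(library[:, active],
                                        target, rcond=None)[0]
    return coefs  # nonzero entries name the discovered equation's terms
```
The post's takeaway maps directly onto this sketch: change what the regression is asked to reproduce, and the surviving nonzero terms change with it.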
-
After more than 10 years of experience in hydrology, I still see surface runoff as the most visible and misunderstood part of the hydrological cycle.
As rain falls on the land, some evaporates, some infiltrates, and some recharges groundwater. The remaining water flows over the surface as runoff. This simple process controls floods, soil erosion, reservoir inflow, urban drainage, and water quality. Understanding runoff means understanding how a catchment responds to climate, land use, and human activity.
How surface runoff forms. Runoff is generated when:
• Rainfall intensity exceeds infiltration capacity
• Soil becomes saturated
• Land is sealed by roads and buildings
• Slopes accelerate overland flow
This is why rainfall alone never tells the full story.
Simple ways to estimate runoff. For students, consultants, and early-career hydrologists, these methods still matter:
• Runoff coefficient method
• Rational method
• SCS Curve Number method (a worked example follows this post)
• Water balance approach
• Infiltration index methods (phi and W index)
• Unit hydrograph method
• Regional empirical equations
• Time-of-concentration-based estimates
• Excel-based rainfall-runoff calculations
Simple does not mean wrong. Many design decisions rely on these methods every day.
Widely used hydrological models. When scale and complexity increase, models help us organize the hydrological cycle:
• HEC-HMS for event-based flood modeling
• SWAT for long-term basin-scale runoff and land use studies
• MIKE SHE and MIKE 11 for integrated surface and groundwater analysis
• VIC and TOPMODEL for regional and terrain-driven runoff processes
• IHACRES for data-limited catchments
Each model is a tool. None is universal.
AI and machine learning in runoff estimation. Data-driven methods are now common, especially for forecasting:
• Artificial Neural Networks
• Random Forest and Decision Trees
• Support Vector Machines
• Deep learning models such as LSTM
They can predict runoff well but often explain little. Physical understanding still matters.
A simple rule from experience: start simple, match the method to your data, and always verify a model against real data.
Surface runoff is not just a number. It is the heartbeat of a watershed and the link between climate, land, and society. If you work in water, you work with runoff, whether you realize it or not.
#SurfaceRunoff #Hydrology #RainfallRunoff #HydrologicalCycle #WatershedHydrology #HECHMS #SWATModel #HydrologicalModeling #RunoffModeling #FloodModeling #HydrologyAndAI #MachineLearningInHydrology #AIForWater #DataDrivenHydrology #WaterResources #ClimateChangeImpacts #FloodRisk #SustainableWater #WaterSecurity #WaterProfessionals #HydrologyStudents #EnvironmentalEngineering #SWAT #HEC-HMS #AI #Sustainability #Flood #CivilEngineering #ResearchAndPractice #STEM #ScienceCommunication #KnowledgeSharing #LearningEveryday #CFBR
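As the worked example flagged in the list above, the standard SCS Curve Number runoff calculation fits in a few lines; the storm depth and CN value here are illustrative.
```python
# SCS Curve Number runoff depth (standard formulation, metric units).
def scs_runoff_mm(rainfall_mm: float, curve_number: float) -> float:
    s = 25400.0 / curve_number - 254.0  # potential maximum retention (mm)
    ia = 0.2 * s                        # initial abstraction (common default)
    if rainfall_mm <= ia:
        return 0.0                      # all rain lost to abstraction
    return (rainfall_mm - ia) ** 2 / (rainfall_mm + 0.8 * s)

# Example: an 80 mm storm on a suburban catchment (CN = 75, illustrative):
print(round(scs_runoff_mm(80.0, 75), 1), "mm of runoff")
```
Even this simple formula encodes the post's point: the same rainfall produces very different runoff depending on soil and land cover, captured here through CN.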
-
I'm incredibly proud to share a major milestone from our Flood Forecasting team at Google Research today. We are taking a big step forward in addressing one of the most destructive and difficult-to-predict natural disasters: urban flash floods.
For years, predicting flash floods in cities has been limited by a fundamental challenge in hydrology: the lack of historical, on-the-ground data. To solve this, our team developed Groundsource, a novel methodology that leverages Gemini to analyze millions of public news reports, transforming them into a high-quality, actionable dataset for crisis prediction.
Using Groundsource, we’ve trained ML models capable of delivering flash flood forecasts for urban areas. This is a massive expansion of our global flood alerting capabilities and a crucial step toward helping vulnerable communities prepare for extreme weather.
You can read all about the science, the system, and our vision for disaster resilience in three new posts published today:
1️⃣ The big picture from Yossi Matias on how Groundsource is boosting disaster resilience: 🔗 https://lnkd.in/eqQQJ4y6
2️⃣ The technical deep-dive into Groundsource and how we use Gemini to turn news into scientific data: 🔗 https://lnkd.in/e-Agtmwp
3️⃣ How we are applying this data to power AI-driven flash flood forecasting in cities around the world: 🔗 https://lnkd.in/eQdmMwR3
A huge congratulations to Oleg Zlydenko, Deborah Cohen, Rotem Mayo, Frederik Kratzert and everyone across the team who made this possible. It's a privilege to work alongside this group to translate fundamental AI research into life-saving early warning systems.
#AIforGood #FloodForecasting #Hydrology #MachineLearning #Gemini #GoogleResearch #ClimateAdaptation #EarlyWarningSystems
-
Forecasting Floods Smarter, Faster, and More Reliably — with AI. How a hybrid deep learning system outperformed both physics-based and global AI flood models.
Floods are among the world’s most damaging and frequent natural disasters — and as climate extremes intensify, accurate forecasting is more important than ever. A new study led by Vinh Ngoc Tran presents a major advance in this area: a deep learning framework called Errorcastnet (ECN), which blends physics-based hydrological modeling with artificial intelligence. Instead of replacing traditional models, ECN learns from their errors — specifically, from the U.S. National Water Model (NWM) — and delivers faster, more accurate flood forecasts at a continental scale (a sketch of the error-learning idea follows this post).
What sets ECN apart?
➡️ 4–6× more accurate predictions (1–10 days ahead)
➡️ Superior uncertainty estimation and reliability
➡️ Up to 4× greater economic decision-making value
➡️ Ensemble forecasts in minutes, not hours
➡️ Outperforms Google’s global AI flood model, especially in complex watersheds with human intervention (dams, reservoirs, diversions)
✅ The key innovation? Hybrid modeling — combining the strengths of physics-based science with the flexibility and speed of machine learning. This approach supports early warning systems that are not just fast but trustworthy — a critical step toward climate resilience, emergency planning, and long-term water security.
✅ Code is open-source and available here: https://lnkd.in/dQBEyFPb
✅ Full paper: AI Improves the Accuracy, Reliability, and Economic Value of Continental-Scale Flood Predictions: https://lnkd.in/dt6ise8w
This work is a powerful example of how AI can augment — not replace — scientific models to create smarter tools for the real world.
#AI #FloodForecasting #Hydrology #ClimateTech #DisasterPreparedness #EarlyWarningSystems #MachineLearning #ClimateAdaptation #OpenScience #WaterResources #ResearchToPractice
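The error-learning idea referenced above generalizes beyond this paper and is simple to sketch: train a regressor on the physics model's historical residuals, then add the predicted residual back to the physics forecast. The feature set and regressor below are my illustrative choices, not ECN's architecture.
```python
# Hedged sketch of the error-learning idea behind hybrid forecasting
# (variable names and the regressor choice are illustrative, not ECN's).
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

def fit_error_model(features, nwm_forecast, observed):
    # features: e.g. basin attributes, recent observations, and the NWM
    # forecast itself, shaped (n_samples, n_features).
    residual = observed - nwm_forecast       # what the physics got wrong
    model = GradientBoostingRegressor()
    model.fit(features, residual)
    return model

def corrected_forecast(model, features, nwm_forecast):
    # Physics-based forecast plus the learned error correction.
    return nwm_forecast + model.predict(features)
```
Because the network only has to model the error, the physics model's strengths (mass balance, routing, known infrastructure) are retained for free, which is the augment-not-replace point the post closes on.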