I just published a research paper that challenges how we model risk. And the result will make most project managers uncomfortable. ↓

Standard Monte Carlo assumes risks fire independently. They don't. They fire in chains.

Risk A delays procurement. Procurement delay pushes mobilisation. Mobilisation delay compresses testing. Compressed testing forces rework. Rework blows contingency.

That's not bad luck. That's a cascade. And your risk register cannot see it.

━━━━━━━━━━━━━━━━

I spent months building a framework to model exactly this: Probabilistic Chain Analysis (PCA).

The result from a UK highways case study:

▸ One pre-mobilisation intervention
▸ £250,000 reduction in P90 cost exposure
▸ £15,000 management cost
▸ 16.7x return on risk management effort

Not because we worked harder. Because we looked at the right node.

━━━━━━━━━━━━━━━━

The methodology:

→ Map risks as a Directed Acyclic Graph (not a flat register)
→ Assign conditional probabilities using Bayesian Networks
→ Run a coupled Monte Carlo simulation
→ Identify the cascade lift factor: which node is amplifying everything?
→ Intervene there. Not everywhere.

━━━━━━━━━━━━━━━━

The paper is 27 pages. Open access. No paywall.

Validated across 16,000 infrastructure projects across 8 sectors. UK highways. Solar EPC. Hospitals. Power plants. Same pattern every time.

The most dangerous risk isn't the most probable one. It's the most connected one.

━━━━━━━━━━━━━━━━

📄 Full paper (free): in comments

━━━━━━━━━━━━━━━━

Have you ever seen a cascade take down a project that looked fine on paper? Drop it in the comments. I read every one.

#ProjectManagement #RiskManagement #MonteCarlo #BayesianNetworks #Infrastructure #EPC #ProjectControls #PMP #Quantitative
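The post's PCA model isn't reproduced here, but its core claim (coupled risks on a DAG fatten the cost tail far beyond what independent sampling shows) can be sketched with a toy coupled Monte Carlo. All probabilities, lift factors, and cost figures below are invented for illustration; the real model derives them from Bayesian-network CPTs.

```python
import random

# Hypothetical base probabilities, conditional "lifts", and cost impacts.
# In the paper these would come from calibrated Bayesian-network CPTs.
BASE_P = {"procurement": 0.20, "mobilisation": 0.10,
          "testing": 0.10, "rework": 0.05}
# If the parent risk fires, the child's probability is multiplied
# by the lift factor (capped at 1.0) -- this is the cascade coupling.
LIFT = {"mobilisation": ("procurement", 3.0),
        "testing": ("mobilisation", 3.0),
        "rework": ("testing", 4.0)}
COST = {"procurement": 100_000, "mobilisation": 150_000,
        "testing": 80_000, "rework": 250_000}

def simulate_p90(n=20_000, seed=1, coupled=True):
    """P90 cost exposure over n trials, with or without chain coupling."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n):
        fired, cost = {}, 0.0
        for risk in ("procurement", "mobilisation", "testing", "rework"):
            p = BASE_P[risk]
            if coupled and risk in LIFT:
                parent, lift = LIFT[risk]
                if fired[parent]:
                    p = min(1.0, p * lift)
            fired[risk] = rng.random() < p
            if fired[risk]:
                cost += COST[risk]
        totals.append(cost)
    totals.sort()
    return totals[int(0.9 * n)]  # 90th-percentile cost

p90_independent = simulate_p90(coupled=False)
p90_coupled = simulate_p90(coupled=True)
```

With identical random draws, the coupled run can only fire a superset of the independent run's risks, so its P90 is never lower: a flat register that samples risks independently systematically understates tail exposure.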
Bayesian Analysis in Engineering
Summary
Bayesian analysis in engineering is a statistical approach that uses prior knowledge and new data to continuously update and refine probability estimates, helping engineers make smarter decisions even when data is scarce or uncertain. This method is increasingly used in project risk management, reliability prediction, and parameter estimation to address complex challenges where traditional techniques struggle.
- Update as you learn: Continuously adjust your probability estimates by integrating new evidence, so your decisions reflect the latest information instead of static assumptions.
- Start with expert input: Begin your analysis with reasonable baseline estimates from expert judgment or similar past cases, then refine these as more data becomes available.
- Focus on connections: In risk management, identify and intervene on the most interconnected risks rather than the most obvious ones, since cascading effects often drive project outcomes.
If you work with the NorSand constitutive model, you know that calibrating elasticity and hardening parameters via visual fitting can be a tedious and subjective task. In our new paper, we propose a better way: a Bayesian approach that integrates experimental triaxial data with predefined priors to objectively estimate the most likely parameter values. This method removes the bias of manual fitting and quantifies the uncertainty in your parameters. We validated the methodology using Fraser River sand and have released the Python code so you can try it on your own datasets.

🔗 Paper: https://lnkd.in/eub3ShpC
🔗 GitHub Repository: https://lnkd.in/edshx9v9

#Geotechnics #ConstitutiveModeling #DataScience #Engineering #Mining

Thanks to all the co-authors: Luis-Fernando, Humberto and Alexandra
SRK Consulting Geosyntec Consultants Red Earth Engineering A Geosyntec Company The University of Western Australia
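The paper's full NorSand calibration lives in the linked repository; as a minimal sketch of the underlying idea (replacing visual fitting with a posterior over a parameter, given a prior and noisy test data), here is a one-parameter grid posterior. The linear model y = k·x and all numbers are hypothetical stand-ins for the constitutive response.

```python
import math
import random

# Synthetic "triaxial" observations from a hypothetical one-parameter
# model y = k * x with Gaussian measurement noise. The real method
# estimates several elasticity/hardening parameters jointly.
rng = random.Random(0)
k_true, noise_sd = 2.5, 0.3
xs = [0.1 * i for i in range(1, 21)]
ys = [k_true * x + rng.gauss(0.0, noise_sd) for x in xs]

# Grid of candidate k values, with a broad Gaussian prior.
grid = [1.0 + 0.01 * i for i in range(301)]        # k in [1.0, 4.0]
prior_mean, prior_sd = 2.0, 1.0

def log_posterior(k):
    lp = -0.5 * ((k - prior_mean) / prior_sd) ** 2   # log prior (unnormalised)
    for x, y in zip(xs, ys):
        lp += -0.5 * ((y - k * x) / noise_sd) ** 2   # log likelihood
    return lp

logs = [log_posterior(k) for k in grid]
m = max(logs)
weights = [math.exp(v - m) for v in logs]            # stabilised exponentials
z = sum(weights)
post = [w / z for w in weights]                      # normalised posterior

k_map = grid[logs.index(m)]                          # MAP estimate
k_mean = sum(k * p for k, p in zip(grid, post))      # posterior mean
```

The spread of `post` around `k_mean` is exactly the parameter uncertainty the post mentions, which a manual visual fit never quantifies.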
-
A recurring challenge across science & engineering: you need to align a computationally expensive black-box simulator (PDEs, etc.) to data in order to infer hidden parameters like material coefficients or boundary conditions. In many such cases, you don't have access to gradients, adjoints, etc. If you only want point estimates, then Bayesian optimisation (BO) is an option. But if you care about the full posterior distribution, Monte Carlo or MCMC quickly become infeasible. You could fall back on Laplace approximations, but for most PDE-based inverse problems the posteriors are horrible: multimodal, non-identifiable, with tangled geometries reflecting sensitivity scales and invariances. ABC is an option, but it typically requires huge numbers of evaluations and has a tendency to inflate posteriors.

So the homework question was: just as BO uses Gaussian Process surrogates and acquisition strategies to explore costly functions, can we design sampling strategies the same way, to approximate a posterior under a fixed compute budget?

With the brilliant Takuo Matsubara, Simon Cotter, and Konstantinos Zygalakis, we introduce Bandit Importance Sampling (BIS):

• A new class of importance sampling that designs samples directly via multi-armed bandits.
• Combines space-filling sequences (Halton, QMC) with GP surrogates to adaptively focus where evaluations matter most.
• Comes with theoretical guarantees and works well on multimodal, heavy-tailed, and real-world Bayesian inference problems.

Takeaway: BIS can cut evaluations by orders of magnitude. For problems with ~10–20 parameters, it's a very viable option.

Preprint here: https://lnkd.in/egrZX_NJ

Next steps: packaging this up for the community.
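BIS itself is in the preprint; a minimal sketch of the building block it improves on, self-normalised importance sampling, on a toy bimodal target (everything here, target, proposal, and sample budget, is an invented illustration, with the expensive simulator replaced by a cheap closed-form density):

```python
import math
import random

# Toy 1-D bimodal "posterior", known only up to a constant: a mixture
# of two narrow Gaussians at +/-2. In the PDE setting each evaluation
# of this density would cost one expensive simulator run.
def unnorm_target(x):
    return (math.exp(-0.5 * ((x + 2.0) / 0.5) ** 2)
            + math.exp(-0.5 * ((x - 2.0) / 0.5) ** 2))

# Broad Gaussian proposal wide enough to cover both modes.
PROP_SD = 3.0
def proposal_pdf(x):
    return math.exp(-0.5 * (x / PROP_SD) ** 2) / (PROP_SD * math.sqrt(2 * math.pi))

rng = random.Random(0)
xs = [rng.gauss(0.0, PROP_SD) for _ in range(20_000)]
ws = [unnorm_target(x) / proposal_pdf(x) for x in xs]   # importance weights
z = sum(ws)

# Self-normalised estimates of posterior moments.
mean_est = sum(w * x for w, x in zip(ws, xs)) / z
second_moment = sum(w * x * x for w, x in zip(ws, xs)) / z

# Effective sample size: how many of the 20k draws actually count.
ess = z * z / sum(w * w for w in ws)
```

A fixed proposal wastes most of its budget where the posterior has no mass, which is what drives the `ess` below the raw sample count; the bandit/GP machinery in BIS is about choosing where to spend evaluations instead.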
-
Bayesian methods offer a powerful framework for improving decision-making by continuously updating probabilities as new evidence emerges. The Monty Hall problem provides a vivid illustration: while intuition suggests a 50/50 chance after one losing option is revealed, Bayesian analysis shows that switching doors doubles your odds of success, from one-third to two-thirds, by correctly integrating the new information. This same process of adaptive probability adjustment forms the core of Bayesian thinking: every new observation recalibrates your expectations, resulting in smarter choices grounded in data rather than gut feeling.

In modern engineering and product reliability, Bayesian approaches thrive where data is limited or uncertain, allowing teams to start with estimates based on expert opinion or prior product generations. As new test results or field failures arrive, Bayesian models blend this fresh evidence with existing knowledge, producing more accurate, timely reliability predictions. Whether in early product launches, critical failure detection in complex systems, or accelerated life testing, the Bayesian mindset empowers organizations to make informed, defensible decisions that evolve dynamically as reality unfolds.
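The one-third vs two-thirds figures quoted above are easy to verify by simulation; a minimal sketch:

```python
import random

def monty_hall_win_rate(switch, trials=100_000, seed=0):
    """Fraction of Monty Hall games won under a fixed stay/switch policy."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        doors = [0, 1, 2]
        car = rng.choice(doors)
        pick = rng.choice(doors)
        # Host opens a door that is neither the player's pick nor the car.
        opened = rng.choice([d for d in doors if d != pick and d != car])
        if switch:
            # Switch to the one remaining closed door.
            pick = next(d for d in doors if d != pick and d != opened)
        wins += (pick == car)
    return wins / trials

stay = monty_hall_win_rate(switch=False)   # converges to ~1/3
swap = monty_hall_win_rate(switch=True)    # converges to ~2/3
```

The intuition behind the result: switching wins exactly when the initial pick was wrong, which happens two-thirds of the time.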
-
No Two Ways About It: Why Bayesian Thinking Is Non-Negotiable in Risk Management

There are two ways to quantify risk. Frequentist methods derive probability from past frequencies; they work well when data are plentiful and conditions are stable, but they can be misleading when data are limited or the environment shifts. Bayesian methods view probability as your current, evidence-based belief, starting with a reasonable baseline from base rates and expert judgment, then updating it as new data arrives.

How many risks in your portfolio have enough relevant data to quantify easily? Plenty for financial exposures, but far fewer for operational risks, and almost none for strategic risks. When data is limited, "data-only" estimates can be dangerously misleading, and the most critical risks may go unassessed. That's why risk management must rely on Bayesian principles.

Consider a first-of-its-kind risk, such as a regulatory change that could force your company to shut down. You face a choice: complain that this risk can't be assessed due to missing data, or use Bayesian methods, which can start with expert judgment and relevant precedents from comparable jurisdictions. Suppose a 5-10% probability within 12 months and a P90 loss of $280-380 million. Formally, that 5-10% baseline is your prior, and each new piece of information shifts it to a posterior, which you report as a range. When a new public signal on "data sovereignty" arrives, your belief updates to 20-30% and the P90 increases, breaching your risk appetite. Once lawmakers pause the proposal, the probability falls back to 10-15%. There's no frequentist approach here; only Bayesian estimation that updates with each new clue can reduce uncertainty.

The same applies to strategic risks. For instance, you initially have 3 options for entering a market: 1. acquiring a target, 2. forming a joint venture, or 3. building from scratch. Your initial estimate suggests a 20-30% probability you won't reach break-even within 24 months, with a P90 loss of CHF 9-12 million. Early assessments rule out the acquisition. With one option eliminated, the estimates for the remaining two options worsen: the failure risk rises to 30-40%, and the P90 loss to CHF 12-16 million, exceeding your risk appetite (set at CHF 12 million). You don't need more data: you can run a two-region joint venture pilot, add an exit clause, and invest CHF 1.2 million in targeted branding. A quarter later, the pilot performs well; the failure risk drops to 20-26%, and the P90 loss to CHF 11-12 million, bringing it back within your risk appetite.

Most critical risks are inherently Bayesian. If your risk report misses quantified risks due to insufficient data, replace them with "living probabilities" that learn from updated data. Without Bayesian thinking, risk management won't be effective, and your risk portfolio will likely be incomplete.

Institut für Finanzdienstleistungen Zug IFZ Lucerne University of Applied Sciences and Arts
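The prior-to-posterior jumps in the regulatory example can be made concrete with Bayes' rule. The post only gives the resulting ranges, so the likelihoods below are hypothetical, chosen to land the posterior inside the quoted 20-30% band:

```python
# Discrete Bayes update for a yes/no risk event.
# The 0.075 prior is the midpoint of the post's 5-10% baseline;
# both likelihoods are invented for illustration.

def bayes_update(prior, p_signal_given_event, p_signal_given_no_event):
    """Posterior P(event | signal observed) via Bayes' rule."""
    joint_event = prior * p_signal_given_event
    joint_no_event = (1.0 - prior) * p_signal_given_no_event
    return joint_event / (joint_event + joint_no_event)

prior = 0.075
# Assume the public "data sovereignty" signal is 4x more likely to
# appear if the regulatory-shutdown scenario is real than if it is not.
posterior = bayes_update(prior,
                         p_signal_given_event=0.80,
                         p_signal_given_no_event=0.20)
```

With these assumed likelihoods the 7.5% prior rises to roughly 24%, i.e. inside the 20-30% posterior range the post reports; a weaker signal (smaller likelihood ratio) would move the belief proportionally less.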
-
DIBE learning time: Can we really predict the unpredictable on a construction site? What if subtle fluctuations in crane operation, worker behavior, or equipment setup could signal danger before an accident happens?

A new study in Developments in the Built Environment introduces a real-time risk prediction framework for tower crane operations. By combining the Functional Resonance Analysis Method (FRAM) with Bayesian Networks (BN), the model maps how small performance variations interact and evolve into potential hazards. Tested on real construction data, the system revealed that even "low-risk" conditions can quickly drift toward danger, underscoring the need for continuous, data-driven monitoring.

This hybrid FRAM-BN approach marks a step toward predictive safety management, helping site managers move from reacting to accidents to preventing them altogether. Curious how AI, simulation, and systems thinking are redefining safety on construction sites? Read the full paper to learn more.
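The study's FRAM-BN model isn't reproduced here; as a minimal sketch of how a Bayesian network turns observed performance variability into a hazard probability, here is a hypothetical three-node chain evaluated by exact enumeration. Every CPT number is invented; a real model would learn them from site data.

```python
from itertools import product

# Hypothetical chain: wind gusts -> load-sway variability -> hazard.
# All conditional probability table (CPT) entries below are invented.
P_WIND = {True: 0.3, False: 0.7}
P_SWAY_GIVEN_WIND = {True: 0.6, False: 0.1}      # P(sway | wind state)
P_HAZARD_GIVEN_SWAY = {True: 0.4, False: 0.05}   # P(hazard | sway state)

def p_hazard(observed_wind=None):
    """Marginal (or wind-conditioned) hazard probability by enumeration."""
    num = den = 0.0
    for wind, sway, hazard in product([True, False], repeat=3):
        if observed_wind is not None and wind != observed_wind:
            continue  # condition on the observed evidence
        p = (P_WIND[wind]
             * (P_SWAY_GIVEN_WIND[wind] if sway
                else 1 - P_SWAY_GIVEN_WIND[wind])
             * (P_HAZARD_GIVEN_SWAY[sway] if hazard
                else 1 - P_HAZARD_GIVEN_SWAY[sway]))
        den += p
        if hazard:
            num += p
    return num / den

baseline = p_hazard()                  # no observations: 0.1375
windy = p_hazard(observed_wind=True)   # evidence of gusts: 0.26
```

Observing upstream variability nearly doubles the hazard probability in this toy network, which is (with invented numbers) the "low-risk conditions drifting toward danger" effect the study describes.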