Linear Thinking in Complex Systems: When Cause and Effect Break Down
In boardrooms and transformation programs, a familiar logic prevails: diagnose the problem, apply the solution, expect the result.
This linear logic is one of the most powerful habits in organizational life. It is also one of the most dangerous.
In simple or complicated systems, linearity works. In complex systems, it fails—often quietly at first, then dramatically.
This article explores why.
1. The Appeal of Linearity
Linear models dominate management practice because they provide psychological and operational clarity: they allow leaders to state with confidence, “If we do X, then Y will happen.”
This logic underpins business cases, transformation roadmaps, KPI targets, and cost-reduction programs.
For example, a cost-reduction program might assume: “Reducing headcount by 10% will reduce operating costs by 10%.” In a stable production line, this may hold. In an adaptive organization, the effects may instead be reduced informal coordination, loss of tacit knowledge, increased error rates, declining engagement, and unexpected turnover. This is the risk of treating people as cost lines.
The apparent linear intervention triggers non-linear consequences. As Peter Senge argues in The Fifth Discipline, organizations are systems of interdependent elements. Change one part, and you influence the whole—often in unintended ways.
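To make the asymmetry concrete, here is a toy model in Python. It is a minimal sketch with invented coefficients, not an empirical claim: the linear view scales costs directly with headcount, while the systemic view adds a non-linear penalty for lost coordination and rising rework.

```python
# Toy model: a headcount cut under linear vs. systemic assumptions.
# All coefficients are illustrative, invented for this example.

def linear_cost(cut: float, base_cost: float = 100.0) -> float:
    """Linear expectation: costs fall in proportion to headcount."""
    return base_cost * (1 - cut)

def systemic_cost(cut: float, base_cost: float = 100.0) -> float:
    """Same cut, but informal coordination and tacit knowledge degrade
    non-linearly as networks thin out (assumed power-law penalties)."""
    coordination_penalty = 80.0 * cut ** 2     # lost informal coordination
    rework_penalty = 30.0 * cut ** 1.5         # rising error and rework rates
    return base_cost * (1 - cut) + coordination_penalty + rework_penalty

for cut in (0.05, 0.10, 0.20):
    print(f"cut={cut:.0%}  linear={linear_cost(cut):5.1f}  systemic={systemic_cost(cut):5.1f}")
```

Even with modest penalties, realized savings fall short of the linear projection, and the gap widens faster than the cut itself.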
2. Complexity Is Not Complication
A critical distinction must be made between complicated and complex systems.
A complicated system, such as an aircraft engine, can be decomposed, analyzed, and mastered by experts; cause and effect are knowable in advance. A complex system, such as an organization, is adaptive. In such systems, behavior emerges from the interactions of the parts, small changes can produce disproportionate effects, and cause and effect are distant in time and space.
For instance, a company introducing agile methodologies may expect improved responsiveness. Instead, it may experience confusion, role conflict, and decision bottlenecks—because the incentive structure remains aligned with hierarchical control. The issue is not poor execution. It is systemic misalignment.
3. The Fallacy of Root Causes
Linear change models search for the root cause - and as a Lean practitioner, it remains one of my habits.
Low performance? → “Lack of accountability” → Introduce stricter performance metrics.
Missed deadlines? → “Poor planning” → Implement tighter controls.
But in complex systems, problems rarely have a single cause. They emerge from interactions.
Consider the case of the Boeing 737 MAX crisis. Investigations revealed not one isolated cause, but interacting factors: design trade-offs, regulatory dynamics, organizational pressure, communication breakdowns, and more. What appeared to be a technical issue was deeply systemic, and it would not have been alleviated by a single counter-measure or a new KPI. In safety, we call this "aligning the holes in the cheese" (the Swiss cheese model), because any single breakdown alone would probably not have caused the crisis.
Focusing on isolated causes often leaves the generative structure untouched. The symptom shifts. The pattern persists.
This is what systems thinkers call structural causality—the idea that behavior emerges from system structure, not isolated events.
4. Delays, Feedback, and Surprise
Linear thinking struggles with time.
In complex systems, effects are delayed, feedback loops compound, and the consequences of an intervention often surface long after the intervention itself.
A classic example comes from Jay Forrester’s work on industrial systems. Efforts to increase production capacity often resulted in oscillations—overexpansion followed by contraction—due to delayed feedback in supply and demand signals. You can observe the same pattern at market level: the famous "cycles" in petrochemicals, for instance.
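The mechanism is easy to reproduce. The minimal simulation below is an uncalibrated sketch in the spirit of Forrester's stock-management problem (all parameters invented): managers order capacity against the current gap, but new capacity arrives only after a delay.

```python
# Minimal delayed-feedback simulation: ordering against the current gap
# while deliveries lag by DELAY periods produces oscillation, not convergence.
# All parameters are illustrative.

demand = 100.0      # steady demand
capacity = 80.0     # starts below demand
pipeline = []       # capacity ordered but not yet commissioned
DELAY = 4           # periods between ordering and commissioning
ADJUST = 0.5        # how aggressively managers close the gap

for t in range(24):
    gap = demand - capacity
    pipeline.append(ADJUST * gap)     # decision uses today's gap only
    if len(pipeline) > DELAY:
        capacity += pipeline.pop(0)   # earlier orders arrive late
    print(f"t={t:2d}  capacity={capacity:6.1f}  gap={gap:+7.1f}")
```

Within a few periods capacity overshoots demand, then contracts below it: the oscillation is produced entirely by the delay and the decision rule, with no external shock at all.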
In organizations, a new incentive scheme may boost short-term results. Then, six months later, collaboration deteriorates. Decision-makers then reinforce the intervention because early indicators looked positive. What appears as success may be the first phase of a counteracting loop.
Surprise is not a failure of planning; it is a natural property of complex systems.
5. Predictability as a Mirage
Change initiatives often promise precise outcomes: quantified savings, fixed timelines, guaranteed adoption rates.
Such precision is often necessary to secure approval - and hence, budget.
But the more complex the system, the less meaningful precise prediction becomes. The 2008 financial crisis illustrated this dramatically. Risk models assumed linear relationships and stable correlations. When market dynamics shifted, feedback loops amplified instability beyond modeled expectations. As argued in The Black Swan by Nassim Nicholas Taleb (a great book I recommend), systems with hidden interdependencies are vulnerable to rare, high-impact events that linear extrapolation fails to anticipate.
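A back-of-the-envelope example shows the trap. The sketch below uses the standard two-asset portfolio-variance formula with invented numbers: the model assumes correlations stay at a calm 0.2, but under stress correlations converge (here, toward 0.9).

```python
# How an assumed-stable correlation understates portfolio risk.
# Two equally weighted assets; all numbers are illustrative.
import math

def portfolio_vol(sigma1: float, sigma2: float, rho: float,
                  w1: float = 0.5, w2: float = 0.5) -> float:
    """Standard two-asset portfolio volatility."""
    variance = ((w1 * sigma1) ** 2 + (w2 * sigma2) ** 2
                + 2 * w1 * w2 * rho * sigma1 * sigma2)
    return math.sqrt(variance)

modeled = portfolio_vol(0.15, 0.15, rho=0.2)   # correlation the model assumes
stressed = portfolio_vol(0.15, 0.15, rho=0.9)  # correlations converge in a crisis
print(f"modeled vol: {modeled:.1%}   stressed vol: {stressed:.1%}"
      f"   (+{stressed / modeled - 1:.0%})")
```

The diversification benefit the model priced in partly evaporates exactly when it is needed most. The numbers are invented, but the structural failure - correlations treated as constants when they are in fact state-dependent - is the one Taleb describes.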
Precision in planning can conceal fragility in structure. Systemic change requires shifting from predicting outcomes to designing for adaptability - or, even better, for robustness, as Olivier Hamant would argue.
6. From Linear Plans to Systemic Hypotheses
The alternative to linear planning is not chaos. It is disciplined inquiry. Instead of asking, “What is the solution?”, systemic change asks: What structures are producing this behavior? What safe-to-fail experiments could we run? How will we recognize the system’s response, including delayed effects?
This reframes change from implementation to experimentation. These are also the foundations of the Agile methodology - provided you allow enough time for delayed feedback to appear. Technology companies have internalized this logic through iterative deployment and feedback loops. Rather than committing to large irreversible plans, they run controlled experiments, observe system responses, and adapt.
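In pseudo-operational terms, the loop might look like the sketch below. Everything here is illustrative: the names, thresholds, and observation window are placeholders for organizational practices, not a reference to any real framework or API.

```python
# Sketch of a sense-and-respond change loop. Every function and threshold
# is a placeholder for an organizational practice, not a real API.
from dataclasses import dataclass

@dataclass
class Probe:
    hypothesis: str      # "If we change X, we expect to see Y"
    signal: str          # what we will actually measure
    window_weeks: int    # long enough for delayed feedback to appear

def run_probe(probe: Probe, observe) -> str:
    """Run one safe-to-fail experiment and decide what to do next."""
    early, late = observe(probe.signal, probe.window_weeks)
    if late >= early:    # the effect persisted past the delay
        return "amplify"
    if late > 0:         # early gain eroding: a counteracting loop?
        return "dampen"
    return "stop"        # the system pushed back; retire the probe

# Illustrative use with stubbed observations (early uplift, later decay):
probe = Probe("Pairing reduces rework", "rework_rate_delta", window_weeks=12)
print(run_probe(probe, observe=lambda signal, weeks: (0.8, 0.3)))  # -> dampen
```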
This is not less rigorous. It is more aligned with the nature of complex systems. But it requires sticking to a vision while letting go of part of the predictability.
Conclusion: Replacing Certainty with Learning
Linear thinking persists because it simplifies decision-making. It offers the comfort of control in uncertain environments, and it works for simple or complicated systems. But when applied to complex systems, it produces confidence without understanding.
Systems thinking does not eliminate uncertainty. It replaces false certainty with structured learning. And the managerial shift is profound: from predicting outcomes to running experiments, from controlling execution to sensing and responding, from certainty to learning.
In complex organizations, cause and effect do not disappear. They become nonlinear, delayed, and intertwined.
The question is no longer whether we plan. It is whether we plan as if the system were simple—or as if it were alive.
In the next article, I will explore the myth of “best practices” in complex systems.