When Data Lacks Context, Intelligence Fails

I’ve been thinking a lot about how often systems get the numbers right and the conclusions wrong.

For years, my electric utility has sent me emails comparing my energy usage to my neighbors.

On paper, the data looks precise. Charts. Benchmarks. A neat scorecard showing whether I’m “more efficient” or “less efficient” than the people around me.

But the conclusion is wrong.

My wife, my son, and I all live and work from home. We’ve invested heavily in automation, smart lighting, and efficient heating and cooling. We’re present in the house most of the day. We are not a family that leaves at 8 a.m. and returns at 6 p.m.

None of that context shows up in the model.

So every few months, I get a subtle message that feels like a scolding. You’re using more energy than your neighbors. You could do better.

The system doesn’t understand me. Over time, that lack of understanding has eroded my perception of the utility itself. It makes me wonder how many similar blind spots exist elsewhere in their data-driven decision making.

I’ve been seeing versions of this problem for more than 35 years.

As a researcher, and later as someone who spent decades inside large technology organizations, I’ve watched systems get better and better at measurement while staying surprisingly weak at understanding. We’ve gotten very good at collecting numbers, very good at optimizing around them, and far less thoughtful about what’s missing when the numbers are stripped of context.

That pattern is what ultimately led me to build what I now call a Qualitative Intelligence System.

Quantitative data is not wrong. It is incomplete on its own.

This same issue becomes far more consequential in higher-stakes environments.

Take rare disease and clinical research.

We collect enormous amounts of structured data. Lab results. Timelines. Enrollment numbers. Trial endpoints. However, the most important signals often live outside those fields.

Patient experience. Caregiver burden. Daily realities that shape adherence. The quiet reasons people hesitate, disengage, or never enroll at all.

When those qualitative factors are fragmented or ignored, trials struggle. Outcomes degrade. Promising therapies stall because the system never fully understood the people inside it.

What’s striking to me is that we’re now recreating the same mistake with AI itself.

Organizations are choosing a single model, a single system, a single “source of truth,” even though every model is trained by humans, shaped by assumptions, and bounded by its own blind spots. We talk about confidence and capability, but rarely about triangulation, iteration, or human validation.

In practice, that means we’re scaling incomplete perspectives faster than ever.

The work I care about now sits at that intersection.

Bringing quantitative and qualitative data together. Using multiple models in dialogue rather than isolation. Keeping humans in the loop, not as a formality, but as a safeguard. Preserving context and memory so decisions can be understood, revisited, and trusted over time.

Accuracy without understanding looks impressive, but it’s brittle. Optimization without context erodes trust. Systems that don’t reflect lived reality eventually stop being believed.

If technology is going to shape our decisions, our health, and our future, it has to do more than count. It has to understand.
