The Wrong Question Can Be More Expensive Than the Wrong Method

Last year, a programme team called me in a mild panic. They were preparing to commission an external evaluation. The budget was approved. The timeline was fixed. The donor expected it.

But something felt off.

The programme had stabilised only six months earlier after a rocky start. Partners were still adjusting delivery. Outcome data was thin. Behaviour change wasn’t observable yet.

Still, it was “mid-term.”

So the draft Terms of Reference (TOR) asked:

“To what extent has the programme achieved its intended outcomes?”

That’s a clean evaluation question.

It was also the wrong one.

What Happens When the Question Is Wrong

When you ask the wrong question, the method won’t save you.

You can hire the best evaluator. Use the most rigorous methodology. Triangulate data beautifully.

If the programme isn’t ready to answer “Did it work?”, you’ll get:

• Cautious conclusions

• Thin outcome findings

• Output-heavy reporting

• Recommendations that feel obvious

And then you sit in a meeting defending results that were never ready to show up.

The damage isn’t technical.

It’s reputational.

The Real Problem Wasn’t Evaluation

The problem wasn’t that they chose evaluation. It was that they hadn’t clarified what decision the review was meant to inform.

Was the real decision:

• Whether to scale?

• Whether to terminate?

• Whether to renew funding?

Or was it:

• How to stabilise delivery?

• Where implementation bottlenecks were?

• Which assumptions were holding, and which weren’t?

Those are very different questions.

An evaluation renders a judgement.

A learning review informs adaptation.

If you commission judgement when what you need is learning, you force the programme into a premature verdict.


Three Questions to Ask Before You Commission Anything

If you’re feeling pressure to “just evaluate,” pause and ask:

  1. What decision will this review directly inform? If you can’t name the decision, you’re commissioning activity.
  2. Is outcome-level change realistically observable yet? If not, an evaluation will default to counting activities and outputs. In other words, counting how many people you trained and not whether they applied what they learnt to improve their lives.
  3. Will this review strengthen the programme, or simply assess it? Timing matters more than method.

These aren’t academic distinctions.

They determine whether your review builds credibility or erodes it.

The Hidden Cost (and It Is More Than Money)

A poorly timed evaluation doesn’t just waste budget.

It:

• Signals instability to donors

• Locks in weak baseline narratives

• Creates pressure to over-claim

• Makes future reviews harder to defend

In sum, the wrong question creates downstream consequences.

And once it’s written into the TOR, everything follows from it.

Most evaluation mistakes happen before the TOR is drafted.

They happen in that quiet moment where someone says:

“It’s that time in the calendar.”

And no one asks:

“Is this the right question for where we are?”

The method is rarely the real risk.

The framing is.


If a Monitoring, Evaluation and Learning (MEL) decision is sitting on your desk right now and you’re not fully confident about the question you’re about to commission, that’s not a training gap.

That’s a judgement call.

And those are the moments where getting it right matters most.

Get documented expert advice now.

Applications for the current intake are open. Visit here to learn more.

