Learning from Evaluation Results – 5 Simple Steps

Too often, evaluations are treated as a ‘slapstick’ exercise. Results may point to critical flaws in the program or intervention, but project custodians may resist and offer a different interpretation. Despite the old saying that “we learn from our mistakes”, fear of failure and entrenched views can stand in the way of meaningful learning. But it is a two-way street, and both clients and service providers bear responsibility in the process.

When doing monitoring and evaluation work, we always need to keep in mind that once the evaluation has finished, the work for the client is only just beginning. We move from evaluation to implementation. Here are some useful hints that can help.

  1. Preparation is everything. In high school our sports coach used to say: “practice doesn’t make perfect; perfect practice makes perfect”. The same logic applies to evaluations. Invest time and effort up front with proper desk reviews and similar groundwork. On the client side, ensure the project timeline allows for proper preparation. Cutting corners at this stage will only add time at the end, and results may not be in line with expectations.
  2. Hold a developmental workshop. If appropriate, bring project custodians and key stakeholders together in a workshop at the outset of the project. Use the workshop to share and discuss critical issues, gain input and acceptance of the methodology, and cover practical and logistical matters. More importantly, gain ‘buy-in’ from stakeholders by getting to know them and understanding how they think and what their expectations are. With a bit of preparation, it is amazing how much can be achieved in a workshop like this.
  3. Identify critical performance areas. If preparation work is done correctly, you will be able to anticipate critical performance areas and can begin a discussion around them early in the process. Questions in the tools to be used can then be formulated to explore these areas specifically. By bringing potential issues into the open at an early stage, people become sensitized to them, and talking about them becomes less threatening.
  4. Look for balance in results. Even poorly executed programs and interventions will have some strong points. It is very important to balance critical improvement areas with areas of strength that can be built on in the future. Programs are similar to people: as humans, we develop by harnessing our strengths and improving or compensating for our weaknesses. If a critical weakness or issue comes up, it is good practice to share it as soon as possible. This allows people to comment on the result and contemplate its implications before reading the final report.
  5. Make realistic and actionable recommendations. This is easier said than done but is the hallmark of good evaluations. When ‘lessons learned’ are confused with recommendations (this happens!), it is often a result of inadequate preparation, and the results become a ‘blur’ between what goes to the heart of the evaluation objectives and what resulted from unforeseen events.

By following these simple steps, there should be few surprises by the time results are presented, and those involved will already be working in a spirit of collaboration. Even rushed projects can benefit: consider a few consultation interviews to complement the desk review and, instead of a workshop, a couple of conference calls. Check out this award-winning case study on Collaborating, Learning, and Adapting (CLA), based on a study done for USAID: http://www.rapid-asia.com/rapid-asia-on-the-go/cla-case-studies-complex-problems-needs-simple-solutions-by-rapid-asia/


Great, easy overview. I particularly like your second and last points, which I find all too often are not followed, even with the best intentions. It is perceived as easier not to involve too many people up front (developmental workshop), and evaluators often write up pie-in-the-sky recommendations... leaving implementers holding the bag.

Hi Daniel Lindgren, good overview. I like simple evaluations, which help us avoid failing to learn from our mistakes, as so often happens.
