New e-learning course: Real-time evaluation and adaptive management

My friends at TRAASS have launched a new e-learning course on Real-time evaluation and adaptive management:

“What exactly is an RTE/AM approach and how can it help in unstable or conflict affected situations? Do M&E practitioners need to ditch their standard approaches in jumping on this latest bandwagon? What can you do if there is no counterfactual or dataset? This modular course covers these challenges and more.”

Learn more about the course>>


Humanitarian advocacy – an introduction

For those interested in humanitarian work and advocacy, this presentation may be of interest: it explains what humanitarian advocacy is – its definition, levels, process and challenges.

Originally presented at CERAH as part of their Masters in Humanitarian Action. 


November is communication measurement month!

November is AMEC’s communication measurement month. There are some great events going on all over the world – check out the calendar of events >>



Example: mixed methods in evaluation

We often talk about using mixed methods in evaluation, but we rarely see examples that go beyond a combination of surveys and interviews. So I wanted to share an evaluation that used a genuinely varied set of methods: as part of a team from the Independent Evaluation Office, I helped carry out an evaluation of knowledge management at the Global Environment Facility.

The methods we used included:
- Semi-structured interviews
- Online surveys
- Comparative study of four organisations
- Meta-analysis of country-level evaluations
- Citation analysis – qualitative and quantitative
[Image: citation analysis visualisation]
The image shows the visualisation of the citation analysis (carried out by Matteo Borzoni) by theme – interesting stuff! I feel that the range of data collected gave us a very solid evidence base for the findings. The report is publicly available and can be viewed here (pdf)>>
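For readers curious what the quantitative side of a citation analysis can look like in practice, here is a minimal sketch. The records below are entirely hypothetical – the actual GEF documents, sources and themes are not reproduced here – but the counting logic is the general idea: tally how often each source is cited and which themes dominate.

```python
from collections import Counter

# Hypothetical citation records: (citing_document, cited_source, theme).
# In a real analysis these would come from coding the document set.
citations = [
    ("eval_report_1", "KM_strategy_2015", "knowledge sharing"),
    ("eval_report_1", "annual_report_2016", "learning"),
    ("eval_report_2", "KM_strategy_2015", "knowledge sharing"),
    ("eval_report_3", "lessons_db", "learning"),
]

# Quantitative side: how often is each source cited, and which themes dominate?
by_source = Counter(source for _, source, _ in citations)
by_theme = Counter(theme for _, _, theme in citations)

print(by_source.most_common())  # most-cited sources first
print(by_theme.most_common())   # citation volume per theme
```

The qualitative side – why a source is cited, and in what spirit – still needs human coding; the counts simply show where to look first.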


Practical lessons on influencing policy

A recent issue of the Policy and Politics journal has a special focus on influencing policy, mostly about the role of research and academia in this respect. There are many parallels with advocacy and policy influence work in general, and I found this particular lesson highly relevant:

“Avoid relying too much only on evidence and analyses, instead combine evidence with framing strategies and storytelling”

The introductory chapter of the issue is free to access and can be viewed here>>

 


Tips for young / emerging evaluators

The Evaluation for Development blog from Zenda Ofir has been collating tips for young / emerging evaluators – tips that even experienced evaluators will find interesting. Here are some highlights:
From Zenda herself:
Top Tip 1. Open your mind. Read
Top Tip 2. Be mindful and explicit about what frames and shapes your evaluative judgments.
Top Tip 3. Be open to what constitutes “credible evidence”.
Top Tip 4. Focus a good part of your evaluative activities on “understanding”.
Top Tip 5. Be or become a systems thinker who can also deal with some complexity concepts.
Read more about these tips>>

From Juha Uitto:
Top Tip 1. Think beyond individual interventions and their objectives.
Top Tip 2. Understand, deal with and assess choices and trade-offs made or that should have been made.
Top Tip 3. Methods should not drive evaluations.
Top Tip 4. Think about our interconnected world, and implore others to do the same.
Read more about these tips>>

From Benita Williams:
Top Tip 1. The cruel tyranny of deadlines.
Top Tip 2. Paralysis from juggling competing priorities.
Top Tip 3. Annoyance when you are the messenger who gets shot at.
Top Tip 4. Working with an evaluand that affects you emotionally.
Top Tip 5. Feeling rejected if you do not land an assignment.
Top Tip 6. Feeling demoralized when you work with people who do not understand evaluation.
Top Tip 7. Feeling discouraged because of wasted blood, sweat and tears.
Top Tip 8. Feeling lazy if you try to maintain work-life balance when other consultants seem to work 24/7.
Top Tip 9. Feeling overwhelmed by all of the skills and knowledge you should have.
Read more about these tips>>

And from Michael Quinn Patton, just one tip:
Top tip 1: Steep yourself in the classics.
Read more about this tip>>

 


Network mapping as an evaluation tool

I’ve posted previously about network mapping as an evaluation tool, and recently I had the opportunity to use it in an evaluation.

In an evaluation of the Shifting the Power project, we were interested in how local networks of NGOs had grown over the three years of the project. Fortunately, the project had mapped NGO networks at its start in 2015, so we repeated the mapping in early 2018; here you can see the results from Bangladesh comparing 2015 to 2018 – interesting data!
[Image: NGO network maps from Bangladesh, 2015 and 2018]
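For anyone wanting to quantify this kind of before/after comparison rather than only eyeballing the maps, here is a minimal sketch. The NGOs and ties below are made up for illustration – they are not the Bangladesh data – but the approach is standard: represent each snapshot as a set of ties and compare density and newly formed connections.

```python
from itertools import combinations

# Hypothetical NGO networks as sets of undirected ties;
# the real data from the evaluation is not reproduced here.
ngos = {"A", "B", "C", "D", "E"}
ties_2015 = {frozenset(p) for p in [("A", "B"), ("B", "C")]}
ties_2018 = {frozenset(p) for p in [("A", "B"), ("B", "C"), ("C", "D"),
                                    ("D", "E"), ("A", "C")]}

def density(ties, nodes):
    """Share of all possible pairs of NGOs that are actually connected."""
    possible = len(list(combinations(sorted(nodes), 2)))
    return len(ties) / possible

new_ties = ties_2018 - ties_2015
print(f"density 2015: {density(ties_2015, ngos):.2f}")
print(f"density 2018: {density(ties_2018, ngos):.2f}")
print(f"new ties since 2015: {len(new_ties)}")
```

With consistent data collection at baseline and endline, these simple measures make it possible to say not just that the network "looks denser" but by how much.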

 

You can view the full evaluation report here (pdf)>>

 

 


Infographics to present evaluation findings

I’ve posted previously about using infographics to summarise evaluation findings; here is a recent example of using an infographic to present research results (click on the image to see the larger, complete version). Admittedly we packed a lot into this infographic – but it is still a good summary!
[Infographic: summary of surge research findings]


Using Sankey diagrams for data presentation

I’ve always found the Sankey diagram an illustrative way to show transfers from inputs to outputs, but had never found a use for one in my own work – until now…

The following Sankey diagram shows research reports on crises on the left and the number of challenges identified (for humanitarian surge response) per crisis. The right shows the categories used to group the challenges (“Resource gaps, Policies and systems”, etc). This provides a visual overview of the challenges identified and their volume by crisis and type of challenge.

[Image: Sankey diagram of challenges by crisis and category]

I produced this diagram using a free online tool. If you are interested in the research reports, they can be found here.
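Whatever tool you use, the preparatory step is the same: reduce the coded data to one weighted flow per (source, target) pair. Here is a minimal sketch with hypothetical records – the crises and categories below are placeholders, not the actual coding from the surge research:

```python
from collections import Counter

# Hypothetical challenge records: (crisis, challenge_category).
# The real report-by-report coding is not reproduced here.
challenges = [
    ("Crisis A", "Resource gaps"),
    ("Crisis A", "Policies and systems"),
    ("Crisis B", "Resource gaps"),
    ("Crisis B", "Resource gaps"),
    ("Crisis C", "Policies and systems"),
]

# A Sankey diagram needs one flow per (source, target) pair,
# weighted by how many challenges fall into that flow.
flows = Counter(challenges)
for (crisis, category), count in sorted(flows.items()):
    print(f"{crisis} -> {category}: {count}")
```

The resulting (source, target, value) triples can be pasted into most Sankey tools directly; the diagram then does the visual work of showing volume by crisis and by type of challenge.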

 

 


The global rise and future challenges for evaluation

A very interesting article in the latest edition of the European Evaluation Society newsletter: “Evaluation between evidence-based policy and ‘fake news’: paths to the future” (p. 3).

The article presents the reasons behind the global rise of evaluation, along with four future challenges, summarised here:

Why the global rise of evaluation:
1) In many countries, evaluation is now a fixed element of policy-shaping and a management control tool
2) The number of national evaluation societies has grown
3) The market for evaluation continues to grow
4) The dissemination of evaluation findings has surged
5) Training activities have increased

Future challenges for evaluation:
1) Increasing importance of global issues
2) The challenge of populist movements and “fake news” to evidence-based policy
3) Need for reflexive systems-thinking approaches
4) Increasing demand for participatory evaluation

View the full article (p.3)>>
