November is communication evaluation measurement month! AMEC – the International Association for Measurement and Evaluation of Communication and its members and partners are hosting many events – online and in-person globally – check out the calendar >>
I’ve put together this guide for anyone who wants to learn how communication can more effectively support evaluation: evaluation consultants, communication consultants, evaluation commissioners and programme/project staff participating in evaluations.
Join a one-day workshop in Bern, Switzerland on the Politics of Evaluation, taught by Dr. Marlène Läubli Loud.
The course will look at why and how evaluation is in itself a political activity and, consequently, how the «political» interests of the various partners involved can play an important part in influencing the evaluation process. We will examine aspects of political influence and practise managing them to avoid conflicts of interest and minimise risks.
Jim Coe and Rhonda Schlangen have published a very interesting paper on advocacy evaluation.
They highlight six factors that they believe should change in the monitoring and evaluation of advocacy:
1. Better factor in uncertainty.
2. Plan for unpredictability.
3. Redefine contribution as combinational and dispositional.
4. Parse outcomes and their significance.
5. Break down barriers to engaging advocates in monitoring and evaluation.
6. Think differently about how we evaluate more transformational advocacy.
Here is an interesting tool to help with context analysis: the ‘Context Matters’ framework, designed to support evidence-informed policy making. The tool is interactive, and you can view the different elements from various perspectives. Although designed to support the use of knowledge in policy-making, it could also be of interest to researchers and evaluators as an analytical tool for understanding contexts.
View the interactive framework here>>
Read more about the framework here>>
Thanks to Better Evaluation for introducing this new resource to me.
I’ve written previously about using infographics to summarise evaluation findings. Here is another recent example, where my evaluation team used an infographic to present the findings of an evaluation. It’s only a partial view – you can see the complete infographic on page 5 of this report (pdf).
My friends at TRAASS have launched a new e-learning course on real-time evaluation and adaptive management:
“What exactly is an RTE/AM approach and how can it help in unstable or conflict affected situations? Do M&E practitioners need to ditch their standard approaches in jumping on this latest bandwagon? What can you do if there is no counterfactual or dataset? This modular course covers these challenges and more.”
For those interested in humanitarian work and advocacy, this presentation may be of interest: in it, I explain what humanitarian advocacy is – its definition, levels, process and challenges.