It was once enough to gauge the success of a research project — even those aiming to influence policy — by counting the number of articles published in peer-reviewed journals, and possibly citations or downloads.
But the scope and scale of policy research projects have broadened, moving away from single research studies toward multicomponent, multipartner and multisector endeavors. These projects often have numerous aims beyond producing high-quality publications: having impact beyond academia, engaging different external stakeholders and building partners’ capacity, to name just a few.
Evaluating outputs is no longer enough to assess a project’s impact, since it only captures a small proportion of what these policy research projects aim to achieve.
Based on years of experience working with nongovernmental organizations, policymakers and other research institutes, and building on previous RAPID work, the Methods Lab has put together new guidance on how to design M&E frameworks for complicated and complex projects.
Here we share four key things to bear in mind as you embark on designing an M&E framework for your policy research project:
1. Go beyond outputs and uptake.
Outputs — and to some extent uptake — are usually well captured in research M&E frameworks. But thinking through diverse ways that uptake, outcomes and “impact beyond academia” can take place and how context can be monitored is often more challenging.
The framework RAPID has been using for almost a decade builds on the five-level “Ingie Framework,” with the addition of a sixth level: context. It tracks changes more closely and pays attention to the often neglected elements of strategy and management.
Though often overlooked, these two levels are hugely important and influential, especially in big, multiyear, multimillion-dollar research programs. Strategy and management are usually assessed intuitively by managers, but it is worth systematically and regularly (e.g. annually) reflecting on whether the project’s direction is still valid and whether its decision-making and governance systems are functioning and fit for purpose.
2. Start with questions and prioritize.
As with any good evaluation plan, the design of an M&E framework should start with purposes and questions, not with methods, indicators or logframes. Beginning with indicators can lock you into what will be measured (indicators) before thinking through what you want to know (M&E questions) and why (purposes).
Questions have the power to direct “sense-making” and can support program design. It is also better to prioritize one or two main M&E questions, then support these with secondary questions that don’t all need to be answered every year or at each assessment point.
3. There’s no one-size-fits-all solution (sorry).
Though many of the policy research projects and programs we work with share similar characteristics, their nature and the contexts in which they operate differ significantly. Even within RAPID we all use the M&E framework in diverse ways, highlighting different elements and steps.
Our guidance note doesn’t attempt to provide comprehensive instruction on every aspect of developing an M&E system, such as how to collect, manage, analyze and use data. Instead, it should be viewed not as set in stone but as flexible guidance to start and structure thinking.
4. Be realistic with M&E plans and activities.
Ultimately, many choices about the scope, intensity and timing of M&E activities will depend on the resources available: personnel, time and funds, as well as the capacity, experience and skills of the people dedicated to, and involved in, the M&E work.
It is better to be realistic and practical about what can be done and how much time people can truly spend on M&E activities, rather than trying to do everything possible but in a hasty or unsystematic manner.
Tiina Pasanen is a research officer in the research and policy in development program at the Overseas Development Institute. Tiina is an expert on monitoring and evaluation practices and methods, and impact evaluation approaches.