    • Opinion
    • Monitoring and evaluation

    Do your M&E systems stack up to disco-era standards?

    When talking monitoring and evaluation, it's hard to ignore the fact that we've made little progress over the past 40 years. International Solutions Group's Michael Klein looks at how to better integrate the systems and processes into our organizations.

    By Michael Klein // 04 October 2016
    Akabondo Mainais (left), a monitoring and evaluation officer for Climate Investment Fund projects in western Zambia talks to members of the Nalunau family. Photo by: CIF / CC BY-NC-ND

    When talking monitoring and evaluation, it’s hard to ignore the fact that we’ve made little progress over the past 40 years. Indeed, we’re still trying to catch up with the original pioneers in the field.

    Imagine where we could be today if our industry had adopted the following sage advice when it was first issued, almost half a century ago: “Build an information system into [your] project so that the necessary data [can] be collected in the course of regular project operations. Such a system can provide timely, relevant information that can be used by decision-makers throughout the course of the project.”

    This quote is pulled from a 1979 report authored for the U.S. Agency for International Development titled “The Logical Framework.” It draws on research — begun in the 1960s by Lawrence Posner and Leon Rosenberg — on the use of logical frameworks for program management, a framework widely employed in development today. However, that’s about the extent to which we’ve adopted the recommendations outlined in the authors’ research, which include taking a holistic approach to collecting and using data.

    The 1979 report includes observations and recommendations that we still struggle with today. As an example, below are key takeaways regarding problems with USAID’s project monitoring and evaluation system as of 1969 (taken verbatim from “The Logical Framework”):

    1. Planning.

     Objectives were multiple and not clearly related to project activities. There was no clear picture of what the project would look like if it were successful. Thus, evaluators could not compare — in an objective manner — what was planned with what actually happened.

    2. Management responsibility was unclear.

    Project managers were committed to the fact that projects must be justified in terms of their ultimate benefits (“impact”) yet were reluctant to be considered responsible for impact; there were too many important factors outside their control. They found it difficult to articulate what they should be responsible for, and ended up not accepting any responsibility for results.

    3. Evaluation was an adversary process.

    With the absence of clear targets and frequent disagreements (even among project team members) as to just what the project was about, evaluators ended up using their own judgment as to what they thought were “good things” and “bad things.” The subsequent evaluation results would then frequently become a basis for further argument about what was good or bad, rather than resulting in constructive actions for project improvement.

    Does any of this sound eerily familiar?

    Sitting at my desk in 2016, I am awestruck by how pressing these problems continue to be — and how relevant a paper published in 1979 still appears. How do we go about fulfilling this vision for a more holistic approach to M&E?

    The good news is that we have made progress, albeit in a piecemeal fashion. There is a bevy of tools available for organizations to address the issues highlighted by Posner and Rosenberg — not just tools to help logically map a particular intervention, but also to put in place the type of “information system” that the authors first recommended during the Nixon administration. Better yet, these tools no longer require the use of punch cards for data entry and are increasingly accessible to the non-techies among us.
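    To make the idea concrete, the “information system” Posner and Rosenberg describe can be sketched as a simple data structure: a logframe level with indicators that are updated during regular project operations, so decision-makers can check progress at any point rather than waiting for a final evaluation. This is a hypothetical minimal sketch, not a representation of any specific M&E tool; the class and field names are illustrative assumptions.

    ```python
    # Hypothetical minimal sketch of a logical-framework level with
    # indicators -- not any specific M&E product. Routine data collection
    # updates `actual`; progress can be queried at any time.
    from dataclasses import dataclass, field

    @dataclass
    class Indicator:
        name: str
        target: float
        actual: float = 0.0

        @property
        def progress(self) -> float:
            # Fraction of target achieved so far (0.0 if no target set).
            return self.actual / self.target if self.target else 0.0

    @dataclass
    class LogframeLevel:
        objective: str  # e.g. a goal, purpose, output, or activity statement
        indicators: list[Indicator] = field(default_factory=list)

        def report(self) -> str:
            # A timely snapshot for decision-makers mid-project.
            lines = [self.objective]
            for ind in self.indicators:
                lines.append(f"  {ind.name}: {ind.actual}/{ind.target} ({ind.progress:.0%})")
            return "\n".join(lines)

    # Illustrative data only: update indicators as monitoring data comes in.
    output = LogframeLevel(
        "Output: farmers trained in climate-smart practices",
        [Indicator("farmers trained", target=500, actual=320)],
    )
    print(output.report())
    ```

    The point of the sketch is the design choice the 1979 report argues for: data collection is built into the project structure itself, so reporting is a byproduct of operations rather than a separate, adversarial exercise.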

    The bad news is that development organizations still face many of the organizational challenges the authors discussed. So while we’re making great strides toward better M&E and the development of systems to support our work, there’s still much work to be done in integrating those systems and processes into our organizations.

    How do we bridge the gap?

    The first step is instilling an appreciation for data and results-based management into our organizations. Thankfully, I believe we’re well on our way in this regard (albeit a few decades late), and the movement toward better data and information systems is inevitable. Living in the information age has forced our hand; there’s no turning back.

    Next, we need to ensure the data we’re collecting is used and useful. This requires acknowledging that we’re past the point of talking about technology as an optional extra for organizations. This holds true for the full spectrum of development actors — from large U.N. agencies to small NGOs. Again, stealing from Posner and Rosenberg, we need to be investing in information systems “[that] provide timely, relevant information that can be used by decision-makers throughout the course of [our] projects.”

    There’s no shortage of resources to assist in fulfilling this long-unachieved vision. Tools include online classes (such as those offered by TechChange), conferences (MERL Tech and ICT4D) and guides and discussion papers on the issue (NetHope’s Back-Office IT Guide, Bond’s Investing in MEL Guide, and Rockefeller’s Monitoring and Evaluation in a Tech-Enabled World).

    How you choose to engage is up to you. That said, since Posner and Rosenberg’s recommendations are approaching their ruby anniversary, I think we can all agree it’s time to finally act on them and get serious about past failings and our future success.


    • Institutional Development
    • Project Management
    • Worldwide
    The views in this opinion piece do not necessarily reflect Devex's editorial views.

    About the author

    • Michael Klein

      Michael Klein is a director of International Solutions Group, a company that works with governments, U.N. agencies, international organizations, NGOs and other companies to improve the implementation of humanitarian aid and development programs. Klein is based in Washington, D.C., and is a member of ALNAP, Washington Evaluators and the American Evaluation Association.
