Facing global development's fear of failure

The development community should undertake research to detect not just whether a particular intervention succeeded or failed, but why and how it did. Photo Credit: Jessica Scranton/FHI 360

One of the most promising trends in global development is the rising priority of understanding and investing in “what works.”

As the funds available for international assistance have flatlined in post-recession years, everyone from donors to practitioners has become increasingly committed to making decisions that are informed by evidence. The attention being paid to evidence-informed development is encouraging — yet the best-kept secret within the growing movement about what works is the importance of learning not just from our successes, but also from our failures.

Anyone whose information comes from typical annual reports, scientific conferences and even social media content can be forgiven for forming the impression that our development efforts are nearly perfect. Successes are proudly packaged in glossy formats and heavily disseminated, whereas any objectives not achieved are relegated to the obligatory, and typically short, lessons learned section. This practice obscures an important reality: development efforts do in fact fail.

Venture capitalists and corporate investors understand that less than 20 percent of new businesses will succeed, and they invest in innovations and new ideas with a transparent acknowledgment of the high risk of failure. So why does the global development enterprise treat failure so differently? Three reasons stand out.

1. Efforts to address persistent and complex development issues — such as gender inequality or extreme poverty — are implemented in particularly challenging environments. Programs often operate in settings with high levels of corruption, conflict and poor infrastructure. For example, it is difficult to ensure that a specific innovation in health service delivery will work if the overall health system is dysfunctional. Rather than openly acknowledge the real risks this challenging context presents, though, there is a tendency to ignore or underestimate the magnitude of these constraints on our potential impact.

2. Development work is funded primarily through charitable giving and public monies. This funding environment brings an amplified pressure to deliver results and clear disincentives for open acknowledgment of ineffective strategies. Donors and taxpayers want to see the return on their investment. Powerful stories of lives being changed for the better will ensure that the generosity of funders continues; failing to achieve clear impact will endanger it.

3. Unlike many professionals in technology or engineering, people who work in development seem to have difficulty untangling the difference between when something fails and when we fail. For example, software engineers are comfortable testing a computer program hundreds of times before finding a version that works, without interpreting the process as a reflection on their abilities. By contrast, many of us feel inadequate and personally responsible when a development initiative fails to produce the desired results.

The tide may be turning. In the past two years, many high-profile experts have issued pleas to the international community to improve our learning from failures. World Bank President Jim Kim named it his “Big Idea for 2013.” Speaking at the 2012 Open Up! conference, Justine Greening, U.K. secretary of state for international development, said: “We need to be really honest with ourselves and others about why it [the development project] didn’t work. And we need to share those results, not hide them away.”

So what can the international development community do?

Stop taking it personally and share your experience. Contribute to Admitting Failure, a website where people post stories about development efforts not going as planned and why the work may have gone awry. Participate in “failure fests,” convened to encourage individuals to share examples of failure in a friendly, supportive and often humorous environment.

Resist the lessons learned syndrome and give your information the space it deserves. Follow the bold and inspiring example of Engineers Without Borders Canada, which publishes a “Failure Report” each year to accompany its traditional annual report. Even if your organization has no similar report, at the project level you can prepare full reports about ineffective innovations and disseminate them as thoroughly as you would have if the innovation had worked. Importantly, despite the oft-cited publication bias toward positive research findings, journals are increasingly willing to publish “negative” results if they advance our collective knowledge on a topic. One example is this paper on expanding contraceptive options in South Africa.

Undertake research to detect not just whether a particular intervention succeeded or failed, but why and how it did. Implementation science and impact evaluations can be powerful tools in this regard.

The stakes are high. If we continue skewing our knowledge sharing toward only successful endeavors, we risk repeating mistakes from the past and in so doing, wasting enormous sums of money. We owe it to the people we serve to take a page from Google’s playbook and “fail fast and learn fast.”


The views in this opinion piece do not necessarily reflect Devex's editorial views.

About the author

  • Tricia Petruney

    Tricia Petruney is a technical advisor for research utilization at FHI 360, where she supports governments, funders and implementers to improve evidence-informed decision making within global development policy and program design.