Large-scale investment by the United Kingdom Department for International Development in research and evaluation — over £1.2 billion ($2 billion) between 2011 and 2015 — produces a wealth of information.
But according to an independent review by the U.K. aid watchdog, the agency is not using this data sufficiently to learn and adapt.
The Independent Commission for Aid Impact’s report “How DfID learns,” released last week, looked at how far the donor “gains and uses knowledge to influence its policy, strategy, plans and actions.” It found some good learning practices — but these are not consistently applied.
“When they do it well, they do it really well,” said ICAI’s Diana Good, who led the report. “They just don’t do it well often enough.”
That gave DfID a low overall score for learning — “amber-red” in ICAI’s ranking system, meaning significant room for improvement — a blow to an organization that prides itself on effectiveness and is often considered among the more responsive donors.
More pressure, more fear of failure?
Back in 2012, U.K. Secretary of State for International Development Justine Greening first announced her intention to champion evidence-driven development: “I want to make sure that we invest in what works. Where we don't know, I want to find out. It will make sure we are clearer about where we should focus our resources.”
Greening subsequently set about restructuring her department’s research and evidence division to drive that value for money agenda. Alongside making results accessible and increasing professional skills, this division commissions research both to develop new technologies and products, and to better understand the context in which DfID operates and how interventions could work better. Findings are publicly accessible on a platform that now holds some 40,000 records.
However, more information may not mean more learning: in an internal survey by DfID last year, just over half of staff members thought lesson learning in their area was "average," while 20 percent of respondents thought DfID did this "badly or very badly."
Being able to use information, build on experience, and turn it into concrete action is vital right now for the U.K. as a donor, given its commitment to spend 30 percent of aid in fragile and conflict-affected states, where circumstances change rapidly, and given DfID’s steadily increasing budget and staff numbers.
Earlier this month, the U.K. confirmed it had met its aid volume target, spending 0.72 percent of gross national income in 2013 — reviving criticism in some quarters over the decision to ring-fence the aid budget amid cuts in other government departments. That spending power — DfID plans to disburse £8.9 billion in 2013-14 — means more public scrutiny and ratchets up the pressure to prove money is being well spent.
Yet precisely the factors that make learning all the more important may also be preventing learning from taking place.
DfID staff told the British aid watchdog that "organizational learning is not always seen as a priority objective … given the pressure to deliver results and spend the budget." The report sounded the alarm on a “culture where staff have often felt afraid to discuss failure,” and one head of department even told the authors that the agency "does not fail."
Most worryingly, the review found that staff sometimes use selective evidence “to justify spending or support political priorities.” Though not a common practice, according to ICAI, it does happen “with sufficient regularity to be a concern. It is clearly unacceptable.”
Some cases of openness stand out, however. ICAI heard of a video produced by DfID staff in the Democratic Republic of the Congo that brought to light the failings of a water supply project. Despite some initial reservations, the video had been "catalytic in stimulating discussion about how DfID should be more honest about failure.”
But for every positive example, the watchdog found others of lessons ignored or ineffective approaches continuing. Best practices need to become standard, and not reliant on individual initiatives, said the watchdog.
“DfID could be a world leader in [learning],” said Good. “But we haven’t got there yet.”
It is not the first time that DfID has been urged to improve in this area. An ICAI review last year of the donor’s £350 million spend on agricultural research found it was not sufficiently linking research with its country programs, nor was it doing enough to get results into the hands of farmers — the intended beneficiaries.
Since then, the agency has created “learning partnerships” within its program partnership arrangements with civil society partners, which Good considers an example of how DfID can maintain oversight and get better results, even when far-removed from delivery on the ground. But it needs to engage even more during implementation, she said, and overcome an “imbalance” that sees more energy going into the beginning and end phases of projects.
For an organization whose programs are largely delivered by others, that recommendation is “really important,” said Harry Jones, research fellow at ODI and co-author of its 2010 report.
DfID has a tendency to be "a bit too hands-off," Jones added.
Employees, though, were found to be “highly motivated,” and individual learning is encouraged.
Indeed, according to a 2013 staff survey, DfID was the highest-performing U.K. civil service department for personal learning and development. But staff members need more time to both acquire and share experience, concluded ICAI. And with most research commissioned from third-party organizations, DfID teams are “consumers of knowledge products” rather than producers of knowledge. The risk, the aid watchdog claims, is that they lack the practical experience to use this knowledge to make sound decisions.
“Individuals learn best from real experience or talking to others who have real experience,” said Good. “People we spoke to refer back to their time in the field — it made an enormous difference, not only … at the time but also to how they have designed programs since.”
An entire HIV and AIDS intervention was redesigned, for example, not based on evaluation reports, but after an adviser had spent time in a village talking to the young people it was meant to help.
Such responsiveness, however, is not yet the rule.
“Staff believe that DfID remains too much in a mode of trying to manage or change others, rather than listen to and support them,” says the report — while in 5 of the 12 case studies examined, the agency was not learning from those it was working to help.
Considering the evidence
DfID spending on research has increased almost every year since 2005, and ICAI is concerned “that DfID may not be targeting its research efforts sufficiently on its key priorities” and that valuable research gets lost among the volume produced.
As of July 2013, 425 evaluations were either planned or underway — however, said Good, the solution is not necessarily to reduce the number of reports, but make sure they can be used. “Don’t spend all that [money] unless you are going to learn from it,” she argued.
This echoes a staff survey undertaken by DfID last year, which found that “the biggest barriers to using evidence are easily finding it and having enough time to consider it.”
Reports should be more clearly synthesized, says ICAI, and internal information systems must be more user-friendly — so that staff do not resort to searching on Google first, as is often the case.
“We are not saying DfID should create new bureaucratic processes,” said Good.
On the contrary, they should be “stripping out the enormous amount of indigestible information in its systems.”
Indeed, the complexity of the development industry makes many of our usual tools and systems inappropriate, believes Jones — including how we share knowledge for decision-making. This could be done in a more direct, simpler way, he said, such as making professional networking opportunities more systematic, and facilitating discussion and conversations — a “low-cost but potentially high-impact” way to improve learning.
Learning from one another will only become more urgent if current DfID hiring trends continue: the number of full-time equivalent staff increased by nearly 10 percent in the year to September 2013, with half based abroad.
Staff turnover is “both a constant gain and loss of knowledge,” says ICAI, with fragile states particularly vulnerable: in Afghanistan, for example, half of DfID staff are replaced every year, while one project in the DRC reportedly had 5 managers in 5 years.
A certain amount of failure is to be expected, says ICAI, and DfID should be “both taking risks and innovating.”
To some extent, this is already happening, with evidence of a more open attitude since late 2012. The ICAI investigation was based in part on DfID’s own studies on the use of evidence. A 22-person "Evidence into Action" team has been set up to make knowledge more accessible, for instance by organizing so-called "fail fairs," where staff can come together to identify what can be improved.
These internal discussions mirror similar, public events, such as the Fail Festivals that embrace an uncharacteristically honest and informal discussion on what goes wrong in development work.
But while such events have attracted corporations, nonprofits, and United Nations agencies including the International Fund for Agricultural Development, “celebrating” failure may be a step too far for governments.
With U.K. parties already starting to maneuver ahead of general elections in May next year, taking on more risk and talking to voters about what has not worked may not be a viable option for DfID right now. “The idea of discussing failure would be a dangerous one for any government ministry,” said Jones.
The amber-red score may be overcritical, said Jones — after all, DfID is probably doing better than most donors.
It was “really impressive,” he told Devex, that the report had even been published. The creation of ICAI itself in 2011 reflects DfID’s openness to criticism. Indeed, alongside the World Bank — which answers to its Independent Evaluation Group — it is among the few donors brave enough “to air their dirty laundry,” said Jones.
But this does not exempt DfID from doing better, believes ICAI. Its recommendations are not over-demanding, said Good — not least because DfID is well capable of implementing them. “We’ve already seen they can do it,” she said.
ICAI’s recommendations include:
1. Focus on consistent and continuous organizational learning based on DfID’s own experience and that of its partners and contractors, in particular during implementation.
2. Hold DfID managers accountable for reviewing what works and where impact is actually being achieved.
3. Synthesize all information commissioned and collected so that relevant lessons are accessible and readily useable. Know-how should be valued as much as knowledge.
4. Give staff more time to acquire experience in the field and share lessons about what works and does not work on the ground.
5. Continue to encourage a culture of communication about what does and does not work. Staff should be encouraged always to base their decisions on evidence.
Responding to the review, a spokesperson from the agency said: “DfID delivers targeted results, maintains value for money and is recognized as one of the most transparent donors in the world. We continually look at our processes in light of external evaluations like ICAI. We have already introduced strong measures such as greater ministerial oversight of spend and tough new rules for suppliers to drive better value for money for taxpayers.”
A detailed response to the ICAI report is expected by the end of April.