
Monitoring and evaluation allows donor agencies to measure the success of projects or know why they failed. For the Millennium Challenge Corp., this twin process is not one to be taken lightly.
As senior director for monitoring, evaluation and economics, Shiranthi "Shiro" Gnanaselvam oversees the agency's rigorous M&E policy. Her team works closely with in-country M&E groups from planning through implementation.
Gnanaselvam has more than 14 years of experience in development, primarily in sub-Saharan Africa. She previously served as public sector management specialist at the World Bank and as a private-sector management consultant.
Gnanaselvam spoke with Devex on possible changes to MCC's monitoring and evaluation strategies as well as the distinct features of the agency's M&E policy.
Can you give us some examples of changes to MCC monitoring and evaluation strategies that are currently under consideration?
One of the things we're in the process of wrapping up, and we've been working on it for the last year, is to develop common indicators in some of the key sectors where we have significant investments. So if you go to our Web site, you will see that we have indicators for agriculture, roads, irrigation and land. And we've been adding some additional sectors.
We're very country-focused, so the M&E model is also very focused on that country's particular compact and tracking the results towards the goal of that country's compact. However, as a federal agency, we are also called upon sometimes to report progress by sector, particularly in areas where we have a lot of investments. For instance, 26 percent of our investment is in transport. Now roads isn't all of transport, but it's a big chunk of investments in transportation. And so, therefore, for these sectors where we have significant investments, we have come up with common indicators that are then kind of incorporated into each country's monitoring and evaluation plan, and they report on them repeatedly.
Can you explain this further?
So, one of the things that we track in every program where there is a kind of a farmer training commercialization component is the number of farmers that have been trained. If there is a business development component, then we track the number of enterprises that have been trained. So those are the common indicators that we make sure are in every single M&E plan, so that we can add the number [of] farmers who were trained in Honduras with the number of farmers trained in Ghana, with the number of farmers trained in Nicaragua. So that's a common indicator initiative. And it's something that we had to go back and do because we were still so singularly focused on country-focused [programs] that we didn't think about the commonality aspect; so now we're including that as well in our M&E framework.
Another example of something that we're working on right now is to flesh out the set of evaluations and assessments that will be developed when a compact closes out. We're getting close, as our earliest compacts are preparing to close out, and we're starting to define very clearly what evaluations and assessments will be conducted at the close-out of a compact. But in addition to that, there might be other assessments that we want the MCA [Millennium Challenge Account], the MCC or others to conduct for us as an independent assessment. So we're still working out the details of that.
Do you expect MCC priorities and evaluation strategies to change significantly under the new leadership?
My feeling is that there won't be any significant, fundamental change to our M&E approach. Might there be changes on the margins? Possibly.
You know, our early compacts are now, in a year or so, going to [be] drawing [to a] close. So now we're thinking about, how are we going to close them out? How are we going to document them from a process point of view, from a quality of entry point of view, from a sustainability point of view? How do we document it? How do we share it? All of those things. So I think as our agency matures, certainly there will be enhancements, modifications, changes to our model. But the core pillars of it will remain the same.
What do you think you've learned so far as an agency? Ken Hackett, a member of the MCC board of directors, said at the Sept. 8 briefing held at the agency's headquarters that MCC is a learning organization. So what is something that has been learned so far at least on the evaluation side? I'm sure there is interim monitoring going on to get a sense of what you're learning so far.
That's true. We - certainly through our monitoring system - we are able to track progress. But we always like to wait for the impact evaluation…
I will give you an example of an impact evaluation that was completed. It was not for a compact project; it was for a threshold project in Burkina Faso. And I think that was also referenced at the [Sept. 8] outreach event. That had some interesting findings; it was a pretty rigorous methodology, so the findings were very robust. It showed that the package of interventions that was supported in this threshold activity really worked and really did achieve the outcomes it had intended to achieve, and in fact exceeded them. Now that program has been kind of expanded and is now [part of the] compact that we now have with Burkina Faso.
It is interesting to see that your M&E policy allows in-country M&E teams to modify targets, indicators and baselines within the same compact, but they can't write a new compact. Can you tell me more about this?
Yes, it is allowed to make changes to an M&E plan, and there are often very legitimate reasons for making changes. For example, a change might be made because when the first M&E plan was put in place, the baseline was formed from survey data that was 5 years old. But in the year leading up to [a program], we may have paid for a new survey to establish a new baseline. That's a very legitimate reason for making a modification to a baseline.
At the time of the first M&E plan, we might have known what the end-of-compact target was that we were aiming for, based on the economic analysis we had done for the project, but because the implementer was not yet contracted, we may not have known what the interim-year targets were for a given activity … or there may have been targets which [were] based on hiring the contractor who was going to implement the work and their work plan. We realized that we had to change the pattern of those targets.
Then there are other situations where compacts have to be restructured because of currency fluctuations or whatever it is, and you thought that you could construct, I don't know, 1,000 kilometers of road with the funds allotted for the roads project. But when the bids come in, you realize that you can only construct 700 kilometers. That is a restructuring. So [that's] what we've tried to do.
So while, yes, you can make changes, we have very clearly spelled out in our M&E policy for what reason you can make a change, under what circumstances MCC needs to approve, and how high we ratchet up the approval.
For instance, if there is an end-of-compact target change and we re-run the economic analysis and find that the investment is no longer justifiable by the changed target, that then requires a ratcheting-up of the approval. And those changes really only take place where [there] is a significant restructuring. So those end-of-compact target changes would have to take place [or be approved] at very high places in MCC.
Have you ever encountered a situation where changes were made to the compact by MCA without gaining the approval of MCC?
The post-section [includes] our new M&E indicators and targets. This is something we only recently added to our M&E policy.
What we might have seen [is] a difference in format and in how the changes were documented, and a lack of clarity in the approval process. And we wanted to make that really very clear and transparent. So we saw the need, we wrote a new policy, and now we're implementing it. So MCAs understand under what circumstances changes can be made, what is allowed, what standards they need to meet, and how the changes are approved as well as documented.
What strategies have you developed in order to ensure a certain amount of MCC oversight while allowing your partner countries to "own" MCC-funded development initiatives?
On the M&E side, a big factor is getting that balance right… There are two main factors: one is the complexity of the program; the second is trying to find people with monitoring and evaluation skills.
Monitoring and evaluation isn't a typically available skill set - it's an unusual one. In some countries, it's harder to find. In other countries, there are actually very skilled people working in those areas. We're able to see a lot more impact on the M&E side if there is high capacity for M&E in the MCA. We're able to step back less if there's less capacity.