
TOR141 ― The Power Of The Impact Evaluation Revolution With David Evans Of The World Bank

David Evans



I invite you to pause just for a second and take a moment to think about the last time you changed your mind about something. Specifically, I'd like for you to identify something that was either very important to you or your worldview, or something that you had taken for granted, that today you have either the complete opposite or at least a very different perspective on. Got it? Now ask yourself, what was it that made you change your mind? And, again specifically, what evidence did you unearth, or were presented with, that made the case for changing your mind? For most of us, a profound change of mind doesn't happen very often, but when it does, the effects of such a change alter lives, communities, and entire belief systems. As a final step in this exercise, I'd like for you to think about the core beliefs you have about the work you do in the social impact sector, and what you expect that work will help achieve for people in need. Now, ask yourself, what would it take to alter those beliefs, even if it meant radically shifting the entire system for how you've expected to serve others?

I hope this exercise has properly set the stage for today's 141st episode of the Terms of Reference Podcast, where we discuss the revolution happening in development and aid assistance, brought about through the practice of impact evaluation. Impact evaluation holds the promise of confirming, or refuting, the effectiveness of the practices, processes and systems we rely upon to help those in need. My guest for this show, David Evans, knows a thing or two about impact evaluations. He is a Lead Economist in the Chief Economist's Office for the Africa Region of the World Bank, where he coordinates impact evaluation work across agriculture, education, health, and social protection in more countries than most people will visit in their lifetimes. I know you're going to love this show as we discuss how we can design evaluations to learn more, how to make evaluation real-time, and, ultimately, how to create an evaluation that will succeed even if you're working for a small NGO.

You can connect with David here: https://www.linkedin.com/in/david-evans-6924821/

IN TOR 141 YOU’LL LEARN ABOUT

  • The fundamental role of evaluation, how little attention it often gets, and David's illustration of the impact evaluation landscape through the lens of education
  • The growing interest in finding methods backed by good evidence that they work, and evidence for methods that do not
  • How recent years have seen a sharp increase in more rigorous impact evaluation studies
  • How better methods mean thinking more broadly about positive outcomes in an evaluation, and the expanding role of the evaluator in terms of government engagement
  • David's three trends for the future of impact evaluation: testing simultaneous interventions, real-time evaluation, and governments taking on evaluation themselves
  • How rigorous, effective evaluation improves monitoring
  • The importance of control groups, and why it's worth investing in statistical rigor no matter how small the intervention is; evaluation does not have to be (too) costly, and it pays off in the long run

OUR CONVERSATION FEATURES THE FOLLOWING

Topics:

  • Impact evaluation
  • Treatment and control groups, statistical methods
  • Education, test scores, attendance, enrollment rates
  • Positive outcomes, expected and unintended
  • Textbooks, supply, scarcity
  • Classroom sizes
  • Monitoring
  • Government engagement, internal capacity, points of contention
  • Real-time evaluations
  • Long term subject follow-up
  • Cross-country coordinated evaluation
  • Cash transfers, effect on alcohol and tobacco consumption
  • Food consumption, budgets
  • Evidence

EPISODE CRIB NOTES

«One of the things that's become the most clear in this impact evaluation revolution is that good sense just isn't enough. So again and again, we have impact evaluation programs that really seem sensible. Smart people agree. This really should work: giving textbooks to students in Sierra Leone or giving microcredit as a poverty alleviation tool. Real impact evaluation can overturn these deeply held beliefs.»

Washington, DC

03:38 Senior economic heavy lifter, Africa Region, World Bank
  • David's focus has been working with low- and middle-income countries
  • Are social programs (health, education) having the desired effect?
  • Are the public investments actually helping people?
  • David sees a heightened interest in methods that shed light on what works
  • In Africa, Latin America and the Caribbean, David has reviewed education programs
  • He currently works at a global education scale
  • This year, the annual World Bank report is on education
  • Spoiler alert: results vary. "Over the past 20 years we have had success in increasing enrollment rates in middle-income countries, getting desired educational results" by officials' and families' standards
  • But there are interventions that really help education: roofs, the quality of textbooks, and the quality of teaching and teachers. "This is where we see the big gains in learning"
06:25 Any specifics?
  • A study by the Poverty Action Lab and Pratham in India compared teaching across states
  • Some states have more diverse student populations than others
  • For an hour a day, schools sorted students by skill instead of grade
  • Those students received targeted teaching
  • This practice had been happening for some time, with organizations supporting its implementation and scale-up
  • It was held up as good practice, but there was no clarity on how good it actually was
  • Textbooks were also studied; there are often not enough for every student
  • Everyone knows more textbooks are good, but how good? Books were delivered incrementally, comparing whether the schools that got books first improved their test scores (see the sketch after this list)
  • Plot twist: no significant change in scores as a result of having a textbook
  • Digging deeper: it turns out the books make it into the school, but not always into the classroom, so kids still lack access. "Sometimes with good reason," as teachers try to protect the books and make them last; it's not corruption or anything like that
  • "Throwing resources at schools does not deliver the gains we want"
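To make the textbook comparison concrete, here is a minimal sketch of how an evaluator might compare test scores between schools that received books in the first phase and schools scheduled to receive them later. The file name, column names, and the simple two-sample t-test are illustrative assumptions, not the actual study's data or method.

```python
# Minimal sketch: compare test scores between schools that got textbooks
# in the first phase ("treated") and schools scheduled to get them later
# ("control"). File and column names are hypothetical.
import pandas as pd
from scipy import stats

df = pd.read_csv("school_test_scores.csv")  # columns: school_id, got_books_first, test_score

treated = df.loc[df["got_books_first"] == 1, "test_score"]
control = df.loc[df["got_books_first"] == 0, "test_score"]

diff = treated.mean() - control.mean()
t_stat, p_value = stats.ttest_ind(treated, control, equal_var=False)  # Welch's t-test

print(f"Difference in mean test scores: {diff:.2f}")
print(f"Welch t-statistic: {t_stat:.2f}, p-value: {p_value:.3f}")
# A large p-value would be consistent with the finding described above:
# no significant change in scores from receiving textbooks first.
```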
11:04 Evaluation, fickle mistress
  • “Let’s use education as a case”
  • The history of impact evaluation in education amounts to a handful of efforts over the last 20 years
  • "I mean those with a control group and rigorous comparability." Back to the textbooks
  • Attention to distilling the factors is quite new
  • But in the past few years, there have been over 120 evaluations in education alone
  • They focus on learning outcomes, enrollment, attendance
  • "They are using better and better methods"
  • Now these are results you can count on
  • "Another big shift is government involvement on a larger scale"
  • Before, evaluations invited doubt about extrapolating outcomes from small development organizations' pilots into a larger context
  • Now governments direct the questions, which have a national dimension
  • Going forward, David sees three things:
  • #1 (Underway) Governments want evaluations of new programs using randomized controlled trials. Multiple-factor designs measure the joint effects of simultaneous interventions (see the sketch after this list)
  • #2 Real time. Moving beyond (or in addition to) surveys. "A year is too long." Immediate decision-making is important. Some new players are betting on real-time education evaluation
  • #3 Governments taking on impact evaluation themselves. Bits of this are showing in Mexico and Brazil, but these efforts usually still rely on outside expertise. Tanzania has well-trained scholars. Inside-outside cooperation is great, but from within government, management can react more quickly, bring sophistication to the culture, and up the ante
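As a rough illustration of what multiple-factor testing can look like, here is a minimal sketch of a 2x2 factorial randomization, in which each unit is assigned independently to two interventions so that their separate and joint effects can be compared. The intervention names (teacher_training, learning_materials) and the data are hypothetical, not taken from any program discussed in the episode.

```python
# Sketch of a 2x2 factorial assignment: schools are randomized independently
# to two hypothetical interventions, producing four cells (neither, A only,
# B only, both), so one evaluation can estimate each effect and their
# combined effect.
import numpy as np
import pandas as pd

rng = np.random.default_rng(seed=42)
n_schools = 200

schools = pd.DataFrame({
    "school_id": np.arange(n_schools),
    "teacher_training": rng.integers(0, 2, size=n_schools),    # intervention A (hypothetical)
    "learning_materials": rng.integers(0, 2, size=n_schools),  # intervention B (hypothetical)
})

# Count how many schools land in each of the four cells.
print(schools.groupby(["teacher_training", "learning_materials"]).size())
```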
19:50 When the evaluation failed
  • "I'm going to flip it around: when do evaluations succeed?" One thing makes a huge difference: the government is active from the beginning and on board with a design that goes beyond a binary thumbs-up/thumbs-down verdict
  • When an evaluator just "shows up with some result," especially an unflattering one, the government can easily push back and discredit it
  • It’s always a more nuanced story
  • The evaluator should take it as critical part of the job to engage with the government early and often
  • Ideally there must be a dialog that can quickly influence programs
  • What to do when a program reveals positive outcomes, just not those the program was intended for? An education intervention showed great results in health
  • The evaluators do require independence from the government, from the design of the evaluation instrument to the data collection and crunching
23:58 You said ‘evaluation’, I thought you said ‘contention’
  • The collaborative element “removes a lot of the contention”
  • David has been on both sides, has felt the tendency to downplay or highlight results
  • Fight the temptation to keep the “meat” at the end. As evaluator, take advantage of the opportunity to add value throughout the intervention’s run
  • In The Gambia, David was working on an evaluation. A government official said: "the problem is teachers don't know what they are teaching"
  • The evaluation was not about that
  • But a follow-up survey was due
  • "Let's add that question to the survey"
  • It turned out the government official was correct
  • Many teachers had yet to master the curriculum
  • The broader evaluation continued
  • But the midpoint survey became the starting point for a different intervention
  • The evaluator should anticipate and defuse points of contention before they get out of hand
27:42 What David's eyes have seen
  • "Real-time evaluations will be a game-changer," without taking room away from the long-term ones
  • In education, 90% of final data collection activities took place less than a year after the intervention first started
  • "We need more evidence on the long term"
  • Measure the same kids 10 years from now (treatment and control groups)
  • Real-time evaluation innovation is set to grow
  • Cross-country coordinated evaluation should also continue
  • Six countries were compared on the effects of their microcredit interventions
  • Overall, the effects on poverty alleviation were low
  • A growing body of evidence allows for more certainty and international validity
  • Effects that hold up across countries are especially valuable
  • Countries new to cash transfers share a common concern
  • "People will splurge the free money"
  • As it turns out, almost nobody was looking into how much of cash transfers gets spent on alcohol, tobacco and the like
  • "A colleague of mine and I reviewed the existing literature. We brought it together"
  • In virtually every case, there is no increase in consumption
  • When you pick up your money, an officer reminds you "it's for your children"
  • There's no evidence cash transfers correlate with alcohol or tobacco consumption
  • Stephen: Similar common wisdom exists on land allocation (check out TOR Guest 105, Annie Duflo from IPA)
33:09 How do standard results fly in the face of cultural relativism?
  • "We're all playing with a different hand, but we may have cards in common"
  • While interventions can show consistent results across countries, and relationships might hold up, a deeper look often shows clear differences
  • Classroom sizes are one example: their relationship with learning outcomes varies hugely from country to country
  • "You want to test and learn, case by case"
  • "Best evaluation improves the quality of monitoring"
  • An impact evaluation project in Kenya focused on hospitals and the quality of care
  • A scorecard was built for the evaluation
  • The government took the scorecard and now uses it to monitor quality of care beyond the program
  • The quality of data collection is better now, and the system is better off
  • Good evaluation benefits monitoring quality
37:35 Rigor V. Depth V. Impact V. Boutique NGO
  • How resource-intensive is this kind of impact evaluation?
  • "True. Small NGOs just want to serve people"
  • "Why would I even collect data on a control group that sees no benefit?"
  • Two things to keep in mind:
  • #1 Evaluation does not have to be expensive
  • Pick the 2 or 3 indicators you care about
  • Gather data for them
  • Just get a credible way to measure change as a result of the intervention
  • Find a credible enough comparison group (a minimal sketch of such a comparison follows this list)
  • Small NGOs can actually do this precisely because their reach is limited
  • They begin in one place, gathering data in the places where they cannot yet fund the intervention, and those places serve as the comparison
  • #2 At the end of the day, "good sense is not enough"
  • No matter how confident the experts are
  • If they have no data
  • Back to the cash transfer case in Tanzania
  • The transfers were supposed to trigger further food consumption
  • That did not happen
  • Households did not have more meals a day
  • Households did not have fuller meals
  • Yet when treatment group people were interviewed, they said:
  • "We spent the cash on food"
  • "We did a little home improvement"
  • "But we spent the cash on food"
  • The first read of the data showed no such thing
  • A deeper look at the data revealed food consumption did increase
  • Just not only in treatment households, but in comparison households as well
  • It was a particularly good economic time in Tanzania
  • People were doing a little bit better
  • They were investing in food
  • Everyone, not only treatment group people
  • Then came the cash transfers
  • Households bought better health care with them
  • And shoes for the children
  • "It's so important to gather data on a comparable group"
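To make the comparison-group point concrete for a small NGO, here is a minimal sketch of a difference-in-differences style comparison: track the same indicator at baseline and follow-up both in the sites you serve and in the sites you have not yet reached. The file name, column names, and the meals-per-day indicator are assumptions for illustration, not the Tanzania study's actual data or method.

```python
# Minimal difference-in-differences sketch for a small NGO: compare the
# change in an indicator (e.g., meals per day) in intervention sites with
# the change in not-yet-served comparison sites over the same period.
# File and column names are hypothetical.
import pandas as pd

df = pd.read_csv("household_survey.csv")
# Expected columns: household_id, site_type ("intervention" or "comparison"),
# period ("baseline" or "followup"), meals_per_day

means = df.groupby(["site_type", "period"])["meals_per_day"].mean()

change_intervention = means["intervention", "followup"] - means["intervention", "baseline"]
change_comparison = means["comparison", "followup"] - means["comparison", "baseline"]

# The difference-in-differences: how much more (or less) the intervention
# sites changed than the comparison sites. Without the comparison group, a
# good economic year (as in the Tanzania example) could be mistaken for a
# program effect.
print(f"Change in intervention sites: {change_intervention:.2f}")
print(f"Change in comparison sites:   {change_comparison:.2f}")
print(f"Difference-in-differences:    {change_intervention - change_comparison:.2f}")
```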
42:53 David's stars
  • David (and others) writes on the World Bank's impact evaluation blog
  • Chris Blattman from U. Chicago
  • Poverty Action Lab
  • Innovations for Poverty Action
  • World Bank Development Impact Evaluation Group (DIME)
  • World Bank Strategic Impact Evaluation Fund
  • Civil Service Reform
  • Impact Evaluation in Low and Middle Income Country Bureaucracy

 

Please share and participate

If you have any questions you’d like to ask me or David directly, head on over to the Ask Stephen section. Don’t be shy! Every question is important and I answer every single one. And, if you truly enjoyed this episode and want to make sure others know about it, please share it now:
Also, ratings and reviews on iTunes are very helpful. Please take a moment to leave an honest review for The TOR Podcast!

Love this show? Tell us about why (or why not) below:
