
TOR114 – 3ie With Dr. Jyotsna Puri


Listen Now


“There are well-intentioned people doing well-intentioned policies, and somehow they fail.” There is a good deal of energy in the development and humanitarian space focused on building an evidence base for what works – and what doesn’t. Here on the Terms of Reference Podcast, we’ve talked with numerous individuals and organizations who are building data sets toward that end, and the International Initiative for Impact Evaluation – or 3ie – has been contributing to this conversation since its founding in 2008. To date, they’ve funded 146 impact evaluations, 33 systematic reviews and 38 other studies in over 50 countries. But how do we properly reflect on and communicate the evidence we’ve collected, and the resulting analysis, so that development and humanitarian actors can use it to design (and deliver) better programming? I discuss this and a host of other topics on the 114th episode of the Terms of Reference Podcast with my guest, Dr. Jyotsna Puri. Jo is the Deputy Executive Director and Head of Evaluation at 3ie and has more than 21 years of experience in policy research and development evaluation. You can connect with Dr. Puri on LinkedIn (https://in.linkedin.com/in/jo-puri-54815520) and Twitter (https://twitter.com/jo_puri).

IN TOR 114 YOU’LL LEARN ABOUT:

  • The importance of standardized methods for generating evidence.
  • How neutrality about approaches fosters a better understanding of evidence.
  • 3ie’s flavor of impact evaluation: less rigidly systematic, more in tune with the zeitgeist, and involving stakeholders throughout.
  • Jyotsna’s optimistic view of development conversations increasingly mediated by evidence, evidence standards and frameworks.

OUR CONVERSATION INCLUDES THE FOLLOWING:

Organizations

  • International Initiative for Impact Evaluation (3ie) http://www.3ieimpact.org
  • Johns Hopkins University
  • Alliance for a Green Revolution in Africa (AGRA)
  • Innovations for Poverty Action (IPA)

Topics

  • Impact Evaluation
  • Systematic Reviews
  • High Quality Data
  • Natural Resource Management, Decentralization, Governance
  • Approach Neutrality
  • Big Data, Satellite Data
  • Behavioral Science, Implementation Science

Places

  • New Delhi, India
  • Colombia

EPISODE CRIB NOTES

3ie

  • Founded on the recognition that there was little information on what works and what does not. High-quality data to inform decisions: “We are looking back to see what we are learning.” Hence impact evaluations and systematic reviews, and progress in standardizing the practice.
  • In 2014 impact evaluation reached 1B in total funding, with nothing to show for it. “What takes people out of poverty? Can they?” Which strategy works best: asset transfers, education? What is not working, yet still gets routinely funded? Payments not to deforest; sustainability issues.
  • A clear and present success: environmental policy. Decentralized natural resource management systems and their link with agriculture and food sustainability.
  • The evaluation process is “organic” and involves stakeholders: “We pay attention to the zeitgeist.” Questions are asked internally; the next step is “evidence gap maps.” “It’s not as systematic as expected.”
  • Transparency and accountability in sustainable natural resource governance. There is still a lot of nothing to show from impact evaluation.

Money

  • “We started to shy from advocating policy,” including cash transfers. 3ie takes positions on standards of evidence; Johns Hopkins helps on validity frameworks.

The world’s relation with evidence

  • 3ie’s neutrality on approach helps build bridges and respect for the evidence framework. Evidence is becoming global, and agendas that prefer one approach over another are becoming transparent.
  • Stephen: is evidence prioritization an issue? “Not so much.”

Sanitation

  • Geneva. Impact evaluations have advanced into baseline goals and expectations for water and sanitation projects. People are starting to understand the value of learning, which is nurtured by transparency. Going beyond insights.

Keeping up with the demands

  • “So much exciting stuff happening.” #1: data. Big data, geographic, disaggregated, satellite. The Alliance for a Green Revolution in Africa (AGRA): agricultural innovation for small farmers, high-resolution soil maps.
  • Behavioral science. People are turning to data to inform their efforts, but the missing link is behavioral insight. How do decisions based on evidence look? “High quality evidence becomes harder to ignore. I’m an optimist.”
  • Approaches are judged by results: bias, disaggregation, and statistical issues around causation versus correlation (see the sketch after these notes). Cost-benefit analysis also matters, as does asking who it is working for most. Evaluation methods are largely a reflection of PhD program curricula.

3ie funding

  • Donors; events raise funds; thematic windows are paid events. Agencies provide funding so 3ie can create working spaces and raise awareness of evaluation. A network of experts from universities and centers; IPA is a “grantee.”

Big

  • The end of development assistance at the expense of a growth in the evidence market. Governance; climate change. Method researchers branching out of academia: multidisciplinary, mixed methods, implementation science, behavioral science, applied to sanitation and open defecation in India.
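To make the causation-versus-correlation point above concrete, here is a minimal, purely illustrative Python sketch. It is not 3ie’s methodology and every number in it is made up: a hypothetical program, a simulated population, and an unobserved “motivation” trait that drives both income and self-selection. It shows why a naive comparison of participants and non-participants picks up selection bias, while a randomized comparison recovers the true effect.

```python
# Toy illustration (not 3ie's methodology): naive correlational comparison
# versus a randomized evaluation, using simulated data.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Hypothetical population: baseline income depends on an unobserved trait
# ("motivation") that also drives self-selection into the program.
motivation = rng.normal(0, 1, n)
baseline_income = 100 + 10 * motivation + rng.normal(0, 5, n)
true_effect = 7.0  # by construction, the program raises income by 7 units

# 1) Observational setting: more motivated people opt in more often.
self_selected = rng.random(n) < 1 / (1 + np.exp(-motivation))
obs_income = baseline_income + true_effect * self_selected
naive_estimate = obs_income[self_selected].mean() - obs_income[~self_selected].mean()

# 2) Randomized evaluation: a coin flip decides who gets the program.
randomized = rng.random(n) < 0.5
rct_income = baseline_income + true_effect * randomized
rct_estimate = rct_income[randomized].mean() - rct_income[~randomized].mean()

print(f"True effect:            {true_effect:.1f}")
print(f"Naive (correlational):  {naive_estimate:.1f}  <- inflated by selection bias")
print(f"Randomized comparison:  {rct_estimate:.1f}  <- close to the true effect")
```

Running it prints a naive estimate well above the true effect of 7 and a randomized estimate close to it, which is the basic argument for the kind of rigorous impact evaluations 3ie funds.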

Please share, participate and leave feedback below!

If you have any feedback you’d like to share for me or Jo, please leave your thoughts in the comment section below! I read all of them and will definitely take part in the conversation. If you have any questions you’d like to ask me directly, head on over to the Ask Stephen section. Don’t be shy! Every question is important and I answer every single one. And, if you truly enjoyed this episode and want to make sure others know about it, please share it now:
Also, ratings and reviews on iTunes are very helpful. Please take a moment to leave an honest review for The TOR Podcast!
