

Manage Successful Impact Evaluations

June 10-14, 2019

Washington, DC

  • This course is designed for staff and short-term consultants responsible for managing impact evaluations in the field. It aims to strengthen the skills and knowledge of impact evaluation (IE) practitioners, familiarizing them with protocols designed by DIME teams based on their experience with critical issues in IE implementation, recurring challenges, and cutting-edge technologies. The course covers impact evaluation tools and concepts, but the primary focus is on how to successfully manage impact evaluations in the field. Mornings consist of lectures and hands-on sessions in which participants work together to apply what they've learned to impact evaluation case studies. Afternoons are interactive, computer-based lab sessions, giving participants a first-hand opportunity to develop skills. Lab sessions are offered in parallel tracks, with different options based on software preference and skill level.

    Participants will learn to:

    • Plan for and supervise high-quality surveys
    • Design and program electronic survey instruments
    • Develop a data quality assurance strategy (a minimal sketch follows this list)
    • Monitor survey data and provide real-time feedback to field teams
    • Integrate M&E systems with impact evaluation
    • Manage complex survey data and produce descriptive analysis for policy makers
    • Understand how impact evaluation fits in World Bank operations
    • Effectively communicate IE results to policymakers
    • Use geospatial data for impact evaluations
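
    As an illustration of the data quality work covered in the lab sessions, here is a minimal sketch of a high-frequency check, written in Python with pandas. This is an illustration rather than course material (the labs are offered in several software packages), and the column names (interview_id, enumerator, age, consumption) and thresholds are hypothetical:

      import pandas as pd

      def quality_report(df):
          """Return flags for duplicate IDs, implausible values, and missingness."""
          report = {}
          # Duplicated interview IDs often mean a form was submitted twice
          # or an ID was mis-keyed in the field.
          report["duplicate_ids"] = df[df.duplicated("interview_id", keep=False)]
          # Range check: flag respondent ages outside a plausible interval.
          report["age_out_of_range"] = df[(df["age"] < 15) | (df["age"] > 100)]
          # Share of missing responses per enumerator; persistent gaps can
          # signal skipped modules and feed real-time feedback to field teams.
          report["missing_share_by_enumerator"] = (
              df.groupby("enumerator")["consumption"].apply(lambda s: s.isna().mean())
          )
          return report

      # Toy stand-in for one day of incoming survey submissions.
      submissions = pd.DataFrame({
          "interview_id": [101, 102, 102, 104],
          "enumerator": ["A", "A", "B", "B"],
          "age": [34, 7, 29, 41],
          "consumption": [120.0, None, 95.5, None],
      })

      for name, flagged in quality_report(submissions).items():
          print(f"--- {name} ---")
          print(flagged, end="\n\n")

    In practice a check like this would run daily against incoming submissions, with flagged cases sent back to field supervisors the same day.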


  • Monday, June 10, 2019

    9:00–9:30    Welcome and introductions
    9:30–10:00   Impact Evaluation at the World Bank – Arianna Legovini
    10:00–10:30  IE team roles and acronyms – Saahil Karpe & Chloe Fernandez
    10:30–10:45  Coffee break
    10:45–11:30  Preparing for Data Collection – Erick Baumgartner & Guillaume Gatera
    11:30–12:30  Survey Instrument Design & Pilot – Steven Glover & Sekou Kone

    Tuesday, June 11, 2019

    9:00–9:45    Working with Survey Firms – Nausheen Khan
    9:45–10:30   Training Data Collectors – Aram Gassama
    10:30–10:45  Coffee break
    10:45–11:15  What does DIME do? – Florence Kondylis & Sveta Milusheva

    Wednesday, June 12, 2019

    9:00–9:30    Impact Evaluation in the context of World Bank operations – Vincenzo di Maro
    9:30–10:00   Research Ethics & Data Security – Roshni Khincha
    10:00–10:30  Data Quality Assurance, Part 1 – Margherita Fornasari & Steven Glover
    10:30–10:45  Coffee break
    10:45–11:30  Data Quality Assurance, Part 2 – Margherita Fornasari & Steven Glover

    Thursday, June 13, 2019

    9:00–9:45    Impact evaluation for iterative learning: evidence from Rwanda – Florence Kondylis & John Loeser
    9:45–10:15   Qualitative Research for Quantitative Impact Evaluations – Emily Crawford
    10:30–10:45  Coffee break
    10:45–11:30  Sampling for Impact Evaluation – Track 1: Aidan Coville; Track 2: Maria Jones

    Friday, June 14, 2019

    9:00–10:00   How to Effectively Communicate with Policy Makers – David Evans
    10:00–10:30  Communicating with Policy Makers: A case study from Rwanda – Innocent Musabyimana
    10:30–10:45  Coffee break
    10:45–11:15  What does DIME do? – Aidan Coville & Vincenzo di Maro

  • In June 2019, DIME hosted the fifth annual Manage Successful Impact Evaluations course. New additions for 2019 included hands-on sessions, in which participants applied the lectures' content to real-life impact evaluation scenarios, and a career panel, in which former DIME field coordinators shared their experiences of moving on to different career paths after working in the field. Instructors included staff from DECIE, DECDG, SurveyCTO (a computer-assisted survey software firm), and a guest lecturer from the Center for Global Development. Presentations and training materials from the course are publicly available.

    66 people participated in the course in person in Washington, DC. Most participants (83%) were World Bank staff. The remaining participants came from universities and research institutes (George Washington University, Stanford University, Eastern Mennonite University), governments (including public officials from Cabo Verde, Turkey, and Argentina), NGOs (e.g. Atlas Corps, Treasureland Initiative), and survey firms.


    205 people registered to participate in the course via WebEx. A quarter of the remote participants work at universities or research institutes (e.g. Georgetown University, University of Chicago, Michigan State University, The State University of Zanzibar, University of Limpopo). 23% work in the private sector, including survey firms, consultancy firms (Deloitte, PwC, KPMG), and companies such as Gray Matters India. One fifth work for NGOs, e.g. Save the Children, VillageReach, Atlas Corps, and TradeMark East Africa. Government officials from 12 countries (including the US, Turkey, Somalia, and Peru) and staff of United Nations agencies (WHO, WFP, UNICEF) each make up around 8% of participants. The remainder come from multilateral development banks (the World Bank, the Inter-American Development Bank, and the Asian Development Bank) and from donors and aid agencies (DFID, GIZ, EU). 28 of the remote participants followed more than half of the five-day course, and 18 attended all remote sessions.


    Course participants work in a variety of regions. Most work in Sub-Saharan Africa (45% of in-person and 31% of remote participants), South Asia (25% of in-person and 10% of remote), or Latin America (13% of in-person, 4% of remote). Among remote participants, the most common work region is North America (37%).

    In-person participants took a knowledge test at the beginning and end of the course to measure learning outcomes. The test had 16 questions on technical topics addressed in the training, such as protocols for survey piloting and back-checks, how to select enumerators, what constitutes personal identifiers in data, the effect of take-up on statistical power, and programming in Stata. We see substantial and statistically significant knowledge gains: average test scores increased by 56%. Across the test's sub-categories, the largest gains were in impact evaluation methods, where scores more than doubled.
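
    As background on the take-up point (a standard power-calculation result, stated here for context rather than quoted from the test): if a share p of those assigned to treatment actually takes it up, and no one in the control group does, the measurable intent-to-treat effect shrinks to p times the treatment effect τ, so the sample size needed to detect a given τ grows with the square of 1/p:

        \[
          \text{ITT} = p\,\tau
          \quad\Longrightarrow\quad
          n \;\propto\; \frac{1}{p^{2}}
        \]

    At 50% take-up, for instance, a study needs roughly four times the sample it would need under full compliance.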


    Participants found the sessions highly relevant to their work and well delivered. All sessions were rated as well or very well presented, both in person and on WebEx.

