FROM THE FIELD
Putting Rigorous Evidence Within Reach: Lessons Learned from the New Heights Evaluation

Susan Zief · John Deke · Ruth Neild
© The Author(s) 2020
Abstract

Purpose: This article uses an evaluation of New Heights, a school-based program for pregnant and parenting teens in the District of Columbia Public Schools, to illustrate how maternal and child health programs can obtain rigorous evaluations at reasonable cost using extant administrative data. The key purpose of the article is to draw out lessons learned about planning and conducting this type of evaluation, including the important role of partnerships between program staff and evaluators.

Description: This article summarizes the evaluation’s research design, data sources, and lessons learned about ingredients contributing to the successful implementation of this study. The evaluation employed a difference-in-differences design to estimate program impacts using administrative data merged across agencies.

Assessment: Several features of New Heights and its context facilitated an evaluation. First, New Heights leaders could clearly describe program components and how the program was expected to improve specific student education outcomes. These outcomes were easy to measure for program and comparison groups using administrative data, which agencies were willing to provide. Second, buy-in from program staff facilitated study approval, data agreements, and unanticipated opportunities to learn about program implementation. Finally, time spent by evaluators and program staff in conversation about the program’s components, context, and data resulted in greater understanding and a more useful evaluation.

Conclusion: The New Heights evaluation is a concrete example of how a small program with a modest evaluation budget can obtain evidence of impact. Collaborative relationships between researchers and program staff can enable these informative studies to flourish.

Keywords: Teen parenting · Teen pregnancy · High school graduation · Program evaluation · Administrative data
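The abstract notes that the evaluation used a difference-in-differences design estimated from administrative data merged across agencies. As a minimal illustration of that general approach, and not the authors’ actual estimation code, the sketch below fits a standard two-group, two-period difference-in-differences regression in Python with statsmodels; the file name, the column names (outcome, treated, post, student_id), and the clustering choice are assumptions made for the example.

```python
# Minimal difference-in-differences sketch (illustrative only; not the
# New Heights evaluation's actual code). The input file and column names
# are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

# One row per student per period from merged administrative records:
#   outcome    - education outcome (e.g., credits earned)
#   treated    - 1 if the student was in the program group, else 0
#   post       - 1 for the post-program period, else 0
#   student_id - identifier used to cluster standard errors
df = pd.read_csv("merged_admin_records.csv")  # hypothetical file

# The coefficient on treated:post is the difference-in-differences impact
# estimate: the over-time change for the program group minus the
# over-time change for the comparison group.
model = smf.ols("outcome ~ treated + post + treated:post", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["student_id"]}
)
print(model.params["treated:post"])  # estimated program impact
print(model.summary())
```

In practice an evaluation like this would likely include additional covariates and multiple pre- and post-program periods; the sketch shows only the core interaction logic that identifies the impact estimate.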
Significance
Programs that seek to improve outcomes for parents and children are increasingly asked to provide rigorous evidence of their effectiveness. However, programs without large budgets for evaluation can be daunted by the apparent challenges of building rigorous evidence about the effectiveness of their approach. Identifying an appropriate comparison group and obtaining data at reasonable cost are two of the biggest hurdles. The good news is that there are alternative evaluation designs that do not involve expensive sample recruitment or primary data collection.
Introduction

Programs that seek to improve outcomes for parents and children are increasingly asked to provide rigorous evidence of their effectiveness. Typically, programs must provide evidence that outcomes for program participants improved relative to a comparison group that did not experience the program. Funders, including government agencies and private philanthropy, may want this evidence to understand the im