Impact Evaluation for the Palm Beach County Quality Improvement System 

Quality Improvement System (QIS) exposure moves afterschool programs toward higher quality, increasing access to developmentally powerful settings and building children’s social and emotional learning (SEL) skills. Higher quality is defined in terms of the quality of instruction (i.e., individuation, basic/advanced SEL, enrichment content), the stability of staff tenure, and evidence of children’s SEL skill growth.

In this study, we used performance data generated by Prime Time Inc. in Palm Beach County and a fully pattern-centered methodology to describe the chain of causal effects as a cascade of sequential impacts. We sought to answer two specific questions about implementation and children’s SEL skill growth: What is the impact of QIS exposure on program quality (i.e., best practices, low staff turnover, strong content), particularly for programs with lower quality at baseline? What is the impact of exposure to high program quality on students’ SEL skills?

Findings demonstrate that (1) QIS exposure causes program quality to improve and (2) exposure to high quality corresponds to SEL skill growth. Specifically, (1.a) quality increased dramatically over three years of exposure to the Palm Beach County QIS; (1.b) programs with Low Quality at QIS entry improved when exposed to even moderate QIS Fidelity; (2.a) children exposed to higher-quality programs had greater SEL skill maintenance and gains compared to children exposed to lower-quality programs; and (2.b) children with Low SEL Skill at entry made greater gains at all levels of program quality.

This pattern of findings suggests that the Prime Time QIS design is building the quality of services available in the county in substantively meaningful ways: by increasing the quality of instruction, increasing the tenure of staff, and growing SEL skills for the students who need them most.

Findings from the Self-Assessment Pilot in Michigan 21st Century Learning Centers

Overall, 24 sites across 17 grantees participated in the self-assessment pilot study by assembling staff teams to collect data and score the Youth Program Quality Assessment (Youth PQA).

At each site, an average of 5 staff members spent an average of 13 staff hours completing the self-assessment process.

Whether judged against an absolute standard or group norms, quality scores from the Youth PQA Self-Assessment Pilot Study (hereafter, the Pilot Study) were very positive for participating programs, though they also reflected the tendency of self-assessment scores to be biased toward higher quality levels.

The quality scores followed the same pattern as outside-observer scores in other samples: highest for issues of safety and staff support, and lowest for higher-order practices focused on interaction and engagement.

Youth PQA data collected using the self-assessment method demonstrated promising patterns of both internal consistency and concurrent validity with aligned youth survey responses.

Two-thirds or more of sites reported that the observation and scoring process helped the self-assessment team gain greater insight into the operation of their programs, talk in greater depth than usual about program quality, and develop a more concrete understanding of program quality.

Site directors and local evaluators said that the self-assessment process was a source of good conversations about program priorities and how to meet them. In almost all cases, concrete action followed from the self-assessment process.

Site directors and local evaluators demonstrated the ability to adapt the self-assessment method to fit local needs.

Program directors, site coordinators, and local evaluators have used the Youth PQA and statewide Youth PQA data to generate statewide program change models, suggesting that the instrument and data are useful for setting system-level improvement priorities.