Quality Improvement System (QIS) exposure moves afterschool programs to higher quality, increasing access to developmentally powerful settings and building children's social and emotional learning (SEL) skills. Higher quality is defined in terms of the quality of instruction (i.e., individuation, basic/advanced SEL, enrichment content), the stability of staff tenure, and evidence of children's SEL skill growth.
In this study, we used performance data generated by Prime Time Inc. in Palm Beach County and a fully pattern-centered methodology to describe the chain of causal effects as a cascade of sequential impacts. We sought to answer two specific questions about implementation and children's SEL skill growth: What is the impact of QIS exposure on program quality (i.e., best practices, low staff turnover, strong content), particularly for programs with lower program quality at baseline? What is the impact of exposure to high program quality on student SEL skills?
Findings demonstrate that (1) QIS exposure causes program quality improvement to occur and (2) exposure to high quality corresponds to SEL skill growth. Specifically, (1.a) quality increased dramatically over three years of exposure to the Palm Beach County QIS; (1.b) programs with Low Quality at QIS entry improved when exposed to even moderate QIS Fidelity; (2.a) children exposed to higher-quality programs had greater SEL skill maintenance and gains compared to children exposed to lower-quality programs; and (2.b) children with Low SEL Skill at entry made greater gains at all levels of program quality.
This pattern of findings suggests that the Prime Time QIS design is successfully building the quality of services available in the county in substantively meaningful ways – by increasing the quality of instruction, increasing the tenure of staff, and growing SEL skills for students who need it most.
During times of crisis, when programs are under tremendous pressure, evaluation and assessment can be challenging. Programs enter triage mode, putting their limited time and energy into the most urgent tasks. This heightens the need for evaluation that reduces strain and builds capacity. When the conditions that created the crisis are long-lasting, like the coronavirus pandemic, it becomes necessary to revisit and restore vital activities that may have been moved to the back burner, and doing this successfully often requires intelligent redesign. How can the same needs be met in a new way? How can evaluation and assessment be adapted to succeed in challenging conditions?
QTurn has developed a comprehensive evaluation plan for afterschool programs at a moment when programs must redesign their services and deliver the redesigned services at the same time. This plan, the "Afterschool Evaluation Plan 2020" (AEP 2020), was developed to address the unique needs of programs in the 2020-2021 school year and to support compliance with the specific requirements for 21st Century Community Learning Centers (21st CCLC) programs. An evaluation plan for 2020 must remove burdens rather than add to them, making life easier, not harder. Aware of these needs, QTurn's design includes short, validated assessment tools, guidance available through online trainings, and a reassuring, therapeutic approach referred to as the "lower stakes" model.
A Lower Stakes Approach
The lower stakes model is a core component of QTurn's work. Lower stakes means that the results of assessments are used to support and inform, and that program staff are able to interpret the meaning of their own individual and group performance data. In a lower stakes model, the results of assessments do not influence funding or prompt sanctions. Instead, low assessment scores are opportunities for mutual learning, support, and growth.
While this approach has been integral to the work of QTurn’s founder, Charles Smith, for decades, lower stakes is especially critical in the 2020-2021 school year as program staff strive to adapt to a new normal. The AEP 2020 is intentionally designed to alleviate stress and confusion and help staff adapt to rapid change and achieve shared meaning.
User-Friendly Assessment Tools
The Afterschool Evaluation Plan 2020 includes three assessment tools designed for remote or in-person programming (or a combination of both). The first measures fidelity to best practices at the management level. The second captures quality at the point of service (and applies to home learning environments). The third charts the growth of social and emotional learning (SEL) skills among youth. The tools are short and easy to use, and because they are designed to work together as part of a continuous improvement cycle, they also support impact evaluation.
Management Practices Self-Assessment (MPSA). The MPSA was developed with extensive input from 21st CCLC project directors and aligns with the core requirements for 21st CCLC programs in Michigan. With 24 indicators forming eight standards in four domains, the tool requires less than two hours for program managers to complete.
Guidance for Out-of-School Time Learning at a Distance (GOLD) Self-Assessment. Site managers and staff complete this self-assessment, which was produced with extensive input from expert 21st CCLC site managers. GOLD contains 27 indicators that form 11 standards in four domains. The four domains represent point-of-service quality in the individual learning environment.
Adult Rating of Youth Behavior (ARYB). Each child is rated on the ARYB in November and April. By completing the assessment at two time points (earlier in the school year and again toward the end), the ARYB captures growth in social and emotional skills across the school year. The ARYB has 30 items that form six skill domains, including emotion management, teamwork, and responsibility.
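To make the two-time-point design concrete, here is a minimal sketch in Python of how fall-to-spring growth scores could be computed from ARYB-style ratings. The item names, the rating values, and the item-to-domain mapping are illustrative assumptions for this example, not the published ARYB instrument.

```python
from statistics import mean

# Illustrative sketch only: the item names, rating values, and domain
# mapping below are assumptions for this example, not the published ARYB.
DOMAIN_ITEMS = {
    "emotion_management": ["em_1", "em_2", "em_3", "em_4", "em_5"],
    "teamwork":           ["tw_1", "tw_2", "tw_3", "tw_4", "tw_5"],
    "responsibility":     ["re_1", "re_2", "re_3", "re_4", "re_5"],
    # ...the remaining three domains would complete the 30-item design.
}

def domain_scores(ratings):
    """Average a child's item ratings within each skill domain."""
    return {domain: mean(ratings[item] for item in items)
            for domain, items in DOMAIN_ITEMS.items()}

def skill_growth(november, april):
    """Per-domain change across the school year (April minus November)."""
    fall, spring = domain_scores(november), domain_scores(april)
    return {domain: spring[domain] - fall[domain] for domain in fall}

# Example: a child rated 2 on every item in November and 3 in April.
nov = {item: 2 for items in DOMAIN_ITEMS.values() for item in items}
apr = {item: 3 for items in DOMAIN_ITEMS.values() for item in items}
print(skill_growth(nov, apr))  # each listed domain grows by 1
```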
Additional Features:
Cost-Effective. The AEP 2020 can be adapted to a wide range of cost structures. The guidebooks and scoring forms are available for free to all users. Additionally, the demands on time are low: completing the assessments can require as little as two hours per year for program directors and three hours per year for site managers and staff.
Guidance Through Online Training. QTurn will offer live online trainings covering the use of the MPSA, GOLD, and ARYB. Support also includes online trainings that equip leaders and staff to do data-informed planning.
Emphasis on School-Day Alignment. QTurn’s AEP 2020 helps programs pivot toward greater integration with schools during a time when school has become more challenging for many children.
Support for Impact Evaluation. Finally, data obtained using the assessment tools can be used to evaluate the overall impact of programs, particularly across multiple programs.
To adopt the AEP 2020, begin by downloading the assessment tools and resources.
For support with implementation of the AEP 2020, please contact the QTurn Team.
Overall, 24 sites within 17 grantees participated in the self-assessment pilot study by assembling staff teams to collect data and score the Youth Program Quality Assessment (Youth PQA).
At each site, an average of five staff spent an average of 13 staff hours completing the self-assessment process.
Whether using an absolute standard or group norms as the benchmark for interpretation, data from the Youth PQA Self-Assessment Pilot Study (hereafter called the Pilot Study) showed quality scores that were very positive for participating programs, while also reflecting the tendency of self-assessment scores to be biased toward higher quality levels.
The quality scores followed the same pattern as outside observer scores in other samples: highest for issues of safety and staff support and lowest on higher-order practices focused on interaction and engagement.
Youth PQA data collected using the self-assessment method demonstrated promising patterns of both internal consistency and concurrent validity with aligned youth survey responses (see the illustrative sketch after these findings).
Two-thirds or more of sites reported that the observation and scoring process helped the self-assessment team gain greater insight into the operation of their programs, talk in greater depth than usual about program quality, and develop a more concrete understanding of program quality.
Site directors and local evaluators said that the self-assessment process was a source of good conversations about program priorities and how to meet them. In almost all cases, concrete action followed from the self-assessment process.
Site directors and local evaluators demonstrated the ability to adapt the self-assessment method to fit local needs.
Program directors, site coordinators, and local evaluators have used the Youth PQA and statewide Youth PQA data to generate statewide program change models, suggesting that the instrument and data are useful for setting system-level improvement priorities.
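For readers less familiar with the psychometric checks referenced above, the sketch below shows, in Python, the two computations involved: Cronbach's alpha for internal consistency and a Pearson correlation between scale scores and an external criterion for concurrent validity. The function names and data values are illustrative assumptions, not values or code from the Pilot Study analysis.

```python
from statistics import pvariance, correlation  # correlation requires Python 3.10+

def cronbach_alpha(item_scores):
    """Cronbach's alpha: internal consistency of a multi-item scale.

    item_scores[i][j] is item i's score for respondent j.
    """
    k = len(item_scores)
    item_variance = sum(pvariance(scores) for scores in item_scores)
    totals = [sum(resp) for resp in zip(*item_scores)]
    return (k / (k - 1)) * (1 - item_variance / pvariance(totals))

# Concurrent validity: correlate scale scores with an external criterion.
# Both lists below are made-up numbers standing in for site-level Youth PQA
# scale scores and aligned youth survey scores.
pqa_scale = [3.1, 4.2, 2.8, 3.9, 4.5]
youth_survey = [2.9, 4.0, 3.0, 3.7, 4.4]
print(correlation(pqa_scale, youth_survey))  # Pearson r, close to +1 here
```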