Promoting Healthy Development of Young People: Outcomes Framework 2.0

In the summer of 2018, the Local Government Association (LGA) in England commissioned the Centre for Youth Impact to produce an outcomes framework to help partners across the English youth sector develop and agree on shared aims for supporting young people in their local areas. The work responded to the LGA's consultations that led to its vision statement, described in the report Bright Futures: our vision for youth services, published at the end of 2017. In that report, the authors noted:

“A clear outcomes framework can help to effectively monitor the impact of a service at key milestones to spot where things aren’t working and provide opportunities to make changes where needed. It can also support evidence of collective impact across the system.”

The proposed framework was intended to support partners’ efforts to track and understand the short-, medium-, and longer-term impacts of their work on the lives of young people. The framework needed to be simple and adaptable to provision for different groups of young people and for diverse approaches.

This document is an update on the framework and is the result of two phases of work: an initial phase including desk research and widespread consultation with practitioners, commissioners and elected members, and a second phase to test the proposed framework in action. The work was undertaken by the Centre’s network of regional impact leads and its central team.

Measure Once, Cut Twice: Using Data For Continuous and Impact Evaluation in Education Programs

Frustration and confusion often arise when practitioners require detailed information about program processes for continuous quality improvement (CQI) while policy-makers require evidence of outcome effects for accountability and funding. Impact studies are often preferred over continuous improvement studies, but they seldom offer useful information to practitioners. Per the conference theme, this situation leads to a worldview that emphasizes the limitations of social science methods for achieving practical purposes and welcomes arbitrary decision making (i.e., Type II error) in the absence of better evidence and arguments.

This paper describes a generic quality-outcomes design (Q-O design) that meets the need for a performance measurement methodology supporting concurrent, integrated impact evaluation and continuous improvement in the same organization; that is, measure once, cut twice.

Quality-Outcomes Study for Seattle Public Schools Summer Programs

This quality-outcomes study was designed to both (a) describe performance in Seattle Public Schools (SPS) summer learning programs in ways that are useful to staff and (b) provide evaluative evidence (i.e., validity) for an instructional model that includes challenging academic content and responsive instructional practices.

Results from this study were mainly positive yet partially ambiguous. Summer program offerings were well attended and characterized by high-quality instructional practices, with a majority of students increasing their literacy and math skills during the program. Findings about the association between exposure to more responsive instruction (i.e., higher quality) and academic skill change were mixed.

Results include:

Positive academic skill change was found in the raw data, including for academically at-risk students. Positive change on the academic performance measures used during the summer program was found for 73% of students, and positive change on the academic achievement tests from the 2015 to the 2016 school year was found for 74% of students. Standardized effect sizes for the full sample ranged from medium to large (dz = .56-.95) across the seven academic skill measures.
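For reference, dz is conventionally the standardized effect size for paired (pre-post) designs; assuming that convention applies here, each effect size is the mean of students' change scores divided by the standard deviation of those change scores:

  d_z = \bar{D} / s_D, where D_i = post_i - pre_i

By this metric, a dz of .95 indicates an average gain nearly as large as the standard deviation of the gains themselves.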

Attendance was regular, and instructional responsiveness was consistently high. Summer program attendance for 21 or more days (out of a total possible 27 days) was observed for 77% of students. Analysis of instructional responsiveness using the Summer Learning PQA revealed three profiles of instructional responsiveness at the point of service: high, medium, and low quality. However, compared to other urban samples, the “low” SPS profile is not very low.

Students in SPS summer programs had similar rates of skill change across profiles of instructional responsiveness in the most rigorous models for 3rd and 4th grade students (N = 535); that is, there was insufficient evidence in support of the hypothesized pattern of differential skill change across profiles of instructional quality. However, these results should be interpreted with caution due to the absence of a true low-quality instructional practices subgroup in the sample. Less statistically rigorous but more theoretically well-specified models for the entire K-4 sample (N = 1060) revealed a positive association between instructional quality and academic skill change, despite the lack of a true low-quality subgroup.

Analyses of academically at-risk students revealed similarly mixed results. In the more statistically rigorous models with grades 3-4, students who entered SPS summer programs below proficient on academic achievement tests for the prior school year (2015-16) showed similar rates of academic skill change across profiles of instruction. In the theoretically well-specified models, academically at-risk students showed greater changes in academic skills in summer programs with higher-quality instructional practices.

Evaluation of Afterschool Improvement Process: Oklahoma 21st Century Community Learning Centers

Since 2007, the Oklahoma State Department of Education has operated a quality improvement system (QIS) for its approximately 100 federally funded 21st Century Community Learning Centers (OK 21CCLC) afterschool programs, with the explicit purpose of improving the performance of these service providers. This report draws upon data from 23 performance measures collected annually over multiple program cycles to present findings on reliability, validity, performance change, and the effect of intervention fidelity on performance change. These analyses were conducted as part of an ongoing effort to: (a) evaluate the over-time change in performance that is the central purpose of the QIS and (b) improve the accuracy and usefulness of performance data available to individual organizations that participate in the QIS.

In general, our findings indicate that the Oklahoma Afterschool Improvement Process is performing in accordance with its purposes: using accurate performance data to incentivize improvement in the quality of services.

Findings for the reliability and validity of the measures include:

  • All 23 measures demonstrated acceptable levels of reliability (one common way to compute such a reliability coefficient is sketched after this list).
  • There is evidence for construct validity at each time point and factorial invariance across time points.
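The report does not name the reliability coefficient used. Cronbach's alpha is the most common internal-consistency estimate for multi-item scales of this kind, and values of .70 or above are conventionally treated as acceptable. A minimal sketch of the computation in Python, with simulated item scores (all names and data here are illustrative, not the study's):

    import numpy as np

    def cronbach_alpha(items):
        # Cronbach's alpha for an (n_respondents, k_items) score matrix.
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1)      # variance of each item
        total_var = items.sum(axis=1).var(ddof=1)  # variance of the scale total
        return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

    # Illustration: 200 respondents on 5 items that share a single common factor.
    rng = np.random.default_rng(0)
    common = rng.normal(size=(200, 1))
    scores = common + rng.normal(scale=0.8, size=(200, 5))
    print(round(cronbach_alpha(scores), 2))  # high internal consistency, around .89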

Findings for performance improvement include:

  • Nearly all measures improved incrementally over a four-year period (2010-2013), and a subset demonstrated statistically significant growth.
  • For nearly all measures, sites that were lower-performing in the baseline year (2010-2011) improved most; a subset of models demonstrated statistically significant effects (see the modeling sketch after this list).
  • The indicator with the largest increase over the four years was Targeting At-Risk Students, suggesting that even as the students served became more challenging, service quality was also generally improving.
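The report does not publish its model code, but the growth pattern above (overall improvement, with lower-baseline sites gaining most) is the kind of result typically tested with a mixed-effects growth model. Below is a hedged sketch in Python using statsmodels and simulated site-by-year data; all variable names are illustrative and the data are simulated for the example only:

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Simulate 100 sites observed over four annual cycles; lower-baseline sites
    # are given steeper growth, mimicking the reported pattern.
    rng = np.random.default_rng(0)
    rows = []
    for site in range(100):
        baseline = rng.normal()  # centered baseline performance
        for year in range(4):
            score = baseline + (0.3 - 0.2 * baseline) * year + rng.normal(scale=0.5)
            rows.append({"site": site, "year": year,
                         "baseline": baseline, "score": score})
    df = pd.DataFrame(rows)

    # Random-intercept growth model: a positive `year` slope indicates overall
    # improvement; a negative `baseline:year` interaction indicates that sites
    # starting lower improved fastest.
    model = smf.mixedlm("score ~ year + baseline:year", df, groups=df["site"])
    print(model.fit().summary())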

Findings for intervention fidelity include:

  • Higher fidelity of YPQI implementation was positively associated with growth on nearly all performance measures at over half of all year-to-year increments, in line with the YPQI theory of change.

This report is a supplement to a series of annual reports submitted to the Oklahoma State Department of Education over eight years. Those reports provide the unadjusted information that was used in the models described here. The supplement to the annual performance report for the 2013-14 program year (Sniegowski, Gersh, Smith, & Garner, 2015) provides the unadjusted means and descriptive statistics for all of the items and scales in the study.

Design Study for the Summer Learning Program Quality Intervention (SLPQI)

The Summer Learning Program Quality Intervention (SLPQI) is a continuous improvement intervention for summer learning systems and settings. The intervention includes: (a) standards and measures for high-quality instructional practices, (b) data products and technology for meaningful feedback, (c) a plan-assess-improve cycle at each summer site, and (d) the supports necessary to design and implement the first three parts. The SLPQI focuses on instructional practices that build student skills during the summer and increase school success in subsequent school years.

The SLPQI was the subject of a four-year Design Study involving 152 providers in seven cities. In the final year of the study, the SLPQI was implemented citywide in Denver, CO; St. Paul, MN; and Seattle, WA (N = 106 sites). This report presents final specification of the SLPQI design, supports, measures, and performance benchmarks.

Key findings include:

The SLPQI was implemented at moderate to high fidelity, at scale, in three citywide systems with local provision of supports. The proportion of sites implementing the SLPQI at high fidelity was high in all three systems, and partnerships of school districts, city agencies, community-based providers, and quality intermediary organizations developed capacity to implement the SLPQI at scale. A large proportion of non-school-based sites were connected with information about students’ success in the prior school year.

Summer program staff valued the SLPQI and the assessor-coach role. System leaders, site managers, and assessors reported that implementation of the SLPQI was a good use of their time and a good fit with their work. They also reported that the Summer Learning Program Quality Assessment (PQA) successfully differentiated between higher and lower quality. Staff particularly valued the assessor-coach, who observed sessions, generated performance feedback, and provided coaching for the site manager.

Performance data indicate that instructional quality and student outcomes improved as predicted by the SLPQI theory of change. Instructional quality improved from 2015 to 2016; lower-performing sites improved the most, and high performance was sustained. Innovations focused on identified areas of low quality: students' management of their own executive skills, motivation, and emotions. Students in higher-quality summer settings had greater academic skill gains in both 2015 and 2016 than students in lower-quality settings.

Recommendations include: (a) marketing the SLPQI in cities with strong summer partnerships; (b) marketing the SLPQI to school districts that hope to build summer partnerships; (c) continuing efforts to improve the Summer Learning PQA as a standard for high-quality instruction tailored specifically to students with difficult SEL histories; and (d) conducting a randomized efficacy trial of the SLPQI.

Preparing Youth to Thrive: Methodology and Findings from the SEL Challenge

The Social and Emotional Learning (SEL) Challenge was undertaken in pursuit of two ambitious goals: to identify promising practices for building SEL skills with vulnerable adolescents, and to develop technical supports for use of these SEL practices at scale in thousands of out-of-school time (OST) settings. The study design included a qualitative methodology, expert practitioners, and performance studies at each of eight exemplary programs. The products of the Challenge, standards for SEL practice and a suite of SEL performance measures, are designed to help OST programs focus deeply on SEL practice, assess their strengths, and improve the quality and effectiveness of their services using a continuous improvement approach.

By focusing systematically on adult and youth behavior at a granular level, the Challenge content supports use in practice-oriented settings and systems (youth programs, school-day classrooms, mentorships, residential treatment, apprenticeships, workplaces, families) where the quality of adult-youth interaction and learning is a primary concern. We hope that local policy makers and funders will use the Challenge as a template for identifying the exemplary SEL services already available in their communities and for ensuring that those services are adequately recognized, resourced, and replicated.

The promising practices are featured in a Field Guide, Preparing Youth to Thrive: Promising Practices for Social and Emotional Learning (Smith, McGovern, et al., 2016), a companion website, and a suite of tools and technical assistance. This report, Preparing Youth to Thrive: Methodology and Findings from the SEL Challenge, describes how the partnership carried out the work of the Challenge and what we learned as a result. Findings from the SEL Challenge include:

  1. The Challenge methodology successfully identified exemplary SEL offerings and produced 34 standards, 78 practice indicators, and 327 vignettes for building SEL skills with vulnerable youth.
  2. The suite of performance measures developed for the Challenge is feasible to implement and demonstrates sufficient reliability and validity for both continuous improvement and evaluation uses.
  3. The performance studies indicate that the exemplary offerings were exceptionally high quality compared to other OST programs and that youth skills improved in all six SEL domains. Skill growth also occurred for the higher risk groups. Benchmarks for SEL performance include: (a) Diverse staff and youth, intensive participation, and expert adult guidance; (b) Collaborative organizational cultures; (c) Exceptionally high quality instruction and youth engagement; (d) A consistent pattern of positive SEL skill growth across measures, offerings, and risk status.
  4. The offerings shared an OST-SEL intervention design: project-based learning with intensive co-regulation.

The Discussion section addresses generalizability of findings, cautions about SEL measurement, and study limitations.

Preparing Youth to Thrive: Promising Practices for Social and Emotional Learning

Executive Summary

The Social and Emotional Learning (SEL) Challenge was designed to identify promising practices for building skills in six areas: emotion management, empathy, teamwork, initiative, responsibility, and problem-solving. The Challenge was a partnership between expert practitioners (youth workers, social workers, teachers) delivering exemplary programs in eight unique communities, a team of researchers, and a national foundation.

Although each of the exemplary out-of-school-time (OST) programs that were studied uses a different curriculum, their approaches to building social and emotional skills have important similarities, and these are the subject of the guide. This guide presents 32 standards and 58 indicators of SEL practice in six domains as well as four curriculum features that were shown to be foundational for supporting SEL practices.

For teens, social and emotional learning helps build resilience and a sense of agency, skills critical for navigating toward positive futures of their own design. Social and emotional skills are the skills for action that help youth on that path. These skills go by several names, including 21st-century skills, soft skills, and character education, and they are central to approaches such as experiential learning and positive youth development. We focused on translating the "action" that staff and youth see in exemplary out-of-school-time programs into plain language. The guide sets out to share, widely and in plain language, how professionals can embed practices that support social and emotional learning with greater intentionality.

This guide is designed to start conversations about the kinds of social and emotional skills readers hope will grow in the adolescents they know and care about and to support the adult practices that help these skills to grow. We hope that readers will use the guide to create and pursue their own action plans for implementing SEL in their OST programs and networks. The guide is designed for readers to use on their own terms, not as a book to be read front-to-back—advice to readers is provided at the end of the introduction.

Framing an Evidence-Based Decision About 21st CCLC

Summary

In this commentary we've described a mismatch between the afterschool theory of change and the intent-to-treat evaluation design, suggesting that when these powerful evaluation designs are applied to broad, developmentally focused programs such as 21st CCLC, the effect sizes are likely to be small but substantively important. We've also suggested that afterschool evaluations need to include description and measurement of critical program qualities and of the specific skills these programs aim to grow. An example of a 21st CCLC evaluation that combines an intent-to-treat impact design with a substantial effort to understand how all of the pieces fit together is the Texas 21st CCLC Year 2 Interim Evaluation Report. We further suggested that 21st CCLC has intentionally fostered an important social innovation, the afterschool QIS. Because 21st CCLC represents an ethic of accountability for service quality in many states, the program has also sparked a broader social movement that has improved the state of afterschool for all American youth.

Afterschool Quality

This research article discusses efforts to define and improve the quality of afterschool services, highlighting areas of agreement and identifying leading-edge issues. We conclude that the afterschool field is especially well positioned to deliver high-quality services and demonstrate effectiveness at scale because a strong foundation has been built for continuous improvement of service quality.