Preparing Youth to Thrive: Promising Practices for Social Emotional Learning

Executive Summary

The Social and Emotional Learning (SEL) Challenge was designed to identify promising practices for building skills in six areas: emotion management, empathy, teamwork, initiative, responsibility, and problem-solving. The Challenge was a partnership between expert practitioners (youth workers, social workers, teachers) delivering exemplary programs in eight unique communities, a team of researchers, and a national foundation.

Although each of the exemplary out-of-school-time (OST) programs that were studied uses a different curriculum, their approaches to building social and emotional skills have important similarities, and these are the subject of the guide. This guide presents 32 standards and 58 indicators of SEL practice in six domains as well as four curriculum features that were shown to be foundational for supporting SEL practices.

For teens, social and emotional learning helps build resiliency and a sense of agency, qualities critical for navigating toward positive futures of their own design. Social and emotional skills are the skills for action that help youth on that path. These skills go by several names, including 21st-century skills, soft skills, and character education, and they are central to related approaches such as experiential learning and positive youth development. We focused on translating the “action” that staff and youth see in exemplary out-of-school-time programs into plain language. The guide sets out to share widely, and in plain language, how professionals can embed practices that support social and emotional learning with greater intentionality.

This guide is designed to start conversations about the kinds of social and emotional skills readers hope will grow in the adolescents they know and care about and to support the adult practices that help these skills to grow. We hope that readers will use the guide to create and pursue their own action plans for implementing SEL in their OST programs and networks. The guide is designed for readers to use on their own terms, not as a book to be read front-to-back—advice to readers is provided at the end of the introduction.

Framing an Evidence-Based Decision About 21st CCLC

Summary

In this commentary we’ve described a mismatch between the afterschool theory of change and the intent-to-treat evaluation design, suggesting that when these powerful evaluation designs are applied to broad, developmentally focused programs such as 21st CCLC, the effect sizes are likely to be small but substantively important. We’ve also suggested that afterschool evaluations need to include description and measurement of critical program qualities and of the specific skills these programs are focused on growing. An example of a 21st CCLC evaluation that combines an intent-to-treat impact design with a substantial effort to understand how all of the pieces fit together is the Texas 21st CCLC Year 2 Interim Evaluation Report. We further suggested that 21st CCLC has intentionally fostered an important social innovation, the afterschool quality improvement system (QIS). Because 21st CCLC represents an ethic of accountability for service quality in many states, the program has also sparked a broader social movement that has improved the state of afterschool for all American youth.

Afterschool Quality

This research article discusses efforts to define and improve the quality of afterschool services, highlighting areas of agreement and identifying leading-edge issues. We conclude that the afterschool field is especially well positioned to deliver high-quality services and demonstrate effectiveness at scale because a strong foundation has been built for continuous improvement of service quality.

Moving the Needle on “Moving the Needle”

Summary

This paper introduces the nomenclature of performance-based accountability systems (PBAS) to the expanded learning field, provides a policy case study of a countywide system in southern Florida, and uses data from that system to explore the issue of quality thresholds. We present an expanded design standard to guide development and improvement of PBAS policies and further develop a theory of lower-stakes accountability to guide effective use of incentives of various types. Findings suggest that (1) the PBAS framework defines critical concepts and improves our ability to describe existing quality improvement systems, (2) the Youth Program Quality Assessment (Youth PQA) can be used to produce a program rating of sufficient reliability for use in a PBAS, and (3) the Palm Beach County PBAS design is an exemplar for expanded learning policies.

General recommendations for PBAS designs include:

  • PBAS design should differentiate roles and link performance measures to incentives targeted at specific management and service delivery roles.
  • PBAS designs should include program ratings for multiple service domains linked to a mix of higher- and lower-stakes incentives.
  • PBAS designs should emphasize participants’ understanding of performance levels and sense of fairness while evolving toward higher-stakes incentives over time.

Detailed recommendations for Weikart Center clients using the Youth Program Quality Intervention and related Program Quality Assessments as the basis for an expanded learning PBAS design include:

  • Recommendations for best practice for each of the seven elements in the PBAS design standard.
  • Detailed description of a composition map for program ratings and performance levels for nine commonly used measures in expanded learning PBAS.
  • A PBAS design exemplar based on the Palm Beach County case, describing specific combinations of four types of incentives (financial, customer review, supervisory review, access to data) with two types of performance levels (high and low) and nine program ratings to achieve an optimal lower-stakes PBAS design with higher-stakes elements.

Measuring Youth Skills in Expanded Learning Systems: Case Study for Reliability and Validity of YDEKC Skill Measures and Technical Guidance for Local Evaluators

Weikart Center Expanded Learning Initiative, Technical Working Paper #4 for the Ready by 21 Project at the Forum for Youth Investment

Summary

  • YDEKC has made great progress toward development of skill measures for expanded learning service providers that serve multiple purposes of community positioning, performance improvement, and proof of program effectiveness. Already YDEKC’s efforts have advanced the field toward the most important questions: What are the important skills of interest for the expanded learning field? How do expanded learning settings cause change in these skills?
  • The current set of YDEKC measures (Table 1) is valuable for positioning in relation to community goals because these measures state the intentions of YDEKC providers. The measures use scales that are reliable (in the sense of internal consistency) but have weak evidence of construct validity because many of the scales and items are highly correlated.
  • An improved set of skill measures (Table 8) can be extracted from the YDEKC skill measures, with stronger evidence of reliability and construct validity and additional evidence of convergent validity. This structure was replicated in important subgroups of the YDEKC sample, including middle school youth, high school youth, and at-risk youth.
  • Additional evidence for convergent validity includes:
    • External measures of program quality are positively associated with youth reports of the program fit for skill building.
    • Youth reports of the program fit for skill building are positively associated with most of the other youth skill measures.
    • Measures related to managing academic work are positively associated with youth reports on school success measures, including grades and attendance in the past month.
  • YDEKC data can be used to create multivariate skill profiles that better reflect the integrated nature of skill learning and demonstration. These profiles indicate that a subgroup of youth in lower skill profiles can be identified and that these youth are spread across nearly all YDEKC partner organizations.
  • Due to within-program heterogeneity of skills, program-level averages should not be used to characterize youth skills. In addition, all measures considered here have substantial negative skew or ceiling effects, which limits their usefulness for designs with multiple time points.
  • We recommend a three-step method that addresses the integrated nature of skill learning as well as the use of youth skill measures that have lower construct validity and ceiling effects: (a) identify dimensionality in the data to best reflect the independent components of an individual’s integrated skill set, (b) use pattern-centered methods to identify distinct profiles or subgroups of individuals defined by similar skill sets, and (c) collect data at multiple time points for youth in the lower skill profiles at baseline. A minimal sketch of steps (a) and (b) follows this list.
  • We carried out a similar set of analyses using data from the YDEKC school survey, finding substantial positive evidence for the reliability and construct validity of these measures (see Appendix D).
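
To make steps (a) and (b) concrete, the sketch below shows one way the dimensionality and profile steps might be carried out. It is illustrative only: the file name, column layout, retained-variance threshold, number of profiles, and the use of principal components with k-means clustering (rather than the specific factor-analytic and pattern-centered routines used in the working paper) are all assumptions.

    # Illustrative sketch of steps (a) and (b); not the working paper's actual analysis.
    # Assumes a CSV of youth skill-scale scores: one row per youth, one column per scale (hypothetical file).
    import pandas as pd
    from sklearn.preprocessing import StandardScaler
    from sklearn.decomposition import PCA
    from sklearn.cluster import KMeans

    skills = pd.read_csv("ydekc_skill_scales.csv")                 # hypothetical file name
    X = StandardScaler().fit_transform(skills.values)              # put all scales on a common metric

    # (a) Identify dimensionality: keep the components that explain most of the variance.
    pca = PCA(n_components=0.80, svd_solver="full")                # retain ~80% of variance (assumed threshold)
    components = pca.fit_transform(X)
    print(f"{pca.n_components_} components retained")

    # (b) Pattern-centered step: group youth into profiles with similar skill patterns.
    kmeans = KMeans(n_clusters=4, n_init=10, random_state=0)       # number of profiles is an assumption
    skills["profile"] = kmeans.fit_predict(components)
    print(skills.groupby("profile").mean().round(2))               # inspect mean scale scores per profile

    # (c) Youth in the lowest-scoring profile would then be re-surveyed at later time points.

Under this kind of approach, profile membership, rather than a single program average, becomes the unit for tracking change over time, which sidesteps some of the ceiling-effect concerns noted above.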

Building Citywide Systems for Quality: A Guide and Case Studies for Afterschool Leaders

Summary

High-quality programming is essential in order for afterschool efforts to generate positive effects for youth. This guide can help those working to create better, more coordinated afterschool programming get started building a quality improvement system (QIS), or further develop existing efforts. A quality improvement system is an intentional effort to raise the quality of afterschool programming in an ongoing, organized fashion.

Components of an Effective Quality Improvement System 

Shared definition of quality: There should be general agreement on what constitutes a high-quality program.

Lead organization: Not having a clear leader can cause confusion. Lead organizations can be stand-alone technical assistance organizations, intermediaries, city agencies, funding entities, or policy/advocacy organizations.

Engaged stakeholders: A QIS is more likely to be effective, sustainable, and scalable if a defined group of organizations is supportive.

Continuous improvement model: Effective models typically include a standard for high-quality performance, an assessment tool, and aligned improvement supports such as planning, coaching, and training.

Information system(s): Quality improvement systems generate data. These data can include administrative audits, tracking of QIS participation and engagement, or data about program attendance and/or child outcomes. The QIS should capture and store such information.

Guidelines and incentives for participation: Guidelines bring coherence and relevance to the system.

Adequate resources: Funding is necessary for sustainability. 

System-building Stages and Tasks

The guide describes a series of tasks organized into three broad stages.

  1. Plan and Engage. Specific tasks include assessing readiness, forming a work group, making the case, engaging stakeholders, identifying a lead organization, defining quality, clarifying purpose, considering information needs, and determining costs and potential resources.
  2. Design and Build. This step takes the process from the conceptual to the practical. Specific tasks include designing the continuous improvement model that programs will experience, developing system-level supports for the model, recruiting pilot sites, and piloting the continuous improvement cycle.
  3. Adjust and Sustain. Specific tasks include refining the continuous improvement model and system supports, building capacity of the lead organization, engaging new programs and sectors, evaluating, and embedding and sustaining the system.

In addition to a QIS Capacity Self-Assessment Tool, the guide includes case studies of efforts by six communities to build quality improvement systems.

Key Takeaways

  • Developing a quality improvement system for afterschool and youth development programs in a community is important, complex work.
  • Although quality improvement systems vary, mature, effective systems share some common components and characteristics. These include a shared definition of quality, engaged stakeholders, and adequate resources.
  • There are three stages for developing a quality improvement system: Plan and engage, design and build, and adjust and sustain. 

The STEM supplement to the Youth Program Quality Assessment

Introduction

As part of the Afterzone Summer Scholars program, content providers delivered curricula from the fields of environmental science, technology, engineering, and mathematics (“STEM”) at 10 sites (one partner delivered the same curriculum at two sites). The offerings were organized at 10 school-based Afterzone sites, and each offering included field work in the local Providence region. Across the 10 sites, STEM curricula were delivered to a total of approximately 250 middle school students (about 25 students per Afterzone section).

In order to evaluate the Afterzone Summer Scholars model and collect information for future improvement, PASA (a) hired an external evaluator for the project; (b) committed to providing continuous improvement supports (quality assessment and coaching) to participating program managers and content providers; and (c) formed an evaluation advisory board to monitor the development and implementation of the external evaluation. In addition, PASA contracted with the David P. Weikart Center for Youth Program Quality (Weikart Center) at the Forum for Youth Investment to develop an observation-based measure of instructional practices to support continuous improvement during STEM programming. This report describes the development of the STEM supplement to the Youth Program Quality Assessment (Youth PQA; HighScope, 2005) and presents preliminary reliability and validity evidence based on data collected during the Afterzone Summer Scholars program.

Continuous Quality Improvement in Afterschool Settings: Impact Findings from the Youth Program Quality Intervention Study

Abstract

Background: Out-of-school time programs can have positive effects on young people’s development; however, programs do not always produce such effects. The quality of instructional practices is logically a key factor, but quality improvement interventions must be understood within a multi-level framework including policy, organization, and point of service if they are to be both effective and scalable.

Purpose: To evaluate the effectiveness of the Youth Program Quality Intervention (YPQI), a data-driven continuous improvement model for afterschool systems. Research questions include:

  • Does the YPQI increase managers’ focus on instruction and the use of continuous improvement practices by site-based teams?
  • Does the YPQI improve the quality of afterschool instruction?
  • Does the YPQI increase staff tenure?
  • Can the YPQI be taken to scale across programs that vary widely in structure, purpose, and funding, using resources available to public agencies and community-based organizations?
  • Will afterschool organizations implement the YPQI under lower-stakes conditions, where compliance with the model is focused on the improvement process rather than attainment of pre-determined quality ratings?

Participants: Eighty-seven afterschool sites in five diverse afterschool networks participated in the study. Each site employed the equivalent of one full-time program manager and between two and ten direct staff; had an average annual enrollment of 216 youth; and had an average daily attendance of 87 youth.

Research Design: The study was a cluster randomized trial. Within each of the five networks, between 17 and 21 sites were randomly assigned to an intervention (N=43) or control group (N=44). Survey data were collected from managers, staff, and youth in all sites at baseline prior to randomization (spring 2006), at the end of the implementation year of the study (spring 2007), and again at the end of the follow-up year (spring 2008). External observers rated instructional practices at baseline and at the end of the implementation year. Implementation data were collected from both intervention and control groups. Hierarchical linear models were used to produce impact estimates.
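
As a rough illustration of the kind of impact model this design implies, the sketch below fits a two-level mixed model (staff nested within sites) with a treatment indicator and a baseline covariate. The file and variable names, and the choice of statsmodels' MixedLM, are assumptions for illustration, not the study's actual specification.

    # Illustrative two-level impact model (staff nested within sites); not the study's actual code.
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("ypqi_staff_outcomes.csv")       # hypothetical file, one row per staff member

    # Outcome: post-test instructional quality; predictors: treatment flag and baseline score.
    model = smf.mixedlm(
        "quality_post ~ treatment + quality_base",    # fixed effects
        data=df,
        groups=df["site_id"],                         # random intercept for each site
    )
    result = model.fit()
    print(result.summary())                           # the 'treatment' coefficient is the impact estimate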

Findings: The impacts of the YPQI on the central outcome variables were positive and statistically significant. The YPQI produced gains in continuous improvement practices with effect sizes of .98 for managers and .52 for staff. The YPQI improved the quality of staff instructional practices, with an effect size of .55. Higher implementation of continuous improvement practices was associated with higher levels of instructional quality, with effects nearly three times greater than the overall experimental impact. Level of implementation was sustained in intervention group sites in the follow-up year.
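
For readers interpreting these magnitudes: effect sizes of this kind are typically standardized differences between intervention and control group means. Assuming the conventional standardized mean difference form (the report may use a model-adjusted variant), the quantity is

    d = (mean of intervention group - mean of control group) / pooled standard deviation

so, under that assumption, an effect size of .55 indicates that the intervention group's average instructional quality was a little over half a standard deviation above the control group's.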

Conclusions: This study demonstrates that a sequence of continuous improvement practices implemented by a site-based team – standardized assessment of instruction, planning for improvement, coaching from a site manager, and training for specific instructional methods – improves the quality of instruction available to children and youth. The YPQI produces a cascade of positive effects beginning with the provision of standards, training, and technical assistance, flowing through managers and staff implementation of continuous improvement practices, and resulting in effects on staff instructional practices. Evidence also suggests that participation in the YPQI may increase the length of staff tenure and that YPQI impacts are both sustainable and scalable.

Quality at the Point of Service: Profiles of Practice in After-School Settings

Abstract

A unique observational data set was used to explore quality at the point of service in after-school programs. Staff practices in after-school settings were represented on a series of unidimensional scales closely indexed to staff behavior. In order to account for heterogeneity of staff performances, pattern-centered methods were used to construct profiles of common staff practices. Results revealed six pedagogy profiles that were classified in terms of three broad types of performances delivered by after-school staff: (1) positive youth development, (2) staff-centered, and (3) low-quality. Staff membership in these profiles was not related to youth-staff ratio. However, results revealed significant differences between the profiles on the content of the offering and the age of youth in the setting.

Linking after-school instructional practices to youth engagement: A pattern-centered approach

Youth participation in after-school settings has been linked to numerous positive outcomes, but the link is inconsistent across studies. Two necessary ingredients for this link are proposed: appropriate instructional practices and youth engagement. Both are profiled using cluster analysis in a dataset including observations of staff instructional practices in 151 youth program offerings and 1,176 surveys from youth attending these offerings. Instructional practice profiles suggest patterns corresponding to positive youth development (PYD), staff-centered (SC), and low-quality instruction. Youth engagement profiles range from low to medium to high, and further vary across perceived learning and voice. Cross-tabulations reveal strong positive links between PYD and high engagement and between low-quality and low engagement; and strong negative links between PYD and low engagement and between low-quality and high engagement.
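
As a sketch of the cross-tabulation step described above, the snippet below tabulates practice profiles against engagement profiles and applies a chi-square test of independence. The data frame and its values are invented for illustration and are not the study's data.

    # Illustrative cross-tabulation of practice profiles by engagement profiles (invented data).
    import pandas as pd
    from scipy.stats import chi2_contingency

    offerings = pd.DataFrame({
        "practice_profile":   ["PYD", "PYD", "staff-centered", "low-quality", "PYD", "low-quality"],
        "engagement_profile": ["high", "high", "medium", "low", "medium", "low"],
    })

    table = pd.crosstab(offerings["practice_profile"], offerings["engagement_profile"])
    print(table)

    chi2, p, dof, expected = chi2_contingency(table)   # tests whether the two profile sets are independent
    print(f"chi-square = {chi2:.2f}, p = {p:.3f}")

In the study itself, the cells of interest are the over-represented pairings (PYD with high engagement, low-quality with low engagement) and the under-represented ones (PYD with low engagement, low-quality with high engagement).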