What Exactly is Compassionate Evaluation?

Compassion has a lot of definitions, but most have to do with recognition of suffering, action to alleviate suffering, and tolerance of discomfort during the action.[i] By April of 2020, we knew that our afterschool partners in Genesee County, Michigan (including the city of Flint), and many of the children and families they served, were suffering. A significantly higher proportion of those families than usual were in crisis mode. For afterschool educators, the learning environment had moved, and the means of delivering programs had changed dramatically. A “pivot” was required.

When our partners told us how evaluation could help, they emphasized a compassionate approach to the work that would address suffering in multiple ways: by reducing workload related to evaluation, by providing an evaluation design that was of timely value in the current moment of challenge, and by wherever possible reflecting back to staff their own incredible commitment and ingenuity in meeting those challenges.

We translated this desire for an experience of compassion into a few rules about method:

Rule 1 was make it quick. We knew that staff were in crisis mode and that time was precious. We eliminated all data collection responsibilities for staff. Staff had only to schedule dates for observers, sit for a 45-minute interview, review the report during a 70-minute training, and then review the report again during a subsequent 15-minute portion of an all-staff meeting. This meant less than 4 hours of total time for a site coordinator engaging in required evaluation activities between September and December 2021.

Rule 2 was prioritize local expertise. When program practices and objectives are changing rapidly (the pivot mentioned above), prior evaluation designs (including measures) are of reduced validity. This was true simply because it was no longer the same service – models varied widely both within and across programs.[ii] We asked open-ended questions about what was and wasn’t working and then coded text segments to existing program standards for program fidelity, instructional quality, and students’ socio-emotional skills. In this way, we identified standards that were applicable in the new situation, named new priorities in those terms, and respected site managers as expert sources of data about what works.
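As an illustrative sketch of this coding step (the standard codes, descriptions, and interview excerpts below are hypothetical, not drawn from the actual coding frame), tallying coded text segments against a set of program standards might look like:

```python
from collections import Counter

# Hypothetical coding frame: standard code -> description.
STANDARDS = {
    "FID-1": "Program fidelity: consistent session schedule",
    "IQ-2": "Instructional quality: active engagement",
    "SEL-3": "Socio-emotional skills: emotion management",
}

# Each coded segment pairs an interview excerpt with the standard(s)
# an analyst assigned to it during coding.
coded_segments = [
    ("We kept the same meeting time every day on Zoom", ["FID-1"]),
    ("Kids led their own show-and-tell projects", ["IQ-2", "SEL-3"]),
    ("Staff checked in one-on-one about feelings", ["SEL-3"]),
]

def tally_standards(segments):
    """Count how many coded segments touch each standard."""
    counts = Counter()
    for _text, codes in segments:
        counts.update(codes)
    return counts

for code, n in tally_standards(coded_segments).most_common():
    print(f"{code}: {n} segment(s)")
```

The tally makes visible which existing standards remained applicable in the pivoted service, in the site managers' own words.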

Rule 3 was ask about what is changing (and reflect strengths). Afterschool staff told us they felt as though the pandemic had made their professional tools outdated overnight, and it was not a good feeling. We spent our moments of access to leaders and site managers asking how it was going, letting them give voice to however it was going by taking the conversation wherever it went, and then intentionally reflecting strengths back to them. Although this therapeutic aspect of our service may feel a bit uncomfortable to some evaluators, the situation required it as an aspect of method. As evaluators, we were “giving value to” leaders’ and site managers’ experiences by letting ideas and emotions flow while they answered our questions.

Rule 4 was write it down. By asking staff about practices and coding their transcribed responses into categories, we were identifying sentences written by program staff that describe specific local best practices in specific local terms. By identifying and writing down local best practices in the words of program staff, the evaluator helps speed up the development of shared mental models about what the pivoted service is. This helps service providers demonstrate accountability in the sense of “this is actually what we did every day.” It also makes it possible for leaders to pivot the service more easily in the future by returning to documentation for crisis- or emergency-management.


[i] Strauss, C., Lever Taylor, B., Gu, J., Kuyken, W., Baer, R., Jones, F., & Cavanagh, K. (2016). What is compassion and how can we measure it? A review of definitions and measures. Clinical Psychology Review, 47, 15–27. https://doi.org/10.1016/j.cpr.2016.05.004

[ii] Roy, L., & Smith, C. (2021). YouthQuest Interim Evaluation Report [Grantee evaluation report]. QTurn; Smith, C., & Roy, L. (2021). Best Practices for Afterschool Learning at a Distance: GISD Bridges to Success [Grantee evaluation report]. QTurn.

Continuous Quality Improvement and Evaluation in 2020: A Plan for 21st Century Community Learning Centers

During times of crisis, when programs are under tremendous pressure, evaluation and assessment can be challenging. Programs enter triage mode, putting their limited time and energy into the most urgent tasks. This heightens the need for evaluation that reduces strain and improves capacity. When the conditions that created the crisis are long-lasting, like the coronavirus pandemic, it becomes necessary to revisit and restore vital activities that may have been moved to the back burner, but doing this successfully often requires intelligent redesign. How can the same needs be met in a new way? How can evaluation and assessment be adapted to succeed in challenging conditions?

QTurn has developed a comprehensive evaluation plan for afterschool programs at a moment when redesign of the service and delivery of the redesigned service are happening at the same time. This plan, the “Afterschool Evaluation Plan 2020,” was developed to address the unique needs of programs in the 2020-2021 school year – and to support compliance with the specific requirements for the 21st Century Community Learning Centers. An evaluation plan for 2020 must remove burdens rather than add to them and make life easier, not harder. Aware of these needs, QTurn’s design includes short, validated assessment tools, guidance available through online trainings, and a reassuring, therapeutic approach referred to as the “lower stakes” model.

A Lower Stakes Approach

The lower stakes model is a core component of QTurn’s work. Lower stakes means that the results of assessments are used to support and inform, and program staff are able to interpret the meaning of their own individual and group performance data. In a lower stakes model, the results of assessments do not influence funding or prompt sanctions. Instead, low assessment scores are opportunities for mutual learning, support, and growth.

While this approach has been integral to the work of QTurn’s founder, Charles Smith, for decades, lower stakes is especially critical in the 2020-2021 school year as program staff strive to adapt to a new normal. The AEP 2020 is intentionally designed to alleviate stress and confusion and help staff adapt to rapid change and achieve shared meaning.

User-Friendly Assessment Tools

The Afterschool Evaluation Plan 2020 includes three assessment tools designed for remote or in-person programming (or a combination of both). The first measures fidelity to best practices at the management level. The second captures quality at the point of service (and applies to home learning environments). The third charts the growth of social and emotional learning (SEL) skills among youth. The tools are short and easy to use. Designed to work together as part of the same continuous quality improvement cycle, they also support impact evaluation.

Management Practices Self-Assessment (MPSA).  The MPSA was developed with extensive input from 21st CCLC program project directors and aligns with the core requirements for 21st CCLC programs in Michigan. With 24 indicators forming eight standards in four domains, the tool requires less than two hours for program managers to complete.

Guidance for Out-of-School Time Learning at a Distance (GOLD) Self-Assessment.  Site managers and staff complete a self-assessment produced with extensive input from expert 21st CCLC site managers. GOLD contains 27 indicators that form 11 standards in four domains. The four domains represent point-of-service quality in the individual learning environment.

Adult Rating of Youth Behavior (ARYB).  Each child is rated on the ARYB in November and April. By completing the assessment at two time points (earlier in the school year, then again toward the end of the year), the ARYB is able to capture growth in social and emotional skills across the school year. The ARYB has 30 items that form six skill domains, including emotion management, teamwork, and responsibility.
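To make the two-time-point growth design concrete, here is a minimal sketch of pre/post domain scoring; the item-to-domain mapping, the rating scale, and the example ratings are all hypothetical illustrations, not the actual ARYB structure:

```python
# Hypothetical structure: items rated 1-5; each skill domain is
# scored as the mean of its items.
DOMAIN_ITEMS = {
    "emotion_management": ["item_01", "item_02"],
    "teamwork": ["item_03", "item_04"],
    "responsibility": ["item_05", "item_06"],
}

def domain_scores(ratings):
    """Average item ratings within each skill domain."""
    return {
        domain: sum(ratings[item] for item in items) / len(items)
        for domain, items in DOMAIN_ITEMS.items()
    }

def growth(fall, spring):
    """Spring-minus-fall change in each domain score for one child."""
    pre, post = domain_scores(fall), domain_scores(spring)
    return {d: round(post[d] - pre[d], 2) for d in pre}

# Example ratings for one child at the two time points.
fall = {"item_01": 2, "item_02": 3, "item_03": 3,
        "item_04": 2, "item_05": 3, "item_06": 3}
spring = {"item_01": 4, "item_02": 4, "item_03": 4,
          "item_04": 3, "item_05": 3, "item_06": 4}

print(growth(fall, spring))
```

Scoring each child the same way at both time points is what lets the November-to-April difference be read as growth rather than a one-off snapshot.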

Additional Features:

Cost-Effective.  The Afterschool Evaluation Plan (AEP) 2020 can be adapted to a wide range of cost structures. The Guidebooks and scoring forms are available for free to all users. Additionally, the demands on time are low. Completing the assessments can require as little as 2 hours for program directors and 3 hours for site managers and staff per year.

Guidance Through Online Training.  QTurn will offer live online trainings covering the use of the MPSA, GOLD, and ARYB. Support also includes online trainings that equip leaders and staff to do data-informed planning.

Emphasis on School-Day Alignment.  QTurn’s AEP 2020 helps programs pivot toward greater integration with schools during a time when school has become more challenging for many children.

Support for Impact Evaluation.  Finally, data obtained using the assessment tools can be used to evaluate the overall impact of programs, particularly across multiple programs.

To adopt the AEP 2020, begin by downloading the assessment tools and resources.

For support with implementation of the AEP 2020, please contact the QTurn Team.