In youth development work, as in many social service arenas, there has been a historical tension between standardization and innovation. On one hand, both service providers and funders want to increase the reach, or scale, of successful programs so that as many participants as possible can reap the benefits. On the other hand, as programs are scaled and ever larger systems are created, individual service providers often lose the freedom to innovate and respond to the unique needs of their clients or communities.
In the City of Philadelphia, the Department of Human Services (DHS) and Public Health Management Corporation (PHMC) use standardization in a way that not only permits, but promotes, innovation, resolving a long-standing tension in the field. By developing a shared vision of quality and a uniform set of outcomes, while at the same time offering a fundamentally flexible methodology, DHS and PHMC equip youth service providers to attain a high standard of quality while still responding to local needs and emerging best practices. Today, the programs that are part of the Out of School Time System funded by DHS (“the OST System” or “the System”) continue to reinvent themselves and their programming while still working as part of the larger community of the System.
A Unified System
In 1999, the City of Philadelphia created an early version of what would evolve into the current OST System. Funded by city, state, and foundation dollars, the System offered a unified structure of financial support to many of the agencies that provided care to school-age young people during the afterschool and summer hours.
These programs varied widely, operating in a range of locations, including public schools, churches, parochial schools and community centers. And they were operated by providers that varied in size, from large social service agencies that offered a comprehensive range of services, to small, neighborhood-based non-profits with a narrower mission.
The OST providers in this System also varied in their approach to afterschool programming and the outcomes they hoped to achieve. For some, the primary mission of afterschool programming was prevention, keeping children safe during hours when they might otherwise be left unattended, while for others the mission was academic, offering homework assistance and reinforcing school day content. Other providers used afterschool programming to supplement or augment the school day, offering music, art, sports, job training and other activities that may have been overlooked during an already crowded school day.
A Vision of Quality
The diversity in the System presented a tremendous challenge: how to promote a high standard of quality across the System, while preserving the agencies’ ability to innovate and respond to the needs of the neighborhoods they served. By uniting the System around a shared vision of program quality and purpose, the City hoped to begin addressing this tension between standards and innovation. And by addressing and resolving this tension, the City hoped to create a scalable, replicable model of OST programming.
In 2000, the City of Philadelphia, in partnership with a number of stakeholders, developed the Core Standards for Philadelphia’s Youth Development Programs (the Core Standards). This document identified baseline expectations for program quality in a number of key areas, including human relationships and program activities and implementation. These areas of quality would also inform the quality assessment tools that would later be created to observe and measure program performance. Ultimately, the Core Standards (not to be confused with the Common Core Standards of the school-day context) were a first step toward a unified OST system.
Over time, the City sought to provide even more guidance to the OST System. Although the Core Standards provided a common language around and understanding of program quality, that was only one piece of a larger puzzle.
The Core Standards were originally designed to address all youth programs and thus did not explore quality in the specific context of the programs in the System. For example, engaging activities or classroom behavioral expectations would understandably look different in a library-based tutoring program than in a sports program housed at a Parks and Recreation facility, or in a City-funded OST program located in a public school or community center.
More fundamentally, the OST System still sought guidance around the question of methodology. The City, having outlined a vision of quality to the System, now wanted to help programs explore best practices that would empower them to attain, and maintain, this level of quality.
A Unified Approach
In 2009, the City of Philadelphia, through its administrative entity, PHMC, introduced Project-Based Learning (PBL) to the OST System. Project-Based Learning is a teaching method that emphasizes inquiry and real-world problem-solving. By rooting projects in students’ interests, PBL taps the natural curiosity of learners. And by teaching concepts through activities that emphasize hands-on experiences, PBL breathes life into academic concepts beyond the pages of a textbook or the questions on a worksheet. PBL was chosen for its flexibility. Rather than providing a prescribed curriculum, Project-Based Learning provides a methodology, a way of thinking about OST programming that could inform each afterschool provider’s existing practices and activities. At once rigorous enough to promote high quality OST programming and flexible enough to meet the needs of a variety of agencies serving a diverse population, PBL helped resolve the challenge of preserving room for innovation in a standardized system.
Each PBL project starts with a Driving Question designed to pique curiosity and prompt inquiry. A Driving Question should be open-ended, because it must sustain inquiry over the course of a five- or six-week project. Project activities flow naturally from the Driving Question, as students explore the content areas and build the skills they will need to form an answer. At the end of a project, students prepare a Culminating Event or Culminating Product that allows them to showcase their work and, quite literally, answer the Driving Question. Ultimately, the appeal of Project-Based Learning lies in this open-endedness and youth-driven process. OST providers could create projects that asked any question they wanted, and could create whatever activities best explored that question and engaged their students.
Although the PBL model retained its flexibility when introduced to the OST System, it was still important to provide clarity and a standardized approach. Most importantly, the PBL model gave providers a structure for organizing what was once primarily a loose collection of activities. To that end, OST providers were expected to complete a minimum number of projects in a given year. Additionally, PHMC, with the help of the Buck Institute for Education, created a Project Planning Form, Task List, Rubric and Debriefing Form that would allow providers to thoroughly document the projects they completed.
Despite the standardization and more consistent documentation process, OST providers retained complete control over project content. Programs were encouraged to make efforts to solicit youth input and incorporate youth interest into the project design. Additionally, many programs worked with teachers to create projects that reinforced school day content.
In many ways, the introduction of PBL to the OST System has been a great success. In the five years since its introduction, hundreds of OST programs have completed thousands of projects on a variety of topics. Students have made movies, designed their own clothing lines, built solar-powered ovens and fundraised for and supported causes that were important to them.
Perhaps most importantly, the City of Philadelphia and PHMC, by introducing PBL to the OST System, created uniform, System-wide expectations about the purpose of OST programming. In the OST System, it was now clear that providers were expected to do more than merely offer students a safe space to go after school. Through PBL, OST providers could reinforce school day content, promote 21st Century Skills like critical thinking and collaboration, and offer planned, purposeful activities that were engaging, hands-on, and fun. And this approach was scalable, positioning the City to advocate for an ever-larger OST system.
III. The Challenge: Ensuring Quality While Preserving Flexibility
Although the introduction of PBL was generally successful, the OST System still faced some challenges. The appeal of PBL in an OST setting is its flexibility. The System’s diverse group of OST providers used projects to teach a variety of skills and content areas. So, although the City had provided clear expectations around program quality and provided a methodology that would help programs pursue and attain that level of quality, the impact on participants attending the programs was difficult to measure and report. The learning approach used in the OST programs was now unified, but what students were gaining by attending the programs still varied dramatically from agency to agency.
The OST providers were individually working to develop academic skills, life skills, job readiness skills, social-emotional competencies and positive community connections, not to mention increasing school engagement, graduation and promotion rates, college attendance, and a host of other outcomes. With providers working on so many issues at the same time, it became difficult to measure the impact of any given OST program, and also difficult to measure the impact of the System as a whole. The City and PHMC continued to take steps to rigorously assess, and subsequently support, program quality. However, it was important to view program quality as a means to achieving real, measurable impact on the lives of the youth who attended the programs, and the OST System still needed the tools to effectively capture and evaluate this impact.
Moreover, with such a diversity of outcomes, it was not only difficult to measure the impact of the OST System on Philadelphia’s youth; it was subsequently difficult to communicate this impact to stakeholders. Publicly-funded agencies have an obligation to maximize the value of their services and use the best possible interventions to achieve the greatest possible impact on the lives of the constituents they serve. And in a competitive funding environment, government agencies must also be able to capture and communicate this impact – essentially, to prove their worth.
As the City agency seeking to create a scalable, replicable model of OST programming in order to grow and thrive, DHS recognized that the system needed clearly stated outcomes to accompany the existing quality and methodological standards. Individual community service organizations would also benefit from this move toward centralized outcomes.
One of the benefits of any unified OST system is that government entities and their intermediaries have greater capacity to collect and analyze data, provide professional development, and communicate challenges and successes to decision-makers than many smaller agencies. And by articulating and evaluating outcomes system-wide, a unified OST system can also avoid needless duplication of efforts. A system that measures its own outcomes and advocates for itself can also relieve individual youth service agencies of some of the burden of continued fundraising. In short, measuring collective impact on a System-wide scale has been more efficient, more accurate and more beneficial to the providers in the OST System than an agency-by-agency approach.
IV. The Solution: Implementing Outcomes While Adding Flexibility to Existing Methodologies
At its simplest, an outcome is “something that happens as a result of an activity or process.” Youth attend programming, and if the program works the way it should, some measurable impact on the life of that young person occurs. At both the agency level and the System level, the importance of outcome-driven OST programming was never in doubt. A clear set of outcomes gives purpose and direction to OST program planning and implementation, in addition to the other benefits stated previously.
In the OST System, individual OST programs had outcomes in mind – a mission, goal, or desired impact – long before the System-wide approach to outcomes was implemented. However, the extent to which programs reflected on, articulated, and measured these outcomes varied considerably, as did the outcomes themselves. Therefore, DHS committed itself to bringing outcomes to the forefront, and to creating a shared understanding of the goals and desired outcomes for the OST System that all providers could hold in common.
In the fall of 2012, DHS began convening representatives from the more than 70 provider agencies to discuss coordinated youth outcomes for the first time. DHS held Executive Director (ED) meetings, which brought together executive-level staff, often beyond just an agency’s ED, and Site Director meetings, which helped DHS gain a more “grassroots” perspective. On the whole, these meetings were well attended by a broad spectrum of providers. In addition to the provider forums, other key players, such as PHMC and the School District of Philadelphia, contributed to the conversation about outcomes.
The discussion focused on the common outcomes providers were already striving for, the priorities of DHS and its partners, and the existing literature on OST outcomes. Additionally, these meetings gave providers in the System a chance to share their thoughts about the feasibility of standardized outcomes in such a diverse System. For example, some OST programs (particularly those with federal 21st Century Community Learning Centers funding) were more comfortable with academic outcomes linked to grades and state standardized test scores than providers who had traditionally focused on the social-emotional or life skill aspects of youth development.
By the spring of 2013, DHS had identified three main goal areas for the project:
- Participating youth will have academic success
- Participating youth will develop positive life skills
- Participating youth will be ready for college and/or career
These goal areas were further articulated as concrete, measurable outcomes. To continue providing flexibility to the System while demanding uniformly high standards, DHS allowed providers to use a number of indicators to demonstrate success in any given outcome area. The six outcomes were:
- Increased Engagement in Learning, indicated by curiosity, persistence in learning, and the ability to think critically and problem solve.
- Increased Engagement in School, indicated by school day attendance, and participation in school programs and extracurricular opportunities.
- Aspires to Educational Excellence, indicated by strong academic performance, as reflected by grades, test scores, and other measures of academic excellence.
- Improved Life Skills, indicated by growth and development in 21st Century Skill areas like communication, collaboration and goal setting, as well as increased independence, responsibility and self-reliance.
- Improved Relationships, indicated by the presence of supportive, meaningful connections between adults and youth, as well as strong peer-to-peer relationships.
- Prepared for Higher Education and Employment, indicated by an increased awareness of college, career, and other post-secondary options.
Although DHS introduced a unified set of outcomes to the OST System, preserving flexibility remained a high priority. A flexible System with high standards of quality and expected outcomes promotes innovation, as providers within the System work to maximize the impact of their work. While OST providers generally recognized the need to measure System-level impact, some providers expressed concern that the ability to adapt to specific challenges faced by their agency and in their community would be lost with a standardized set of outcomes.
To balance the move toward standardization that System-wide outcomes represented, DHS simultaneously provided more flexibility by introducing new methodological approaches to the System. Together, these methodologies would be called the Structured Activities approach. Under this new Structured Activities approach, programs could now implement Project-Based Learning, Service Learning or Experiential Learning programming.
These methodologies shared a number of qualities. They were each project-based, in the sense that each utilized longer-term projects (3-10 weeks, with project length usually increasing for older students) that required planning and purpose to implement. Additionally, each offered opportunities for academic and 21st Century Skill building by incorporating a high degree of rigor, while still emphasizing hands-on, engaging activities.
Ultimately, the Structured Activities approach offered providers in the OST System a greater range of methodological approaches to achieve their desired outcomes. Some providers with a niche focus, who offered specific job training or a targeted, narrower range of program activities, found the Experiential Learning model more conducive to their approach to programming. Other providers, who emphasized community engagement and connection, used the Service Learning model to give greater purpose and intentionality to existing service projects and activities.
By introducing the Structured Activities approach in concert with clearly identified outcomes, DHS and PHMC continued to address the challenge of standardizing a system while offering service providers the freedom to innovate and explore new ideas.
Although identifying desired outcomes at the System-wide level was an important step for the OST System, this did not guarantee that a provider, or the System as a whole, would achieve them. An implementation plan, buttressed by a system of supports for this implementation, was needed to ensure success for the System, the programs, and ultimately the youth.
Metz, Goldsmith, and Arbreton (2008) identify six elements, what they call “guiding principles,” of program quality that are necessary to achieve positive youth outcomes: a focused and intentional strategy, exposure, supportive relationships, family engagement, cultural competence, and continuous program improvement. Of these six elements, five had been formalized in the OST System prior to creating a shared set of youth outcomes.
The Core Standards provided guidelines for positive human relationships, family engagement, and cultural competence. Adherence to these guidelines was observed and measured through site visits completed by PHMC. Meanwhile, the Structured Activities approach provided the focused and intentional strategy to the OST System. Finally, the provider contract and Scope of Work, which set minimum standards for operating days and hours, and compensated providers based, in part, on students’ rates of attendance, ensured that youth received sufficient exposure to programming.
However, the sixth guiding principle, Continuous Quality Improvement (CQI), had been implemented in only an informal way. PHMC, in its role as administrative entity, had long provided professional development and coaching to providers in the OST System. However, with the introduction of shared, System-wide outcomes, it was now possible to codify this informal system of supports into a standardized process.
Supporting Outcomes with Continuous Quality Improvement
Many scholars have concluded that CQI is an essential domain of program quality, including Palmer, Anderson, and Sabatelli (2009) in their review of eight different program quality frameworks. Youth outcomes cannot be achieved without quality programming, and quality programming requires a regular, systematic process of reflection, goal setting, and program enhancements. To be truly effective, DHS and PHMC needed to systematize the support that was already offered informally.
After discussion and consultation with experts at the National Institute on Out-of-School Time (NIOST) at Wellesley College, the OST System adopted the NIOST Continuous Quality Improvement (CQI) cycle as a framework for program support. The NIOST model provided a useful starting point, and DHS and PHMC focused it by identifying or creating the specific tools, timeline, data collection protocols, and process needed to implement it in an OST System that now consisted of 200 programs.
At the beginning of the 2013-2014 school year, OST providers completed a self-assessment of quality, which informed a provider-driven process of goal-setting, supported by PHMC Program Specialists. When preparing for the self-assessment and goal-setting process, DHS and PHMC recognized that any effort to measure and improve program quality would require reliable tools for evaluation. Additionally, OST providers and PHMC Program Specialists would need to be trained to use these tools effectively.
Most providers in the OST System used a self-assessment tool recently developed by the Pennsylvania Statewide Afterschool Youth Development Network (PSAYDN) to help youth programs reflect on their program quality. Although not in wide use in Philadelphia at the time, the PSAYDN tool had been used elsewhere in the state and measures areas of program quality that are found in the Core Standards and familiar to providers in the OST System. However, since the tool was new to most providers in the System, PHMC provided a series of three in-person trainings on the PSAYDN Program Quality Self-Assessment, as well as an online training that providers could take or share with their teams.
Those providers not using the PSAYDN assessment participated in a citywide capacity-building initiative, funded by a three-year grant from the Wallace Foundation. As a part of this initiative, providers used the Assessment of Program Practices Tool (APT-O), a self-assessment of program quality, developed by NIOST. These providers attended a two-day workshop about the APT-O tool, and its implementation, and received continued support from DHS, PHMC and NIOST. Ultimately, DHS and PHMC made clear to providers that the specific tools used (for self-assessment, for external observation, and for capturing youth change) were less important than the larger commitment to a Continuous Quality Improvement process that would necessarily begin with both an internal and external assessment of quality.
Once the self-assessments were completed, PHMC Program Specialists met with each program individually to review the self-assessment results, explore the program’s strengths and challenges in depth, and set goals. For each goal, providers and Program Specialists also identified concrete action steps that would be implemented over the course of the year. Although the goal-setting process was time consuming, it was essential for developing buy-in from site-level staff, further engaging the System in its shared outcomes and in the CQI process. Additionally, this one-on-one process of goal-setting and support allowed Program Specialists to address each agency’s specific challenges, and preserved flexibility within the larger, standardized OST System.
In the winter and spring, PHMC’s Program Specialists conducted their own site visit assessments. In the past, these external observations were the only measure of quality used by the OST System. In this new CQI model, external observations would be considered together with internal, self-assessment data to paint a more complete picture of the quality of a particular program. To complete these observations, PHMC used its own Site Visit Assessment Tool, which has been in use, in various forms, since 2008. The Site Visit Assessment Tool had previously been aligned with the Core Standards and the universal Project-Based Learning approach, but for the 2013-2014 school year, a few additional modifications were made to ensure consistency with the language of the youth outcomes and to reflect the broader Structured Activities options, which now included Service Learning and Experiential Learning in addition to Project-Based Learning.
Throughout the year, data on program activities, youth participation, and youth skill development were collected and analyzed to inform individual program quality improvements and to provide updates on the status of the entire System. For example, youth attendance data, in the form of arrival and departure times, were entered into a web-based system called PCAPS. This information, which was already being collected for billing purposes, could now be used in conjunction with observer and self-assessment data to see if youth were attending consistently enough to make outcome attainment probable.
Outcomes attainment was measured not only at the level of program quality, but also by evaluating the skill demonstration of individual students. To do this, all providers used the 21st Century Skills Rubric, a youth evaluation tool based on a PBL rubric originally developed by PHMC and the Buck Institute for Education. The 21st Century Skills Rubric provides a framework for evaluating individual youth across the following five competency areas: Goal-Setting; Collaboration; Critical Thinking and Problem Solving; Communication; and Active Learning and Engagement. The providers in the Wallace-funded initiative also completed the Survey of Academic and Youth Outcomes (SAYO).
Ultimately, through the CQI process, providers in the OST System completed an internal evaluation of program quality (the PSAYDN self-assessment or APT-O self-assessment), an internal evaluation of student performance and outcome attainment (the 21st Century Skills Rubric and the SAYO-S tool), a self-assessment review and goal-setting meeting, as well as an external evaluation of program quality (the PHMC Site Visit Assessment). All of this information, coupled with attendance information and other data, provided a robust, nuanced picture of program performance on which goals could be based.
Supporting the Continuous Quality Improvement (CQI) process
As the 2013-2014 school year approached, this new emphasis on outcomes and quality raised concerns in the provider community about the availability of coaching and other supports. OST providers participated in the outcomes identification process with enthusiasm, but also emphasized the importance of clear expectations around outcomes attainment, and a robust system of supports for providers once the outcomes had been identified and implemented. OST providers had a number of questions: Was each program required to achieve all six outcomes? Would failure to meet DHS-developed benchmarks affect funding in the current year or in subsequent years? Were providers expected to implement this outcome-focused approach to programming immediately? What support, professional development and coaching would providers receive to achieve these outcomes?
To answer these questions and support those providers who were unfamiliar, or even uncomfortable, with the self-assessment and CQI process, DHS and PHMC offered a wide range of supports. PHMC delivered a series of workshops about the CQI process. As previously mentioned, a two-day workshop was specifically designed to expose programs in the Wallace-funded initiative to the SAYO and APT-O tools.
In October 2013, following the PHMC-hosted trainings and initial self-assessment submission, Program Specialists began meeting with providers individually. For some providers, the process of self-assessment was new and occasionally uncomfortable. The Program Specialists were especially helpful and essential to the success of CQI implementation. Program Specialists used their relationships with program staff to build insight, help prioritize goals, create action steps, and ultimately, focus staff on the big picture goal: youth success.
Program Specialists continued to provide support after a site had developed its goals and action steps. PHMC and other DHS professional development partners offered workshops and other professional development sessions that addressed common goal areas, such as recruitment and retention, designing strong projects, improving literacy, college and career awareness, and classroom management. For unique program goals, Program Specialists provided individual, targeted coaching. OST providers were also encouraged to take on program improvements independently whenever possible, to build their own capacity. By late spring, it had become clear that a large number of programs had made meaningful programmatic changes that have improved, and will continue to improve, the lives of youth.
The extent of the collective youth impact is still being measured. DHS and PHMC began collecting the 21st Century Skills Rubrics for individual youth in the winter of 2014, but several more cycles are necessary before a true baseline can be established or progress can be determined using that instrument.
While DHS and PHMC continue to collect and analyze data based on Program Specialists’ observations and provider self-assessments, it is already clear that DHS and PHMC have made tremendous strides in creating a data-driven, outcome-focused System built on best practices and focused on youth impact. What was once a loose assortment of youth development programs has now been aligned under a shared vision of quality, a shared methodology, and a shared set of outcomes, with substantial support to improve program quality and build agency capacity. All of these components have been provided in a way that not only improves quality and raises standards, but also continues to foster creativity and innovation rather than stifling them.
The OST System has incorporated innovation as an essential principle of its program design. The System is scalable, and can grow to meet the needs of a City that has chosen to invest significant resources in youth development, while still ensuring a high standard of quality. DHS has provided a comprehensive latticework of provider supports and System partners, all devoted to equipping OST providers to meet the high demands of an outcomes-based approach. Ultimately, by leveraging the size and reach of the OST System, DHS and PHMC have demonstrated not only that a standardized, scalable OST system can preserve innovation, but also that such a system is only possible with the full resources of a unified system.