Accepted for Publication in Health Promotion Practice on 5/9/2011

Developmental evaluation: Building innovations in complex environments

RUNNING HEAD: DEVELOPMENTAL EVALUATION

EVALUATION AND PRACTICE

Michael C. Fagen, PhD, MPH¹, Sarah Davis Redman, MPAff, Jonathan Stacks, MSW, Vivian Barrett, Ben Thullen, MSW, Sunyata Altenor, and Brad Neiger, PhD, CHES

¹ Corresponding Author
University of Illinois at Chicago, School of Public Health, Department of Community Health Sciences
1603 W. Taylor St., M/C 923, Chicago, IL
E-mail: mfagen1@uic.edu

Keywords: Evaluation Design (HPP); Sexual Health (HPP); School Health (HPP); Health Education (HPP); Developmental Evaluation; Advocacy Evaluation

Word Count (excluding references, tables, and figures): 2282

Funding Acknowledgement: The evaluation of the Illinois Caucus for Adolescent Health's School Board Sexuality Education Policy Change Project was funded by the Ford Foundation (Grant Number ).

************** Do not distribute without authors' express consent **************

INTRODUCTION

Health promotion practitioners typically use two broad approaches to program evaluation: formative and summative. Formative evaluation improves program design and delivery; summative evaluation measures program outcomes (Rossi, Lipsey, & Freeman, 2004). Ideally, these two evaluative approaches are connected in a cyclical manner that inspires constant program refinement and improvement, with the ultimate goal of promoting health.

In practice, summative evaluations are often conducted in isolation from formative evaluations. Such summative evaluations judge programs on their merit and worth. When results do not show sufficient outcomes achievement, funding and personnel cuts can ensue. Premature summative evaluations may thus suppress innovative health promotion approaches. Innovation requires program strategies that are unique, flexible, and responsive to complex environments. While stakeholders will likely want some form of summative evaluation, quickly moving to outcomes-oriented assessment can be a poor fit for innovative programs.

Health promotion practice needs an evaluation approach that nurtures innovative program development, which is where developmental evaluation becomes relevant. Developmental evaluation complements traditional forms of evaluation: program outcomes are measured only after complex programmatic and contextual issues have been addressed. As such, developmental evaluation may be particularly relevant when used as a pre-formative evaluation approach. This article starts by describing the emerging developmental evaluation approach. It then provides a developmental evaluation example from an effort to change school board policy on sexuality education.

The article finishes by discussing when and how developmental evaluation might be useful for health promotion practitioners.

DEVELOPMENTAL EVALUATION

Michael Quinn Patton (1994) first defined the term developmental evaluation over fifteen years ago as:

"evaluation processes and activities that support program, project, product, personnel and/or organizational development (usually the latter). The evaluator is part of a team whose members collaborate to conceptualize, design, and test new approaches in a long-term, on-going process of continuous improvement, adaptation, and intentional change. The evaluator's primary function in the team is to elucidate team discussions with evaluative data and logic, and to facilitate data-based decision-making in the developmental process."

Patton continued to evolve his developmental evaluation framework in a subsequent article (2006) and recent book (2011), where he argues that developmental evaluation is particularly well suited for five purposes:

1. Ongoing Development: adapting an existing program to changing conditions.
2. Adaptation: adapting a program based on general principles for a particular context.
3. Rapid Response: adapting a program to respond quickly in a crisis.
4. Pre-formative Development: readying a potentially promising program for the traditional formative and summative evaluation cycle.
5. Systems Change: providing feedback on broad systems change.

Patton's five purposes emphasize two foci of developmental evaluation: adaptation and change. Health promotion practitioners understand the need to adapt programs based on changing policies, funder requirements, organizational routines, and a host of other conditions. These changing conditions create complex environments in which linear program approaches that proceed logically from inputs through processes to outcomes can be poor fits. Corresponding assessment approaches that move in lockstep through the traditional evaluation cycle and largely treat context as noise to be controlled or ignored can be equally poor fits.

Developmental evaluation's emphasis on the evaluator's role in helping organizations make data-based decisions in the face of changing environments attempts to increase the fit between programs and their evaluations.

As an emergent approach, developmental evaluation is still being defined and elucidated. Table 1 seeks to further clarify developmental evaluation by comparing it to traditional program evaluation approaches. While this comparison is necessarily reductionist and thus oversimplifies both evaluation approaches, it does point to some important differences between the two. One of these differences is the evaluator's role. A traditional evaluation may seek an external evaluator seen as independent from the program, whereas a developmental evaluation typically seeks a "critical friend" who will engage in ongoing evaluation discussions with program staff and organizational leadership. While both approaches require the evaluator to have strong methodological skills, the developmental evaluator uses these skills to drive the program's innovative growth (not its ultimate value judgment), and thus must be flexible in the face of changing organizational and environmental conditions. These and other unique features of developmental evaluation are explored in the following case example.

Insert Table 1 here.

USING DEVELOPMENTAL EVALUATION TO HELP CHANGE SCHOOL BOARD SEXUALITY EDUCATION POLICY

Overview

The Illinois Caucus for Adolescent Health (ICAH) is a thirty-four-year-old nonprofit organization whose mission includes promoting positive approaches to adolescent sexual health. ICAH has used a number of strategies to fulfill this mission, such as teacher training, youth leadership development, and community organizing.

Recently, ICAH identified changing school board sexuality education policy as a promising strategy for adolescent sexual health promotion (Fagen, Stacks, & Fischman, 2007). This strategy was developed and used by ICAH staff in concert with youth activists, who successfully advocated for the adoption of a comprehensive sexuality education policy by the Chicago Public Schools (CPS). CPS' 2006 policy mandates (1) the provision of medically accurate and developmentally appropriate family life education in grades 5-12 and (2) corresponding teacher training (Fagen et al., 2010). From ICAH's perspective, such policy change creates a system of training, implementation, and accountability that will ultimately embed sexuality education into the school district's standard practice. This is an innovative approach to adolescent sexual health promotion, a field that has traditionally focused on developing and testing prevention programs with little or no attention to related policies (Card & Benner, 2008).

Based on the Chicago experience, ICAH sought and received funding from the Ford Foundation to use its school board policy (SBP) change strategy in multiple Illinois school districts. Funded in two phases, the SBP project's primary aim is to develop and refine a model for changing school board sexuality education policy that can be used by other organizations in different settings. Recognizing the key role that evaluation would play in developing this model, ICAH contracted with the lead author and two graduate students as its SBP evaluators. This choice reflects one of the key features of developmental evaluation: the evaluator must be credible to the organization and program staff. As a recent member of ICAH's board of directors who had worked closely with the organization's executive director and staff, the lead author possessed such credibility. From the outset, the evaluators took a developmental approach to SBP's evaluation.

While the Chicago experience was encouraging, the evaluators recognized that other sites would present very different environments for policy change, with unique opportunities and challenges. Thus, the evaluators' mindset was always one of innovation and learning: understand what worked in Chicago, apply it in other sites, and use evaluative data to continuously refine program strategies. Moreover, the evaluators considered the SBP program and its evaluation to be inextricably intertwined. Thus, the evaluation team has always been composed of both the evaluators and SBP staff.

Phase 1 Evaluation

As a first evaluative step, the evaluation team created a logic model reflecting the complex set of processes, strategies, and interactions that spurred the CPS policy change (see Figure 1). This model was not pre-specified; rather, it was based squarely on the staff and youth activists' Chicago experiences. Next, the team designed a Phase 1 evaluation that aimed to learn from and assess the model's application in five additional school districts. Three overarching questions guided the evaluation:

A. Learning: what is being learned from the SBP efforts?
B. Improving: how can the SBP efforts be improved, both during the current phase and beyond?
C. Sharing: what does ICAH want to be capable of telling the sexuality education field about its SBP approaches?

Insert Figure 1 here.

Using these questions as a guiding framework, the evaluation team then posed a more specific set of questions designed to assess progress and change in each of the new SBP sites:

1. Core Group: how is the core group established, maintained, trained, and sustained?
2. Strategy Impact: what impact do various strategies have on policy change?

3. Policy Change: to what extent does policy actually change?

From ICAH's perspective, having a core group of site-based allies advocating for school board policy change was central to the SBP model. Thus, a majority of the evaluation techniques were focused on these core groups, including assessment surveys, training evaluations, quarterly reports, and focus groups. In addition, the evaluators sought to conduct key informant interviews with school board and community-based stakeholders in order to understand the context for core group efforts. Finally, the team held quarterly evaluation review meetings in order to discuss, synthesize, and act on these multiple data sources.

As the Phase 1 evaluation progressed, it became clear that the core groups were functioning very differently across sites. In some sites these groups formed easily and needed little support from SBP staff. In other sites the groups were difficult to launch and were challenged to meet regularly. By the end of Phase 1, only one of five sites had successfully changed its district's sexuality education policy. While ICAH was disappointed by this result, in retrospect it was not surprising given the slow nature of advocacy-based approaches to policy change (Fagen et al., 2009). Moreover, from a developmental evaluation perspective, the team had learned a great deal about which policy change strategies worked better than others. Specific findings aligned with the evaluation questions included:

- Core Group: since several core groups were difficult to launch and maintain, ICAH felt its efforts should be more broadly directed toward sites in all of their stakeholder diversity and complexity (and not narrow groups of pre-organized people within them).
- Strategy Impact: the most promising strategies involved site-based youth and stakeholders who weren't necessarily part of core groups or didn't want to meet in them regularly.
- Policy Change: given the low level of policy change, ICAH recognized the need to (a) assess a site's readiness for undertaking policy change and (b) begin each site's efforts at its own readiness stage.

The team was ready to apply these lessons learned in the Phase 2 evaluation.

Phase 2 Evaluation

Based on the Phase 1 findings, the team started Phase 2 by creating a revised logic model (available upon request). The new model was considerably less focused on core groups, included a site readiness assessment component that served to reduce the number of sites from five to three, and depicted site-based change efforts in three overlapping phases: (1) ready for policy change, (2) ready for health program planning, and (3) ready for implementation monitoring and evaluation. These phases reflected the variation in change strategies being used by different sites: some were still working directly on policy change, while others were planning district-wide health programs (that included sexuality education) or had selected programs to implement (as one way to build support for policy change). In other words, ICAH acknowledged that there were multiple pathways toward policy change, and that the SBP project's ultimate goal was to increase access to high quality sexuality education. No matter what phase a site might be in, ICAH recognized the importance of identifying a local community liaison, training local youth to promote sexuality education, and providing professional development for school- and community-based personnel (all key components in the Phase 2 logic model). In essence, the developmental evaluation guided ICAH to substitute this latter set of strategies for the core group strategies so central to Phase 1.

In addition to revising the SBP model, the team revised its evaluation questions and methods at the start of Phase 2. While it continued to ask the strategy impact and policy change questions, it replaced the core group question with three phase-specific "how" questions that map onto the revised model. For example, one of these questions is "How are the 'ready for policy change' activities achieving this phase's goal of changing policy at the school board level?" Methodologically, the team replaced Phase 1's core group assessment surveys and focus groups with evaluator-conducted (a) site visits (consisting of observations, document reviews, and key informant interviews) and (b) SBP staff debriefings.

These new methods allow the evaluators to paint a data-based picture of what is happening at each site, share these data regularly as an evaluation team (which includes SBP staff), synthesize learning across sites, and make real-time improvements to the SBP project's strategies. This emphasis on continuous learning and evaluation use reflects several core principles of developmental evaluation.

CONCLUSION

Utility of a Developmental Evaluation Approach for the School Board Policy Change Project

While data collection and findings generation continue during the SBP project's Phase 2, the team has already recognized several benefits from its use of a developmental evaluation approach. The ability to adapt strategies and corresponding evaluation methods from Phase 1 to Phase 2 would not have been possible with a traditional evaluation approach. Instead, the team likely would have been bound to a static program and pre-specified data collection methods. In addition, the tight integration between evaluators and SBP staff on a single evaluation team promoted data sharing, discussion, and learning that continue to be used for SBP strategy adjustment.

Potential Uses of Developmental Evaluation in Health Promotion Practice

Developmental evaluation fills a particular niche in health promotion practice. As the case example illustrates, it is not meant to replace more traditional forms of formative or summative evaluation. Rather, developmental evaluation might be an appropriate approach when innovations are in early stages, when environments are changing (or are particularly complex), when organizational learning is emphasized, or when systems (not individuals) are the change target.

Indeed, a developmental evaluation might be a natural precursor to a cycle of formative and summative evaluation ("Pre-formative Development"). The hope is that this introduction to developmental evaluation will catalyze its use when this approach makes sense in a particular health promotion practice setting.

REFERENCES

Card, J. J., & Benner, T. A. (2008). Model programs for adolescent sexual health. New York: Springer.

Fagen, M. C., Stacks, J. S., & Fischman, T. (2007). Practice notes: Strategies in health education. The Illinois Campaign for Responsible Sex Education. Health Education & Behavior, 34(4).

Fagen, M. C., Reed, E., Kaye, J. W., & Jack, L. (2009). Advocacy evaluation: What it is and where to find out more about it. Health Promotion Practice, 10(4).

Fagen, M. C., Stacks, J. S., Hutter, E., & Syster, L. (2010). Promoting implementation of a school district sexual health education policy through an academic-community partnership. Public Health Reports, 125(2).

Patton, M. Q. (1994). Developmental evaluation. American Journal of Evaluation, 15(3).

Patton, M. Q. (2006). Evaluation for the way we work. Nonprofit Quarterly, 13(1).

Patton, M. Q. (2011). Developmental evaluation: Applying complexity concepts to enhance innovation and use. New York: The Guilford Press.

Rossi, P. H., Lipsey, M. W., & Freeman, H. E. (2004). Evaluation: A systematic approach (7th ed.). Thousand Oaks, CA: Sage.

TABLE 1
A Comparison of Traditional and Developmental Evaluation Approaches

Evaluation Component       | Traditional Evaluation                                                          | Developmental Evaluation
Purpose                    | Model validation, accountability                                                | Development, adaptation
Situation                  | Stable, goal-oriented, predictable                                              | Complex, dynamic, changing
Mindset                    | Effectiveness, impact, compliance                                               | Innovation, learning
Target                     | Program participants                                                            | Participants' environment
Measurement                | Based on pre-determined indicators                                              | Based on emergent indicators
Unexpected Consequences    | Paid token attention                                                            | Paid serious attention
Evaluation Design          | By evaluator                                                                    | Collaborative with program staff
Evaluation Methods         | Based on social science criteria                                                | Based on evaluation use criteria
Evaluation Results (Ideal) | Best practices                                                                  | Best principles
Evaluator Role             | Independent from program                                                        | Integrated with program
Evaluator Qualities        | Strong methodological skills; credibility with external authorities and funders | Strong methodological skills; credibility with organizational and program staff

Adapted from Patton, M. Q. (2011). Developmental evaluation: Applying complexity concepts to enhance innovation and use. New York: The Guilford Press.

Figure 1. Phase 1 logic model for school board sexuality education policy change

ICAH Support
- Establish core group and campaign
- $5,000 sub-grant per year
- Select, stipend, and supervise college intern as community organizer
- Networking opportunities with other school board organizing sites
- Skills-based advocacy training monthly
- Document progress of campaign
- Provide technical assistance at every step

Organizing the Core Group
- Build coalition of individuals and/or organizations committed to the long-term campaign and willing to be present throughout the process
- Consider representation by diverse populations and influential voices
- Develop a vision for ultimate policy change
- Develop a strategy map to reach policy change and guide the group
- Identify ways people can join the campaign at varying levels of involvement

Establishing Messages and Recommendations
- Conduct necessary research
- Develop clear, effective recommendations
- Develop consistent messages
- Make adjustments as campaign progresses

Approaching the System
- If the first approach yields agreement and a commitment by the system rep. to move forward, the policy can be shaped
- If the system does not commit to moving forward, find out what would be necessary and go back to messages and community before re-approaching
- If the system is never responsive, take directly to the school board through public action

Organizing the Community
- Get secondary targets on board, possibly before first approach
- Demonstrate community support as necessary for approaching the system, getting on the agenda, and/or passing the policy
- Possible strategies may include petitions, town hall meetings, surveys, media outreach, rallies, canvassing, and more
- Part of this strategy could include organizing the business and funding communities to provide resources for implementation of the new policy

Shaping the Policy
- Writing a policy to be considered by the school board
- May happen in partnership with the system or using public records of school board policy format

Getting on the School Board Agenda
- System rep. may assist in this process
- This may need to be forced through appropriate demonstration of community support

Policy Change Happens
- In preparation, gather all community support, prepare testimony, and secure positive votes by board members
- Media presence will help illuminate the victory