
Implementation Drivers: Assessing Best Practices

National Implementation Research Network (NIRN)
Frank Porter Graham Child Development Institute
University of North Carolina at Chapel Hill, 2015

Allison Metz
Dean L. Fixsen and Karen A. Blase, National Implementation Research Network, v. 4/2013

Citation and Copyright

This document is based on the work of the National Implementation Research Network (NIRN).

Suggested citation: Fixsen, D. L., Blase, K., Naoom, S., Metz, A., Louison, L., & Ward, C. (2015). Implementation Drivers: Assessing Best Practices. Chapel Hill, NC: National Implementation Research Network, University of North Carolina at Chapel Hill.

This content is licensed under the Creative Commons license CC BY-NC-ND (Attribution-NonCommercial-NoDerivs). You are free to share, copy, distribute, and transmit the work under the following conditions: Attribution: you must attribute the work in the manner specified by the author or licensor (but not in any way that suggests they endorse you or your use of the work); Noncommercial: you may not use this work for commercial purposes; No Derivative Works: you may not alter, transform, or build upon this work. Any of the above conditions can be waived if you get permission from the copyright holder.

We ask that you let us know how you use these items so we can use your experience and data to improve and expand the survey. Please respond to Allison Metz (contact information below). Thank you.

Allison Metz, Ph.D.
Director, National Implementation Research Network
CB 8040, University of North Carolina at Chapel Hill
Chapel Hill, NC
Email: nirn@unc.edu

About NIRN: The mission of the National Implementation Research Network (NIRN) is to contribute to the best practices and science of implementation, organization change, and system reinvention to improve outcomes across the spectrum of human services.

Implementation Drivers are the key components of capacity and the functional infrastructure supports that enable a program's success. The three categories of Implementation Drivers are Competency, Organization, and Leadership.

Background (Why?)

Synthesis of the implementation evaluation literature identified multiple theoretical frameworks, which then engendered a need for implementation component measures to assess implementation progress and to test hypothesized relationships among the components. Moreover, reliable and valid measures of implementation components are essential for planning effective implementation supports, assessing progress toward implementation capacity, and conducting rigorous research on implementation. Policy, practice, and science related to implementation can be advanced more rapidly with practical ways to assess implementation.

Since the beginnings of implementation science, the difficulties inherent in implementation have "discouraged detailed study of the process of implementation. The problems of implementation are overwhelmingly complex and scholars have frequently been deterred by methodological considerations.... a comprehensive analysis of implementation requires that attention be given to multiple actions over an extended period of time" (Van Meter & Van Horn, 1975; see a similar discussion nearly three decades later by Greenhalgh, Robert, MacFarlane, Bate, & Kyriakidou, 2004). Adding to this complexity is the need to simultaneously and practically measure a variety of variables over time. Reviews of the field (Durlak & DuPre, 2008; Greenhalgh et al., 2004) have concluded that the wide variation in methodology, measures, and use of terminology across studies limits interpretation and prevents meta-analyses of dissemination, diffusion, and implementation studies.

Attempts to analyze components of implementation have used 1) very general measures (e.g., Landenberger & Lipsey, 2005; Mihalic & Irwin, 2003) that do not specifically address core implementation components, 2) measures specific to a given innovation (e.g., Olds, Hill, O'Brien, Racine, & Moritz, 2003; Schoenwald, Sheidow, & Letourneau, 2004) that may lack generality across programs, or 3) measures that only indirectly assess the influences of some of the core implementation components (e.g., Klein, Conn, Smith, Speer, & Sorra, 2001; Aarons, Cafri, Lugo, & Sawitzky, 2012).

In response to the need for a measure that addresses core implementation components and is generalizable across programs, the Implementation Drivers: Assessing Best Practices assessment is specific to best practices extracted from: 1) the literature, 2) interactions with purveyors who are successfully implementing evidence-based programs on a national scale, 3) in-depth interviews with 64 evidence-based program developers, 4) meta-analyses of the literature on leadership, and 5) analyses of leadership in education (Blase, Fixsen, Naoom, & Wallace, 2005; Fixsen, Naoom, Blase, Friedman, & Wallace, 2005; Heifetz & Laurie, 1997; Kaiser, Hogan, & Craig, 2008; Naoom, Blase, Fixsen, Van Dyke, & Bailey, 2010; Rhim, Kowal, Hassel, & Hassel, 2007).

Validation

Ogden et al. (2012) at the Atferdssenteret (Norsk senter for studier av problematferd og innovativ praksis), Universitetet i Oslo (the Norwegian Center for Child Behavioral Development, University of Oslo), validated a previous version of the Drivers Best Practices items. Ogden et al. collected data to establish the reliability and validity of the Implementation Driver items. The researchers interviewed 218 practitioners, supervisors, and managers associated with two well-established evidence-based programs in Norway. The Cronbach alphas obtained in their study were: selection, 0.89; training, 0.91; coaching, 0.79; fidelity, 0.89; decision support data systems, 0.84; facilitative administration, 0.82; systems intervention, 0.82; and leadership. (A computational sketch of how such internal-consistency coefficients are obtained follows the Definitions below.)

Metz et al. (2014) assessed Implementation Drivers in a county social service system before, during, and after implementation capacity was developed. Low scores on the Drivers assessment at baseline were associated with low-fidelity use of the innovation. As implementation capacity was developed, scores on the Drivers assessment increased (nearly doubled), and higher scores on the Drivers assessment were related to much higher-fidelity use of the innovation.

For more information on the Implementation Drivers and other Active Implementation Frameworks developed by the National Implementation Research Network, visit the NIRN website.

Definitions

There are three categories of Implementation Drivers:
1) Competency Drivers are components to develop, improve, and sustain one's ability to use an intervention as intended in order to benefit children, families, and communities.
2) Organization Drivers are components to create and sustain hospitable organizational and system environments for full and effective use of intended services.
3) Leadership Drivers are components to provide the right leadership strategies for the different types of leadership challenges. These leadership challenges often emerge as part of the change management process needed to make decisions, provide guidance, and support organization functioning.
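The subscale reliabilities reported by Ogden et al. (2012) are Cronbach's alphas. As a point of reference only, the following sketch shows how such an internal-consistency coefficient can be computed from item-level responses; the 0-2 item scaling, variable names, and example data are illustrative assumptions, not the published analysis.

```python
# Minimal sketch (not from the NIRN manual): computing Cronbach's alpha for one
# Driver subscale from respondent-by-item scores. Example data are invented.
import numpy as np

def cronbach_alpha(item_scores) -> float:
    """item_scores: respondents x items matrix of numeric item responses."""
    item_scores = np.asarray(item_scores, dtype=float)
    n_items = item_scores.shape[1]
    item_variances = item_scores.var(axis=0, ddof=1)       # variance of each item
    total_variance = item_scores.sum(axis=1).var(ddof=1)   # variance of summed scale
    return (n_items / (n_items - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical responses from 6 respondents to the 5 Selection items (0/1/2 scale).
selection_items = [
    [2, 2, 1, 2, 1],
    [1, 1, 1, 0, 0],
    [2, 2, 2, 2, 2],
    [0, 1, 0, 1, 0],
    [2, 1, 2, 2, 1],
    [1, 1, 1, 1, 1],
]
print(f"Selection subscale alpha: {cronbach_alpha(selection_items):.2f}")
```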

[Figure: Implementation Drivers]

Introduction and Purpose ("What")

The Implementation Drivers are processes that can be leveraged to improve competence and to create a more hospitable organizational and systems environment for an evidence-based program or practice (Fixsen, Naoom, Blase, Friedman, & Wallace, 2005). Since effective implementation supports require changes at the practice, organization, and state and federal levels, implementation supports must be purposeful to create change in the knowledge, behavior, and attitudes of all the human service professionals and partners involved.

The Assessment asks respondents to rate the Implementation Drivers that are currently in place, according to their own experiences. Respondents assess the Drivers through an implementation lens. After all, most organizations would say that they already recruit and select staff, provide orientation and some training, supervise their staff, and so on. But what do these activities look like when they are focused on Effective Implementation Practices designed to create practice, organizational, and systems change at all levels? The items in the Assessment help to operationalize best practices for each Driver.

Instructions

Prerequisite

A prerequisite for effective use of the Implementation Drivers is a well-operationalized intervention (program, practice, or innovation). When the core intervention components are clearly defined and validated through research (e.g., fidelity correlated with outcomes; dosage and outcome data), the Implementation Drivers can be focused on bringing these core intervention components to life and on sustaining and improving them in the context of practices, organizations, and systems.

Facilitator ("Who")

It is recommended that an individual with expertise in Active Implementation Frameworks facilitate the completion of this assessment. Over time, agencies and organizations using this assessment will gain the expertise to facilitate this process internally, but an outside facilitator with the necessary experience and skills is recommended for agencies using the assessment for the first time.

Participants ("Who")

Assessment participants should include Implementation Team members who have a role in developing, monitoring, and improving implementation drivers. It is recommended that practitioners not participate in the completion of the assessment. Only those individuals with responsibility for overseeing aspects of implementation drivers should complete the assessment.

Facilitation and Use of this Measure ("How")

The assessment should be completed through expert facilitation with a group of Implementation Team members; it was not developed as a self-assessment. Facilitators should establish consensus scores for each best practice for each Implementation Driver. Consensus scores lend themselves to the subsequent action planning that is the purpose of this assessment.

Implementation Drivers Assessment: Fidelity Checklist

Protocol Steps. For each step, record whether it was completed: Y = Yes; N = No; N/A = unsure or not applicable.

1. Skilled Facilitator: An individual with expertise in Active Implementation Frameworks and skill in administering the assessment is identified to facilitate.
2. Respondents Invited: Administrator and/or Facilitator invites knowledgeable participants, including Implementation Team members who have a role in developing, monitoring, and improving implementation drivers. Practitioners should not participate in the assessment.
3. Intervention Identified: A well-operationalized intervention (program, practice, or innovation) is identified for the assessment.
4. Materials Prepared in Advance: Administrator and/or Facilitator ensures that copies (paper or electronic) of a blank Drivers Assessment are available for each participant and ensures that a room is set up with a laptop, projector, internet connection, and conference phone (video if possible) for any participants joining remotely.
5. Overview: Administrator provides a review of the Drivers Assessment, its purpose, and the instructions for voting.
6. Consent: Administrator obtains informed consent from participants to collect and use their responses to understand implementation status and inform action planning.
7. Documentation: Facilitator documents the date of the assessment, the names and roles of participants, and the intervention being assessed.
8. Administration: Each section and question is read aloud. The Facilitator reads the description of the Driver and responds to any questions from participants about the driver's definition. The Facilitator reads the question and then says, "Ready, set, vote." All respondents vote simultaneously and publicly to neutralize influence during the voting process (e.g., hold up 2 fingers to vote "fully in place," 1 finger to vote "partially in place," or a closed hand to vote "not in place"; or hold up a card with the number 0, 1, or 2).
9. Administration: Facilitator tallies the votes and notes agreement or discrepancies for each question.
10. Consensus: If complete agreement is reached, move on to the next question. If not, the Facilitator invites an open, brief discussion of the reasons for differences in scoring, and the group is asked to vote again. The vote can occur multiple times at the discretion of the Facilitator. The goal is to reach consensus. Consensus means that the minority voters can live with and support the majority decision on an item. If the minority persists in not being able to live with the majority vote, the Facilitator encourages further discussion at a later time, and the majority vote is recorded so that the results can be scored and graphed.
11. Recording: Facilitator documents each scoring decision in Qualtrics or on the scoring form used to record all votes.
12. Note-taking: For items where further clarity or information is needed, the Facilitator notes the question in the Notes section.
13. Data Summary: After the last question has been asked and answered, the Administrator generates the reports and distributes graphs of total scores.
14. Review: While viewing the graphs, the Administrator prompts the team in a discussion of the results to identify strengths and opportunities. If this is a repeated administration, the Administrator highlights all of the subscales that moved in a positive direction and celebrates progress toward subscale scores of 80% or better.
15. Intervention Status Review: Facilitator initiates a discussion of updates on achievements, progress, and major milestones or barriers that have occurred since the previous administration.
16. Action Planning: Facilitator asks respondents to discuss three Drivers they would like to set as agenda items for their regular meetings. An action plan template is provided to respondents to use. If the assessment is completed electronically, a one-page summary or printed copy needs to be provided to respondents.
17. Planning: If there is not sufficient time for the Intervention Status Review (step 15) and Action Planning (step 16), the Facilitator ensures that a date and time are set for them.
18. Conclusion: Administrator thanks the team for their openness and for sharing in the discussion.
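For teams that want to tabulate the public votes electronically, the sketch below is one way the tallying described in steps 8 through 10 could be represented in code. It is an illustration only; the function, field names, and example votes are assumptions, not part of the NIRN protocol.

```python
# Illustrative sketch: tally one round of simultaneous 0/1/2 votes on a single
# Drivers Assessment item and flag whether a re-vote/discussion is needed.
from collections import Counter
from typing import Dict, List

SCORE_LABELS = {2: "fully in place", 1: "partially in place", 0: "not in place"}

def tally_item_votes(votes: List[int]) -> Dict:
    """Summarize one voting round for a single item."""
    counts = Counter(votes)
    unanimous = len(counts) == 1
    majority_score, _ = counts.most_common(1)[0]
    return {
        "counts": dict(counts),
        "unanimous": unanimous,              # step 10: move on if complete agreement
        "needs_discussion": not unanimous,   # otherwise discuss briefly and vote again
        "majority_score": majority_score,    # recorded if consensus cannot be reached
    }

# Hypothetical votes from five Implementation Team members on one item.
result = tally_item_votes([2, 2, 1, 2, 2])
print(result)
print("Majority vote:", SCORE_LABELS[result["majority_score"]])
```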

Scoring Form

Today's Date:
Individuals Participating in the Assessment:
Program or Practice:
Facilitator(s):

Directions: Individuals complete the Drivers Best Practices Assessment together by using the Scoring Guide to discuss each item and come to consensus on the final score for each item. If the team is unable to arrive at consensus, additional data sources for each item are documented in the Scoring Guide and should be used to help achieve consensus on future administrations. Scores are recorded on the Scoring Form below. Each item receives a consensus score of 2, 1, or 0.

Selection
1. There is someone accountable for the recruitment and selection of staff who will carry out the program or practice.
2. Job descriptions are in place for staff positions that will carry out the program or practice.
3. Interviewers understand the skills and abilities needed for the staff position.
4. Interview protocols are in place to assess candidates' competencies for the staff positions that will carry out the program or practice.
5. Interview processes are regularly reviewed.

Training
6. There is someone accountable for the training of staff who will carry out the program or practice.
7. Agency staff provide or secure skill-based training for staff.
8. Agency staff use training data to target competency development and improve training.

Coaching
9. There is someone accountable for the coaching of staff who will carry out the program or practice.
10. Coaching is provided to improve the competency of staff who carry out the program or practice.
11. Agency staff use a coaching service delivery plan.
12. Agency staff regularly assess coaching effectiveness.

Fidelity
13. There is someone accountable for the fidelity assessments of staff who will carry out the program or practice.
14. The agency supports the use of a consistent fidelity measure for the program or practice.
15. Agency staff follow a protocol for fidelity assessments.
16. Agency staff use fidelity assessment data to improve program and practice outcomes and implementation supports.

Decision-Support Data System
17. There is someone accountable for the decision-support data system.
18. Data are useful.
19. Agency staff have access to relevant data for making decisions for program improvement.
20. Agency staff have a process for using data for decision making.

Facilitative Administration
21. Leaders and managers actively facilitate the use of implementation supports for programs and practices.
22. Leaders and managers use an effective meeting process.
23. Leaders and managers actively seek feedback from staff and recipients.
24. Leaders and managers regularly use feedback from staff, stakeholders, and beneficiaries.

Systems Intervention
25. Leaders and managers engage with the larger service delivery and funding systems to create improved regulatory and funding environments.
26. Leaders and managers engage key stakeholders and partners in supporting the program or practice.

Leadership
27. Agency leaders assess contextual and big-picture issues related to implementation of the program or practice.
28. Agency leaders identify adaptive challenges related to implementation (i.e., challenges that do not have a clear or agreed-upon definition or a readily identifiable solution).
29. Agency leaders focus attention on implementation challenges.
30. Agency leaders involve other agency staff and/or stakeholders in solving challenges.
31. Agency leaders ensure that difficult issues and challenges are raised and considered by staff and stakeholders.
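The Scoring Form yields one consensus score (0, 1, or 2) per item. Below is a minimal sketch of rolling those item scores up into Driver subscale percentages, assuming the 80%-or-better benchmark from the fidelity checklist is applied to the share of possible points; the item-to-driver grouping follows the form above, and the example scores are invented.

```python
# Illustrative sketch, not part of the instrument: Driver subscale percentages
# from consensus item scores (0/1/2), flagged against an assumed 80% benchmark.
from typing import Dict

DRIVER_ITEMS = {
    "Selection": range(1, 6),
    "Training": range(6, 9),
    "Coaching": range(9, 13),
    "Fidelity": range(13, 17),
    "Decision-Support Data System": range(17, 21),
    "Facilitative Administration": range(21, 25),
    "Systems Intervention": range(25, 27),
    "Leadership": range(27, 32),
}

def subscale_percentages(item_scores: Dict[int, int]) -> Dict[str, float]:
    """item_scores maps item number (1-31) to its consensus score (0, 1, or 2)."""
    results = {}
    for driver, items in DRIVER_ITEMS.items():
        earned = sum(item_scores[i] for i in items)
        possible = 2 * len(items)            # every item is worth up to 2 points
        results[driver] = 100.0 * earned / possible
    return results

# Hypothetical consensus scores for all 31 items.
scores = {i: 2 if i % 3 else 1 for i in range(1, 32)}
for driver, pct in subscale_percentages(scores).items():
    flag = "meets 80% benchmark" if pct >= 80 else "below 80% benchmark"
    print(f"{driver}: {pct:.0f}% ({flag})")
```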

Scoring Rubric

Each item is scored 2, 1, or 0 points; suggested data sources are listed for each item.

Selection. Who is responsible for this driver (e.g., implementing agency, program developer, state agency)?

1. There is someone accountable for the recruitment and selection of staff who will carry out the program or practice.
2 points: A specific person is responsible for coordinating the quality and timeliness of recruitment and selection processes for staff supporting the program or practice, AND this person is able to execute the responsibilities related to his/her role in the selection process.
1 point: A specific person is responsible for coordinating the quality and timeliness of recruitment and selection processes for staff supporting the program or practice.
0 points: There is not a specific person responsible for coordinating the quality and timeliness of recruitment and selection processes for staff supporting the program or practice.
Data source: Job description of the person accountable for recruitment and selection.

2. Job descriptions are in place for staff positions that will carry out the program or practice.
2 points: Job descriptions are clear about expectations for each position, AND job descriptions explicitly align with the practices and competencies required for the program to be used competently.
1 point: Job descriptions are clear about expectations for each position.
0 points: Job descriptions are not clear about expectations for each position.
Data source: Job descriptions.

3. Interviewers understand the skills and abilities needed for the staff position.
2 points: Interviewers have knowledge, skills, and abilities related to the staff position, AND interviewers accurately assess applicant knowledge, skills, and abilities.
1 point: Interviewers have knowledge, skills, and abilities related to the staff position.
0 points: Interviewers have little or no knowledge, skills, and abilities related to the staff position.

4. Interview protocols are in place to assess candidates' competencies for the staff positions that will carry out the program or practice.
2 points: The job interview protocol includes all of the following: an assessment of core skills needed for the position; specific procedures (e.g., scenario, role play) for assessing candidate capacity to perform key skills; assessment of the applicant's ability to receive feedback professionally; assessment of the applicant's ability to use feedback provided during the interview to improve. Review of adherence to the interview protocol is documented, and ratings of applicant responses are recorded.
1 point: The job interview protocol includes all of the following: an assessment of core skills needed for the position; documented review of adherence to the interview protocol; recorded ratings of applicant responses.
0 points: A generic job interview protocol (e.g., a similar protocol for any position) exists.
Data sources: Interview protocol (including procedures used during the selection process); data showing the results of core skills assessments.

5. Interview processes are regularly reviewed.
2 points: Interview processes are annually reviewed and revised as needed to improve the selection process, AND the annual review examines at least three of the following: interview results (e.g., protocol adherence; applicant responses); pre-post training data for successful applicants; turnover data; fidelity data; exit interview results.
1 point: Interview processes are annually reviewed and revised as needed to improve the selection process, AND the annual review examines at least one of the following: interview results (e.g., protocol adherence; applicant responses); pre-post training data for successful applicants; turnover data; fidelity data; exit interview results.
0 points: Interview processes are not annually reviewed and revised as needed to improve the selection process.
Data sources: Selection and interview process documentation; data on interview outcomes.

Training. Who is responsible for this driver (e.g., implementing agency, program developer, state agency)?

6. There is someone accountable for the training of staff who will carry out the program or practice.
2 points: A specific person is responsible for coordinating the quality and timeliness of training processes for staff supporting the program or practice, AND this person is able to execute the responsibilities related to his/her role in the training process.
1 point: A specific person is responsible for coordinating the quality and timeliness of training processes for staff supporting the program or practice.
0 points: There is not a specific person responsible for coordinating the quality and timeliness of training processes for staff supporting the program or practice.
Data source: Job description of the person accountable for training.

7. Agency staff provide or secure skill-based training for staff.
2 points: Training is required and provided before staff begin to use the program or practice, AND highly competent individuals provide training (e.g., trainers who have deep content knowledge and effective presentation and delivery skills), AND training is skill based and includes opportunities for practice/behavioral rehearsal of essential skills with both positive and constructive feedback to participants.
1 point: Training is required and provided before staff begin to use the program or practice, AND highly competent individuals provide training (e.g., trainers who have deep content knowledge and effective presentation and delivery skills).
0 points: Training is not required and/or is not provided before staff begin to use the new program or practice, OR highly competent individuals do not provide training (e.g., trainers who have deep content knowledge and effective presentation and delivery skills).
Data sources: Professional learning schedule; training outlines or agendas; training evaluations; presenter qualifications; agendas for training presenters.

8. Agency staff use training data to target competency development and improve training.
2 points: Training assessment data (e.g., pre-post assessments of individual trainee knowledge and skill) are collected and provided to supervisors and coaches in a timely manner to target trainee competency development, AND training assessment data are used by individuals accountable for recruitment and selection to improve recruitment and selection activities, AND training assessment data are reviewed and used by training staff to improve future training events, materials, and processes.
1 point: Training assessment data (e.g., pre-post assessments of individual trainee knowledge and skill) are collected but are not provided to supervisors and coaches in a timely manner to target trainee competency development, OR training assessment data are collected but not used by individuals accountable for recruitment and selection to improve recruitment and selection activities, OR training assessment data are not reviewed and used by training staff to improve future training events, materials, and processes.
0 points: Training assessment data are not collected or used.
Data sources: Training outcome data; evidence that data are used for improvements.

Coaching. Who is responsible for this driver (e.g., implementing agency, program developer, state agency)?

9. There is someone accountable for the coaching of staff who will carry out the program or practice.
2 points: A specific person is responsible for coordinating the quality and timeliness of coaching processes for staff supporting the program or practice, AND this person is able to execute the responsibilities related to his/her role in the coaching process.
1 point: A specific person is responsible for coordinating the quality and timeliness of coaching processes for staff supporting the program or practice.
0 points: There is not a specific person responsible for coordinating the quality and timeliness of coaching processes for staff supporting the program or practice.
Data source: Job description of the person accountable for coaching.

10. Coaching is provided to improve the competency of staff who carry out the program or practice.
2 points: The staff who carry out the program or practice receive coaching at least monthly, AND coaches' feedback to staff is based on direct observation (e.g., face-to-face, audio or video recording) plus at least one other data source such as: group or individual consultation; product or document review; interviews with key stakeholders.
1 point: The staff who carry out the program or practice receive coaching at least monthly, AND coaches' feedback to staff is based on one of the following: group or individual consultation; product or document review; interviews with key stakeholders.
0 points: The staff who carry out the program or practice do not receive coaching at least monthly.
Data sources: Coaching schedules; samples of coaching feedback data.

11. Agency staff use a coaching service delivery plan.
2 points: A written plan outlines the coaching supports provided to staff who carry out the program or practice, including requirements for coaches to be experts in delivering the program or practice, frequency of coaching, and coaching methods, AND adherence to the plan is reviewed at least three times a year.
1 point: A written plan outlines the coaching supports provided to staff who carry out the program or practice, including requirements for coaches to be experts in delivering the program or practice, frequency of coaching, and coaching methods.
0 points: A written coaching service delivery plan does not exist.
Data sources: Sample of coaching service delivery plans; content and concept lists used by coaches.

12. Agency staff regularly assess coaching effectiveness.
2 points: Agency staff assess the effectiveness of coaching quarterly through the use of two or more of the following data sources: practitioner fidelity; coach/supervisor fidelity; satisfaction surveys from those being coached; observations of coaches conducting coaching activities. AND coaching effectiveness data are used to inform improvements in recruitment and selection, training, and other implementation supports.
1 point: The effectiveness of coaching to improve the competency of staff who carry out the program or practice is assessed at least annually through the use of at least one of the following data sources: practitioner fidelity; coach/supervisor fidelity; satisfaction surveys from those being coached; observations of coaches conducting coaching activities.
0 points: Coaching effectiveness is not assessed.
Data sources: Coaching effectiveness data such as staff satisfaction surveys; evidence that the data are used to inform improvements in coaching methods.

Fidelity. Who is responsible for this driver (e.g., implementing agency, program developer, state agency)?

13. There is someone accountable for the fidelity assessments of staff who will carry out the program or practice.
2 points: A specific person is responsible for coordinating the quality and timeliness of fidelity assessment processes for staff supporting the program or practice, AND this person is able to execute the responsibilities related to his/her role in the performance assessment process.
1 point: A specific person is responsible for coordinating the quality and timeliness of fidelity assessment processes for staff supporting the program or practice.
0 points: There is not a specific person responsible for coordinating the quality and timeliness of fidelity assessment processes for staff supporting the program or practice.
Data source: Job description of the person accountable for fidelity assessments.

14. The agency supports the use of a consistent fidelity measure for the program or practice.
2 points: The fidelity measure: measures content (whether the practitioner is following the guidelines of the program, e.g., compliance with model standards such as number of home visits, caseload, curriculum); measures competence (the extent to which the practitioner demonstrates skill in the delivery of services, e.g., practice skills, interactions with families); measures context (the extent to which the prerequisites and conditions for the program to operate are met); and is demonstrated to be correlated with outcomes.
1 point: The fidelity measure: measures content (whether the practitioner is following the guidelines of the program, e.g., compliance with model standards such as number of home visits, caseload, curriculum); and measures context (the extent to which the prerequisites and conditions for the program to operate are met).
0 points: The agency does not support the use of a consistent fidelity measure.

15. Agency staff follow a protocol for fidelity assessments.
2 points: Agency staff follow a written protocol that includes all of the following: staff are oriented to how fidelity is assessed; fidelity assessments use multiple sources of information (e.g., practitioners, supervisors, consumers); fidelity assessment data are used to improve supports for practitioners; fidelity assessment data are not used for annual staff evaluations or salary recommendations.
1 point: Agency staff follow a written protocol that includes the following: fidelity assessment data are used to improve supports for practitioners; fidelity assessment data are not used for annual staff evaluations or salary recommendations.
0 points: Agency staff do not follow a written protocol for fidelity assessments.
Data sources: Performance assessment (fidelity) protocol; documentation of staff performance (fidelity) assessments; policy and procedures related to annual reviews and/or salary recommendations.

16. Agency staff use fidelity assessment data to improve program and practice outcomes and implementation supports.
2 points: Agency staff review fidelity assessment data at least four times per year to create action plans to assess and improve the effectiveness of selection, training, and coaching processes for practitioners.
1 point: Agency staff review fidelity assessment data at least annually.
0 points: Agency staff do not review fidelity assessment data.
Data sources: Documentation of action plans for improvement of selection, training, or coaching processes; documentation of feedback to coaches and/or trainers; documentation of feedback provided to practitioners.

Decision-Support Data System. Who is responsible for this driver (e.g., implementing agency, program developer, state agency)?

17. There is someone accountable for the decision-support data system.
2 points: A specific person is responsible for coordinating the content, quality, and timeliness of the data system to support decisions regarding the use of a program or practice and the implementation supports available in the organization, AND this person is able to execute the responsibilities related to his/her role in overseeing the decision-support data system.
1 point: A specific person is responsible for coordinating the content, quality, and timeliness of a data system to support decisions regarding the use of a program or practice and the implementation supports available in the organization.
0 points: There is no person responsible for coordinating the content, quality, and timeliness of a data system to support decisions regarding the use of a program or practice and the implementation supports available in the organization.
Data source: Job description of the person accountable for the decision-support data system.

18. Data are useful.
2 points: Data and information are collected systematically and prepared for use so they are: reliable (standardized protocols, trained data collectors); valid (useful indicators of the concepts or practices being assessed); reported in a timely manner (when and to whom the data are most useful); and built into regular routines.
1 point: Data and information are collected systematically and prepared for use so they are: reliable (standardized protocols, trained data collectors); and valid (useful indicators of the concepts or practices being assessed).
0 points: Data and information are not collected systematically and prepared for use.

19. Agency staff have access to relevant data for making decisions for program improvement.
2 points: Agency staff have access to all of the following relevant data to analyze for program improvement: fidelity data; outcome data; programmatic/financial data.
1 point: Agency staff have access to the following relevant data to analyze for program improvement: programmatic/financial data.
0 points: Agency staff do not have access to relevant data.

20. Agency staff have a process for using data for decision making.
2 points: Agency staff have a process for using data for decision making that includes all of the following: the data are analyzed and summarized at least quarterly; data summaries are communicated clearly in written reports to agency staff; action plans are developed to improve implementation supports and outcomes; data summaries and action plans are shared with key stakeholders (e.g., community, family members).
1 point: Agency staff have a process for using data for decision making that includes two of the following: the data are analyzed and summarized at least quarterly; data summaries are communicated clearly in written reports to agency staff; action plans are developed to improve implementation supports and outcomes; data summaries and action plans are shared with key stakeholders (e.g., community, family members).
0 points: Agency staff do not have a process for using data for decision making.
Data sources: Documentation of processes used by the agency to review data and make decisions; sample data reports; sample action plans.

Facilitative Administration. Who is responsible for this driver (e.g., implementing agency, program developer, state agency)?

21. Leaders and managers actively facilitate the use of implementation supports for programs and practices.
2 points: Leaders and managers accommodate and support the use of implementation best practices by: making changes in organization roles, functions, and structures; making changes in organization policies and procedures; and making use of data to inform decisions and action planning.
1 point: Leaders and managers accommodate and support the use of implementation best practices by doing at least one but not all of the following: making changes in organization roles, functions, and structures; making changes in organization policies and procedures; making use of data to inform decisions and action planning.
0 points: Leaders and managers do not accommodate and support the use of implementation best practices.
Data sources: Management team meeting minutes; action plans; reports from staff who carry out programs or practices; reports from staff who carry out improvement initiatives focused on implementation best practices.

22. Leaders and managers use an effective meeting process.
2 points: Leaders and managers use all of the following effective meeting processes: the team meets in person at least monthly or more frequently depending on the amount of work; meeting roles and responsibilities are consistently assigned and used (e.g., facilitator, recorder, time keeper, norms monitor); a process is in place for absent staff to receive updates shortly following the meeting; the team completes assignments and documents progress outlined on an action plan within designated timelines.
1 point: Leaders and managers use at least two of the following effective meeting processes: the team meets in person at least monthly or more frequently depending on the amount of work; meeting roles and responsibilities are consistently assigned and used (e.g., facilitator, recorder, time keeper, norms monitor); a process is in place for absent staff to receive updates shortly following the meeting; the team completes assignments and documents progress outlined on an action plan within designated timelines.
0 points: Leaders and managers do not use effective meeting processes.
Data sources: Meeting schedule; meeting agendas, minutes, and attendance; action plan.

23. Leaders and managers actively seek feedback from staff and recipients.
2 points: Leaders and managers actively seek feedback from all of the following groups: staff who are using the program or practices; staff who are providing implementation support; stakeholders (e.g., parents, teachers, caseworkers); intended beneficiaries (e.g., children, families, students, community members).
1 point: Leaders and managers actively seek feedback from at least one of the following groups: staff who are using the program or practices; staff who are providing implementation support; stakeholders (e.g., parents, teachers, caseworkers); intended beneficiaries (e.g., children, families, students, community members).
0 points: Leaders and managers do not actively seek feedback from staff, stakeholders, and beneficiaries.
Data sources: Written plan for feedback loops to reduce administrative barriers; data reports; action plans.

24. Leaders and managers regularly use feedback from staff, stakeholders, and beneficiaries.
2 points: Leaders and managers use the data collected from staff and stakeholders to reduce internal administrative barriers in the agency to using the program or practice fully and effectively, AND leaders persist in using the data collected from staff and stakeholders until each barrier is reduced or eliminated.
1 point: Leaders and managers use the data collected from staff and stakeholders to reduce internal administrative barriers in the agency to using the program or practice fully and effectively.
0 points: Leaders and managers do not have or use data collected from staff and stakeholders to reduce internal administrative barriers in the agency to using the program or practice fully and effectively.

Systems Intervention. Who is responsible for this driver (e.g., implementing agency, program developer, state agency)?

25. Leaders and managers engage with the larger service delivery and funding systems to create improved regulatory and funding environments.
2 points: Leaders and managers in the organization attend regular meetings with funders, system managers and leaders, and other provider organizations, AND information is shared regarding systemic facilitators and barriers to the quality of programs or practices and implementation supports, AND systemic changes are proposed to the larger system to create a more supportive environment for programs and practices.
1 point: Leaders and managers in the organization attend regular meetings with funders, system managers and leaders, and other provider organizations, AND information is shared regarding systemic facilitators and barriers to the quality of programs or practices and implementation supports.
0 points: Leaders and managers in the organization do not attend regular meetings with funders, system managers and leaders, and other provider organizations to discuss and resolve systemic issues.
Data sources: Meeting agendas; membership lists; data reports; action plans; guidance document outlining policy communication.

26. Leaders and managers engage key stakeholders and partners in supporting the program or practice.
2 points: Leaders and managers have a plan in place to communicate with key stakeholders quarterly.
1 point: Leaders and managers have a plan in place to communicate with key stakeholders at least twice a year.
0 points: Leaders and managers do not have a plan in place to communicate with stakeholders.
Data sources: Communication plan; stakeholder surveys; implementation team membership; team meeting minutes.

Leadership

27. Agency leaders assess contextual and big-picture issues related to implementation of the program or practice.
2 points: Agency leaders regularly assess contextual issues (e.g., political, demographic, funding, values, and philosophical issues), AND agency leadership, at least twice a year, discusses with staff and other key stakeholders how the program or practice aligns with the organization's vision, values, and philosophy.
1 point: Agency leaders regularly assess contextual issues (e.g., political, demographic, funding, values, and philosophical issues).
0 points: Agency leaders do not regularly assess contextual issues (e.g., political, demographic, funding, values, and philosophical issues).
Data sources: Leadership Team meeting notes; inter- and intra-agency communication materials (e.g., memos, papers, briefs); leadership team presentations (e.g., PowerPoints).

28. Agency leaders identify adaptive challenges related to implementation (i.e., challenges that do not have a clear or agreed-upon definition or a readily identifiable solution).
2 points: Agency leaders verbally label and describe conflicting values and different perspectives on problems and solutions, AND agency leaders ask people inside and outside the organization for their concerns and ideas related to challenges (and include these ideas in future meetings) and for their feedback on what the leaders are doing or not doing that contributes to the challenges.
1 point: Agency leaders verbally label and describe conflicting values and different perspectives on problems and solutions.
0 points: Agency leaders do not verbally label and describe conflicting values and different perspectives on problems and solutions.
Data sources: Leadership Team meeting notes; surveys and survey reports from staff and/or stakeholders; written documents (e.g., memos, briefs) from agency leaders that describe adaptive challenges.

29. Agency leaders focus attention on implementation challenges.
2 points: Agency leaders ask team members to refocus their efforts on resolving implementation challenges, AND agency leaders reinforce or support staff in maintaining focus on problem-solving issues.
1 point: Agency leaders ask team members to refocus their efforts on resolving implementation challenges.
0 points: Agency leaders do not ask team members to refocus their efforts on resolving implementation challenges.
Data sources: Meeting notes or minutes.

30. Agency leaders involve other agency staff and/or stakeholders in solving challenges.
2 points: Agency leaders support staff in their implementation work through all of the following strategies: specifying roles and responsibilities in writing; ensuring staff have the time and resources for problem solving; stating clearly the level of authority for decision making; recruiting system stakeholders for meaningful input and participation.
1 point: Agency leaders support staff in their implementation work through at least two of the following strategies: specifying roles and responsibilities in writing; ensuring staff have the time and resources for problem solving; stating clearly the level of authority for decision making; recruiting system stakeholders for meaningful input and participation.
0 points: Agency leaders do not involve staff and stakeholders in solving challenges.
Data sources: Written documents (e.g., memos, briefs, revisions to Terms of Reference); staff and stakeholder surveys rating how "on mission" or "on task" the team has been; Terms of Reference; FTE allocations in position descriptions; survey results from staff and stakeholders; lists of workgroups and committee chairs and participants.

31. Agency leaders ensure that difficult issues and challenges are raised and considered by staff and stakeholders.
2 points: Agency leaders ensure that all of the following strategies are used: ask staff and stakeholders to verbalize both advantages and disadvantages of proposed solutions; provide at least one annual opportunity for staff and stakeholders to raise concerns and propose solutions (e.g., staff surveys, one-to-one meetings); set and document ground rules and expectations related to difficult issues (e.g., respectful language, tone of voice, active listening, asking questions for clarification).
1 point: Agency leaders ensure that at least two of the following strategies are used: ask staff and stakeholders to verbalize both advantages and disadvantages of proposed solutions; provide at least one annual opportunity for staff and stakeholders to raise concerns and propose solutions (e.g., staff surveys, one-to-one meetings); set and document ground rules and expectations related to difficult issues (e.g., respectful language, tone of voice, active listening, asking questions for clarification).
0 points: Agency leaders do not provide opportunities for staff and stakeholders to raise and discuss implementation challenges.


References

Aarons, G. A., Cafri, G., Lugo, L., & Sawitzky, A. (2012). Expanding the domains of attitudes towards evidence-based practice: The Evidence-Based Practice Attitude Scale-50. Administration and Policy in Mental Health and Mental Health Services Research, 39(5).

Blase, K. A., Fixsen, D. L., Naoom, S. F., & Wallace, F. (2005). Operationalizing implementation: Strategies and methods. Tampa, FL: University of South Florida, Louis de la Parte Florida Mental Health Institute.

Durlak, J. A., & DuPre, E. P. (2008). Implementation matters: A review of research on the influence of implementation on program outcomes and the factors affecting implementation. American Journal of Community Psychology, 41.

Fixsen, D. L., Naoom, S. F., Blase, K. A., Friedman, R. M., & Wallace, F. (2005). Implementation research: A synthesis of the literature. Tampa, FL: University of South Florida, Louis de la Parte Florida Mental Health Institute, The National Implementation Research Network (FMHI Publication #231).

Greenhalgh, T., Robert, G., MacFarlane, F., Bate, P., & Kyriakidou, O. (2004). Diffusion of innovations in service organizations: Systematic review and recommendations. The Milbank Quarterly, 82(4).

Heifetz, R. A., & Laurie, D. L. (1997). The work of leadership. Harvard Business Review, 75(1).

Kaiser, R. B., Hogan, R., & Craig, S. B. (2008). Leadership and the fate of organizations. American Psychologist, 63(2).

Klein, K. J., & Sorra, J. S. (1996). The challenge of innovation implementation. Academy of Management Review, 21(4).

Klein, K. J., Conn, B., Smith, A., Speer, D. B., & Sorra, J. (2001). Implementing computerized technology: An organizational analysis. Journal of Applied Psychology, 86(5).

Landenberger, N. A., & Lipsey, M. W. (2005). The positive effects of cognitive-behavioral programs for offenders: A meta-analysis of factors associated with effective treatment. Journal of Experimental Criminology, 1(4).

Metz, A., Bartley, L., Ball, H., Wilson, D., Naoom, S., & Redmond, P. (2014). Active Implementation Frameworks (AIF) for successful service delivery: Catawba County child wellbeing project. Research on Social Work Practice, 1-8.

Mihalic, S., & Irwin, K. (2003). Blueprints for violence prevention: From research to real-world settings: Factors influencing the successful replication of model programs. Youth Violence and Juvenile Justice, 1(4).

Naoom, S. F., Blase, K., Fixsen, D. L., Van Dyke, M., & Bailey, F. W. (2010). Implementing evidence-based programs in the real world: Lessons learned from model program developers and purveyors. Chapel Hill, NC: National Implementation Research Network, FPG Child Development Institute, UNC.

Ogden, T., Bjørnebekk, G., Kjøbli, J., Patras, J., Christiansen, T., Taraldsen, K., & Tollefsen, N. (2012). Measurement of implementation components ten years after a nationwide introduction of empirically supported programs: A pilot study. Implementation Science, 7, 49.

Olds, D. L., Hill, P. L., O'Brien, R., Racine, D., & Moritz, P. (2003). Taking preventive intervention to scale: The Nurse-Family Partnership. Cognitive and Behavioral Practice, 10.

Rhim, L. M., Kowal, J. M., Hassel, B. C., & Hassel, E. A. (2007). School turnarounds: A review of the cross-sector evidence on dramatic organizational improvement. Lincoln, IL: Public Impact, Academic Development Institute.

Schoenwald, S. K., Sheidow, A. J., & Letourneau, E. J. (2004). Toward effective quality assurance in evidence-based practice: Links between expert consultation, therapist performance assessment, and child outcomes. Journal of Clinical Child and Adolescent Psychology, 33(1).

Van Meter, D. S., & Van Horn, C. E. (1975). The policy implementation process: A conceptual framework. Administration & Society, 6.


More information

IMPLEMENTING CONTINUOUS QUALITY IMPROVEMENT PROCESSES FOR SAFETY ORGANIZED PRACTICE (SOP): PRACTICE PROFILES

IMPLEMENTING CONTINUOUS QUALITY IMPROVEMENT PROCESSES FOR SAFETY ORGANIZED PRACTICE (SOP): PRACTICE PROFILES IMPLEMENTING CONTINUOUS QUALITY IMPROVEMENT PROCESSES FOR SAFETY ORGANIZED PRACTICE (SOP): PRACTICE PROFILES National Human Services Training Evaluation Symposium May 22, 2014 Holly Bowers, PhD, Melanie

More information

Worksheet 3g.A1 Competency: Planning and Organizing Implementation Science Applied

Worksheet 3g.A1 Competency: Planning and Organizing Implementation Science Applied Worksheet 3g.A1 Competency: Planning and Organizing Implementation Science Applied Implementation Drivers Shared Vision, Values, and Mission A shared understanding of vision, mission, and values eists

More information

The Basics of Implementation Science

The Basics of Implementation Science The Basics of Implementation Science SSIP Interactive Institutes Albuquerque, NM; April 29-30, 2015 Jacksonville, FL; May 12-13, 2015 Chicago, IL; May 27-28, 2015 Susan Davis, IDC Session Overview Review

More information

Service Provider Readiness in Pay for Success Initiatives

Service Provider Readiness in Pay for Success Initiatives Service Provider Readiness in Pay for Success Initiatives Service Provider Readiness in Pay for Success Initiatives Jason Katz Michael Marks Hanno Petras Jennifer Loeffler-Cobia Barbara Broman With a grant

More information

WRAPAROUND IMPLEMENTATION AND PRACTICE QUALITY STANDARDS

WRAPAROUND IMPLEMENTATION AND PRACTICE QUALITY STANDARDS Fall 2016 WRAPAROUND IMPLEMENTATION AND PRACTICE QUALITY STANDARDS Wraparound Evaluation and Research Team Jennifer Schurer Coldiron Eric Bruns Spencer Hensley Ryan Paragoris This document was prepared

More information

WRAPAROUND IMPLEMENTATION AND PRACTICE QUALITY STANDARDS

WRAPAROUND IMPLEMENTATION AND PRACTICE QUALITY STANDARDS Fall 2016 WRAPAROUND IMPLEMENTATION AND PRACTICE QUALITY STANDARDS Wraparound Evaluation and Research Team Jennifer Schurer Coldiron Eric Bruns Spencer Hensley Ryan Paragoris This document was prepared

More information

The School District of Lee County Division of Operations

The School District of Lee County Division of Operations The School District of Lee County Division of Operations District Based Administrator Evaluation System 2013-2014 Academic Year Table of Contents District Based Administrator Evaluation System Overview...

More information

Wraparound Not the Runaround Part II: Implementation and Workforce Development

Wraparound Not the Runaround Part II: Implementation and Workforce Development Wraparound Not the Runaround Part II: Implementation and Workforce Development Objectives Provide an overview of 4 Key Elements Discuss elements of key drivers in implementation of innovative practices

More information

The Role of Implementation Drivers in Child Welfare Systems Change. David Lambert, PhD Tammy Richards, MEd Trish Knight, MPP

The Role of Implementation Drivers in Child Welfare Systems Change. David Lambert, PhD Tammy Richards, MEd Trish Knight, MPP The Role of Implementation Drivers in Child Welfare Systems Change David Lambert, PhD Tammy Richards, MEd Trish Knight, MPP Children s Mental Health Research & Policy Conference March 3-5, 2014 Tampa,

More information

Three Bold Steps Toolkit: Capacity Framework

Three Bold Steps Toolkit: Capacity Framework Three Bold Steps Toolkit: Capacity Framework INTRODUCTION This framework was originally designed for Safe Schools/Healthy Students (SS/HS) communities to guide them as they took the Three Bold Steps: developing

More information

BELLEVILLE PUBLIC SCHOOLS Director of Operations/Chief Talent Officer

BELLEVILLE PUBLIC SCHOOLS Director of Operations/Chief Talent Officer Department: Central Office Reports to: Superintendent of Schools Number of Days: 12 Months Security Access: District Current Date: 2016 Overtime Status: Exempt Overview: BELLEVILLE PUBLIC SCHOOLS Director

More information

Evaluating Organizational and Systems Change. National Child Welfare Evaluation Summit May th, 2009 Washington, DC

Evaluating Organizational and Systems Change. National Child Welfare Evaluation Summit May th, 2009 Washington, DC Evaluating Organizational and Systems Change National Child Welfare Evaluation Summit May 27-29 th, 2009 Washington, DC Karen A. Blase,, Ph.D. Senior Scientist National Implementation Research Network

More information

Amy Gaumer Erickson, Ph.D., KS & M Julie Morrison, Ph.D., OH Pat Mueller, Ed.D., NH & MS. May 13, 2010

Amy Gaumer Erickson, Ph.D., KS & M Julie Morrison, Ph.D., OH Pat Mueller, Ed.D., NH & MS. May 13, 2010 Amy Gaumer Erickson, Ph.D., KS & M Julie Morrison, Ph.D., OH Pat Mueller, Ed.D., NH & MS May 13, 2010 1 2 Competency Drivers Competency Drivers help to develop, improve, and sustain the intervention. Selection,

More information

Creating Trauma-Informed Care Environments: An Organizational Self-Assessment

Creating Trauma-Informed Care Environments: An Organizational Self-Assessment Creating Trauma-Informed Care Environments: An Organizational Self-Assessment Organization/Program: Date Completed: Job Title: Instructions Making the transition to a Trauma-Informed Care environment requires

More information

joint SyStem innovation brief

joint SyStem innovation brief WIOA Section 188 & AJC Certification: A Window of Opportunity to Impact Equal Opportunity Policy & Practice to Better Serve Individuals with Disabilities joint SyStem innovation brief January 2019 A Joint

More information

Using a Team Approach to Implement Person- Centered Practices

Using a Team Approach to Implement Person- Centered Practices Using a Team Approach to Implement Person- Centered Practices June, 2017 Rachel Freeman Institute on Community Integration Laura Birnbaum Dani Dunphy Saint Louis County Implementing Multi-Tiered Systems

More information

The superintendent routinely considers school or corporation goals when making personnel decisions.

The superintendent routinely considers school or corporation goals when making personnel decisions. 1.0 Human Resource Manager The superintendent uses the role of human resource manager to drive improvements in building leader effectiveness and student achievement. 1.1 The superintendent effectively

More information

Youth Success RFP Evidence Base for How Learning Happens 3. GHR Connects,

Youth Success RFP Evidence Base for How Learning Happens 3. GHR Connects, IMPORTANT REMINDER: Please reference the Community Investments Overview from the Agency Resources webpage, which includes information about United Way s population focus, Community Vision for Change, and

More information

Applying Implementation Science to State System Change

Applying Implementation Science to State System Change The Early Childhood Technical Assistance Center Applying Implementation Science to State System Change An Example of Improving the Governance System Component: Promulgating State Rules in a Hypothetical

More information

BENCHMARK EVALUATION RATING SCALE (BERS)

BENCHMARK EVALUATION RATING SCALE (BERS) Benchmark Evaluation Rating Scale (Version 1.0) 1 BENCHMARK EVALUATION RATING SCALE (BERS) 1. Student s Name: 3. Completed by (Write name below, check box): Practicum Supervisor (Placement: ) Internship

More information

POSITION NARRATIVE Vice President of Integration & Learning First 5 LA

POSITION NARRATIVE Vice President of Integration & Learning First 5 LA BACKGROUND POSITION NARRATIVE Vice President of Integration & Learning First 5 LA First 5 LA is a leading early childhood advocate organization created by California voters to invest Proposition 10 tobacco

More information

Applying Implementation Science to State System Change

Applying Implementation Science to State System Change The Early Childhood Technical Assistance Center Applying Implementation Science to State System Change An Example of Improving the Finance System Component: Implementation of a Family Cost Participation

More information

The Role of Implementation Drivers in Child Welfare Systems Change

The Role of Implementation Drivers in Child Welfare Systems Change The Role of Implementation Drivers in Child Welfare Systems Change David Lambert, PhD Tammy Richards, MEd Trish Knight, MPP Global Implementation Conference August 21, 2013 - Washington, DC Context Child

More information

Community Assessment Training

Community Assessment Training Communities That Care Community Assessment Training Next Steps Trainer s Guide (60 minutes) Slides for Module 6 Module 6... 6-1 We are here.... 6-2 Module 6 goal... 6-3 Objectives... 6-4 Developing a work

More information

Milestones & Benchmarks

Milestones & Benchmarks 2014 Center for Communities That Care, University of Washington Phase 1: Get Started 1.1 Organize the community to begin the Communities That Care Process. Designate a single point of contact to act as

More information

VIRGINIA S CORE SUPERVISOR/MANAGER COMPETENCIES Table of Contents 1

VIRGINIA S CORE SUPERVISOR/MANAGER COMPETENCIES Table of Contents 1 VIRGINIA S CORE SUPERVISOR/MANAGER COMPETENCIES Table of Contents 1 Topic 511: FUNDAMENTALS OF SUPERVISING CASEWORK STAFF 511-01: Ability to create and maintain a supportive working and learning environment

More information

EMT Associates, Inc. Approach to Conducting Evaluation Projects

EMT Associates, Inc. Approach to Conducting Evaluation Projects EMT Associates, Inc. Approach to Conducting Evaluation Projects EMT has been a leading small business in the evaluation field for over 30 years. In that time, we have developed an expertise in conducting

More information

JOB DESCRIPTION. PERSONNEL EVALUATION RESPONSIBILITY: Professional and non professional personnel if delegated by the principal

JOB DESCRIPTION. PERSONNEL EVALUATION RESPONSIBILITY: Professional and non professional personnel if delegated by the principal JACKSON PARISH SCHOOL BOARD JOB DESCRIPTION TITLE: Administrative Assistant QUALIFICATIONS: 1. Shall be at least the minimum requirements as stated in Bulletin 746 revised - Louisiana Standards for State

More information

Community Defined Evidence: L.A. County PEI Plan

Community Defined Evidence: L.A. County PEI Plan COUNTY OF LOS ANGELES DEPARTMENT OF MENTAL HEALTH Mental Health Services Act (MHSA) Prevention And Early Intervention (PEI) 1 Community Defined Evidence: L.A. County PEI Plan Lillian Bando, JD, MSW Lbando@dmh.lacounty.gov

More information

Assistant Manager, Behavioural Therapy (Existing position)

Assistant Manager, Behavioural Therapy (Existing position) Edmonton Catholic Schools is now accepting applications for the position of Assistant Manager, Behavioural Therapy (Existing position) Edmonton Catholic Schools is a large urban school district whose mission

More information

ACA STAFF TRAINING CERTIFICATES COMBINED LEARNER OUTCOME CHECKLIST BY COMPETENCY

ACA STAFF TRAINING CERTIFICATES COMBINED LEARNER OUTCOME CHECKLIST BY COMPETENCY ACA STAFF TRAINING CERTIFICATES COMBINED LEARNER OUTCOME CHECKLIST BY COMPETENCY The following checklist is a comprehensive listing of the specific skills/knowledge sets that a learner who has completed

More information

School Psychology PhD

School Psychology PhD Unit Assessment Plan Guidelines Office of the Provost and Academic Affairs School Psychology PhD Learning Goals and Objectives Develop Competency in: (1) Research (2) Ethical and Legal Standards (3) Individual

More information

Performance and Quality Improvement

Performance and Quality Improvement INTRODUCTION COA's Performance and Quality Improvement (PQI) standards encourage organizations to use data to identify areas of needed improvement and implement improvement plans in support of achieving

More information

Understanding and Building Compatibility between Organizational Climate and Casework Practice

Understanding and Building Compatibility between Organizational Climate and Casework Practice Understanding and Building Compatibility between Organizational Climate and Casework Practice Heidi Young, Training Administrator, Organizational Learning & Quality Improvement, NH DCYF Christine Tappan,

More information

Mindfulness Network. Teacher Training Programme (TTP) Lead Role Profile

Mindfulness Network. Teacher Training Programme (TTP) Lead Role Profile Mindfulness Network Teacher Training Programme (TTP) Lead Role Profile Our vision: Through cultivating mindfulness, the Mindfulness Network (MN) has the intention to reduce human suffering, promote well-being

More information

System Framework for Part C and Section 619:

System Framework for Part C and Section 619: The Early Childhood Technical Assistance Center System Framework for Part C and Section 619: This component has been excerpted from the complete System Framework. Glossary terms appear underlined. The

More information

Assistant Manager, Social Work (New position)

Assistant Manager, Social Work (New position) Edmonton Catholic Schools is now accepting applications for the position of Assistant Manager, Social Work (New position) Edmonton Catholic Schools is a large urban school district whose mission is to

More information

AVAILABILITY, RESPONSIVENESS, CONTINUITY (ARC):

AVAILABILITY, RESPONSIVENESS, CONTINUITY (ARC): AVAILABILITY, RESPONSIVENESS, CONTINUITY (ARC): IMPROVING CLIENTS LIVES: RESEARCH AND PRACTICE INNOVATION VIA THE ARC MODEL OF ORGANIZATIONAL EFFECTIVENESS Anthony L Hemmelgarn, Ph.D. Children s Mental

More information

Implementing the Communities That Care Operating System

Implementing the Communities That Care Operating System Program Information Contact Blair Brooke-Weiss, MSPH Social Development Research Group University of Washington School of Social Work 9725 3rd Ave. Northeast, Suite 401 Seattle, WA 98115-2024 206-543-5709

More information

Presentation Objectives

Presentation Objectives The Impact of Organizational Social Context: Training Impacting Context or Context Impacting Training? 11th Annual National Services Training and Evaluation Symposium (May, 2008) Anthony Hemmelgarn, Ph.D.

More information

Concept Document: Triple P International Implementation Framework

Concept Document: Triple P International Implementation Framework Concept Document: Triple P International Implementation Framework Triple P International Pty Ltd. 2013 Table of Contents INTRODUCTION... 3 OVERVIEW OF TPI IMPLEMENTATION FRAMEWORK... 4 Figure 1. Triple

More information

Prevention Planning Group Restructure: HIV Planning 2.0

Prevention Planning Group Restructure: HIV Planning 2.0 Prevention Planning Group Restructure: HIV Planning 2.0 April S. Hogan, MPH Community Prevention Team Leader Florida Department of Health Bureau of Communicable Disease HIV/AIDS and Hepatitis Section,

More information

Broward County OCP2 CLAS Report REPORT MAY Lauren Acevedo Covian Consulting

Broward County OCP2 CLAS Report REPORT MAY Lauren Acevedo Covian Consulting REPORT Broward County OCP2 CLAS Report MAY 2016 Lauren Acevedo Covian Consulting 2 B roward County OCP2 CLAS Report This report documents the evaluation of 30 cultural and linguistic competence (CLC) plans.

More information

Texas Standards for High Quality Afterschool, Summer and Expanded Learning Programs Assessment Tool

Texas Standards for High Quality Afterschool, Summer and Expanded Learning Programs Assessment Tool Program: Date: Observer: Category 1: Safe Environments, Health and Nutrition: A high quality program offers a safe environment where youth have opportunities to practice healthy behaviors and have access

More information

Name of Student: Supervisor's Address (you will be sent a copy of this evaluation):

Name of Student: Supervisor's  Address (you will be sent a copy of this evaluation): Default Question Block Note that the evaluation will be sent to the student immediately so you may want to wait to submit the survey until after you have met with the student to provide verbal feedback.

More information

Botswana s Integration of Data Quality Assurance Into Standard Operating Procedures

Botswana s Integration of Data Quality Assurance Into Standard Operating Procedures Botswana s Integration of Data Quality Assurance Into Standard Operating Procedures ADAPTATION OF THE ROUTINE DATA QUALITY ASSESSMENT TOOL MEASURE Evaluation SPECIAL REPORT Botswana s Integration of Data

More information

Community Resources Assessment Training

Community Resources Assessment Training Communities That Care Community Resources Assessment Training Next Steps Participant s Guide Module 5 Module 5 Table of Contents Page Module 5... 5 3 We are here.... 5 4 Module 5 goal... 5 5 Objectives...

More information

Standards for Social Work Practice with Groups, Second Edition

Standards for Social Work Practice with Groups, Second Edition Social Work with Groups ISSN: 0160-9513 (Print) 1540-9481 (Online) Journal homepage: http://www.tandfonline.com/loi/wswg20 Standards for Social Work Practice with Groups, Second Edition Association for

More information

Application for Intensive Technical Assistance Statewide Implementation of the Pyramid Model within Preschool Programs

Application for Intensive Technical Assistance Statewide Implementation of the Pyramid Model within Preschool Programs National Center for Pyramid Model Innovations Application for Intensive Technical Assistance Statewide Implementation of the Pyramid Model within Preschool Programs Background Purpose of this RFA States

More information

DEFINING THE INNOVATION WORKSHEET

DEFINING THE INNOVATION WORKSHEET DEFINING THE INNOVATION WORKSHEET PURPOSE The Defining the Innovation Worksheet (See MEASURE Evaluation PRH Guide for Monitoring Scale-up of Health Practices and Interventions) serves to assist planners

More information

SOC Lead Family Contact. LFC Within the SOC Structure: The position of LFC in the

SOC Lead Family Contact. LFC Within the SOC Structure: The position of LFC in the TIP SHEET SOC Lead Family Contact JANUARY 2017 Hiring the Lead Family Contact (LFC) is a requirement of the System of Care (SOC) Expansion and Sustainability Cooperative Agreements with SAMHSA. It is also

More information

COMC Outcomes Management Implementation Fayette County, PA

COMC Outcomes Management Implementation Fayette County, PA COMC Outcomes Management Implementation Fayette County, PA Children s Outcomes Management Center 2005 University of Maryland, Baltimore School of Medicine 737 West Lombard Street, 5 th Floor Baltimore,

More information

IFSTAN Webinar 01/17/ :00 am. Supervision. What do the Iowa Family Support Standards require for supervision and how to document it?

IFSTAN Webinar 01/17/ :00 am. Supervision. What do the Iowa Family Support Standards require for supervision and how to document it? IFSTAN Webinar 01/17/2016 10:00 am Supervision What do the Iowa Family Support Standards require for supervision and how to document it? What is the difference between policy and procedure? A policy is

More information

PERFORMANCE QUALITY IMPROVEMENT

PERFORMANCE QUALITY IMPROVEMENT PERFORMANCE QUALITY IMPROVEMENT Stakeholder Packet 2014 3700 West Kilgore Avenue Muncie, IN 47304 Quality Assurance Department Contact Information: Director of Quality Assurance: Justin Wallen, 765.289.5437

More information

GUIDELINES for COMPLETING the BEHAVIORAL SAFETY PROGRAM DESCRIPTION and ACCREDITATION APPLICATION

GUIDELINES for COMPLETING the BEHAVIORAL SAFETY PROGRAM DESCRIPTION and ACCREDITATION APPLICATION GUIDELINES for COMPLETING the BEHAVIORAL SAFETY PROGRAM DESCRIPTION and ACCREDITATION APPLICATION The Cambridge Center for Behavioral Studies TM Commission for the Accreditation of Behavioral Safety Programs

More information

Job Description JOB PURPOSE KEY JOB FUNCTIONS. Quality Assurance Worker. DATE APPROVED: May 27, 2014

Job Description JOB PURPOSE KEY JOB FUNCTIONS. Quality Assurance Worker. DATE APPROVED: May 27, 2014 Job Description POSITION: Quality Assurance Worker ACCOUNTABILITY: Director of Human Resources CLASSIFICATION: Full-time DATE APPROVED: May 27, 2014 JOB PURPOSE Reporting to the Director of Human Resources,

More information

Washington Standards-Based Superintendent Framework

Washington Standards-Based Superintendent Framework Washington Standards-Based Superintendent Framework Standard 1 Visionary Leadership: The superintendent is an educational leader who improves and achievement for each student by leading the development,

More information

What Works with Youthful Offenders: The Characteristics of Effective Programs and the Barriers to Effective Implementation

What Works with Youthful Offenders: The Characteristics of Effective Programs and the Barriers to Effective Implementation What Works with Youthful Offenders: The Characteristics of Effective Programs and the Barriers to Effective Implementation Presented by: Edward J. Latessa, Ph.D. Professor & Director School of Criminal

More information

JOB PACK. Nepean Community & Neighbourhood Services (NCNS)

JOB PACK. Nepean Community & Neighbourhood Services (NCNS) Nepean Community & Neighbourhood Services (NCNS) 1 Nepean Community & Neighbourhood Services (NCNS) APPLYING FOR A JOB AT NCNS Nepean Community & Neighbourhood Services (NCNS) - is an energetic community-based

More information

Application for Intensive Technical Assistance Implementation of the Pyramid Model within Part C Programs

Application for Intensive Technical Assistance Implementation of the Pyramid Model within Part C Programs National Center for Pyramid Model Innovations Application for Intensive Technical Assistance Implementation of the Pyramid Model within Part C Programs Background Purpose of this RFA States are invited

More information

Botswana Power Corporation (BPC)

Botswana Power Corporation (BPC) Botswana Power Corporation (BPC) Operations Job Specification Position: Template: Technical Services: Senior Engineer (Telecommunications) Business Unit: Print Date: 25 Oct 2017 Page 1 of 8 Position Particulars

More information

Chapter Two: Assessing the Organization

Chapter Two: Assessing the Organization Chapter Two: Assessing the Organization Commonwealth of Virginia: Roadmap for Evidence Based Practices in Community Corrections 17 Commonwealth of Virginia: Roadmap for Evidence Based Practices in Community

More information

NMSU Administration and Finance HR Information Systems/Payroll Services

NMSU Administration and Finance HR Information Systems/Payroll Services REPORT ID: 1514 Introduction & Survey Framework... 1 Organization Profile & Survey Administration... 2 Overall Score & Participation... 3 Construct Analysis... 4 Areas of Strength... 5 Areas of Concern...

More information

Some Key Questions for Jurisdictions Interested in Implementing Safety and Risk Assessment Systems 1

Some Key Questions for Jurisdictions Interested in Implementing Safety and Risk Assessment Systems 1 Some Key Questions for Jurisdictions Interested in Implementing Safety and Risk Assessment Systems 1 Introduction Many states, counties, and tribal nations are redefining how they approach safety and risk

More information

AUGUST 20, 2012 EVALUATION PLAN FOR: RICHARD A. ARKANOFF SUPERINTENDENT OF SCHOOLS CENTER GROVE COMMUNITY SCHOOL CORPORATION

AUGUST 20, 2012 EVALUATION PLAN FOR: RICHARD A. ARKANOFF SUPERINTENDENT OF SCHOOLS CENTER GROVE COMMUNITY SCHOOL CORPORATION EVALUATION PLAN FOR: RICHARD A. ARKANOFF SUPERINTENDENT OF SCHOOLS CENTER GROVE COMMUNITY SCHOOL CORPORATION ADOPTED BY THE CENTER GROVE BOARD OF SCHOOL TRUSTEES ON AUGUST 20, 2012 Richard A. Arkanoff

More information

Continuous Quality Improvement Project

Continuous Quality Improvement Project Continuous Quality Improvement Project Kentucky Interview with Vincent Geremia, State CQI Coordinator, Kentucky Department for Community Based Services Vincent.Geremia@ky.gov 606-920-2007 January 20, 2012

More information

Management Plan Rubric for Priority Improvement and Turnaround Schools & Districts

Management Plan Rubric for Priority Improvement and Turnaround Schools & Districts Management Plan Rubric for Priority Improvement and Turnaround Schools & Districts This rubric is intended to guide planning for Priority Improvement and Turnaround schools and districts pursuing the management

More information

The New Discipline of Implementation

The New Discipline of Implementation Justice System Assessment & Training J-SAT Norval Morris Project Action Plan Promoting a Healing Environment in Corrections Research In Brief National Institute of Corrections The New Discipline of Implementation

More information

The Village for Families & Children Internship Goals, Objectives, and Expected Competencies

The Village for Families & Children Internship Goals, Objectives, and Expected Competencies Goal #1: Competence in working professionally with diverse individuals, groups and communities, and addressing the needs of diverse, vulnerable and underserved populations. Objective(s) for Goal #1: Interns

More information

The Village for Families & Children Internship Goals, Objectives, and Expected Competencies

The Village for Families & Children Internship Goals, Objectives, and Expected Competencies Goal #1: Competence in working professionally with diverse individuals, groups and communities, and addressing the needs of diverse, vulnerable and underserved populations. Objective(s) for Goal #1: Interns

More information

Strengthening Orphan and Vulnerable Children Programs with Data

Strengthening Orphan and Vulnerable Children Programs with Data BEST PRACTICE Creating a Culture of Data Demand and Use Strengthening Orphan and Vulnerable Children Programs with Data Significant human and financial resources have been invested worldwide in the collection

More information