Wednesday 4.26 Session B: Transforming Residential Intervention Through Outcomes Design

Slide 1: Wednesday 4.26 Session B: Transforming Residential Intervention Through Outcomes Design. Measuring a Vision.
Sarah Morrill, MSW, Director of Outcomes and Evaluation, Plummer Home (smorrill@plummerhome.org)
Joshua Metcalfe, Group Home Program Director, Plummer Home (jmetcalfe@plummerhome.org)

Slide 2: Goal for Today: Increase your understanding of the outcomes design process as a means to implement the clinical model.

Slide 3: Agenda
I. Introductions
II. Framing the Conversation
III. Introduction to Outcomes
IV. The Data-Informed Cycle: Design, Implementation and Staff Experience
V. Questions

Slide 4: We Are Plummer Home, est.
Vision: Every young person has a family unconditionally committed to nurture, protect and guide them to successful adulthood.
Who we serve: young men and women, ages 0-22
Why they are with us: child welfare or juvenile justice
Where we partner with them: residential facility, at home, in the community

Slide 6: Quick Activity
Take 2 minutes to sketch a picture of a car.
"Begin with the end in mind." ~ Stephen Covey

Slide 7: Strategic Learning
Strategy + Evaluation = Learning for social impact
"Strategic learning is the use of data and insights from a variety of information-gathering approaches, including evaluation, to inform decision-making about strategy." ~ Coffman and Beer (2011)

Slide 8: OUTCOMES: Why do we do what we do?
- Focusing on results increases effectiveness
- Having a target drives performance, and drives investment of resources
- Planning and doing become inseparable
- Innovation is encouraged
- It's just the right thing to do!

Slide 9: Terminology
- Inputs: resources used to implement our work. What is essential to our work?
- Activities: actions associated with delivering goals. Who does what?
- Outputs: amounts associated with actions. How much was done?
- Outcomes: meaningful change for those served (knowledge, skill, attitude) as a consequence of the work. How well did we do it?
- Indicators: specific, observable, measurable characteristics that show progress is being made. How do we know change is happening?
- Benchmark: best-practice standard against which performance is assessed. What are the things that matter?
- Impact: long-term consequence of the work. How long did it last?
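
These distinctions are easier to keep straight when written down as a structure. Here is a minimal sketch in Python, using hypothetical names and illustrative values rather than Plummer Home's actual data, of one way a program's logic-model vocabulary could be captured:

```python
from dataclasses import dataclass

@dataclass
class LogicModel:
    # Hypothetical structure; fields mirror the slide's terminology.
    inputs: list      # resources used to implement the work
    activities: list  # actions associated with delivering goals
    outputs: dict     # amounts associated with actions (how much was done)
    outcomes: list    # meaningful change for those served
    indicators: list  # observable, measurable signs that change is happening
    benchmark: str    # best-practice standard to assess against
    impact: str       # long-term consequence of the work

# Illustrative values only.
residential = LogicModel(
    inputs=["trained staff", "group home facility"],
    activities=["daily skill-building sessions", "family engagement meetings"],
    outputs={"sessions_delivered": 120, "family_meetings": 24},
    outcomes=["youth gains independent-living skills"],
    indicators=["youth plans and prepares own meals weekly"],
    benchmark="best-practice permanency standard",
    impact="successful transition to adulthood",
)
print(residential.indicators)
```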

Slide 10: Developing an Outcome-Informed Program
- System Pre-Requisites
- Data Governance

Slide 11: Data-Informed Cycle
Vision, Theory of Change, Priorities -> Program Goals & Logic Models -> Key Indicators -> Target Outcomes -> Technical Processes & Tools -> Training & Support -> Data Analysis, Reporting & Use -> (back to the start)

Slide 12: Building a Data Culture: Stages (Focus: Design Stage)
1. DESIGN: Vision, Theory of Change and Priorities; Program Goals & Logic Models
2. OPERATIONALIZE: Indicators; Outcome Targets; Tools
3. IMPLEMENT: Train & Support; Analyze, Report, Use
4. LEARN & ADAPT!

Slide 13: Step 1: Vision, Theory, Priorities (Align Mission & Resources)
Examine your organization: Why do we do what we do? For whom? How do we design programs to meet goals? How do we know if we are doing what we said we would? Are we innovating? Are we maximizing resources?
You know: what you want to have happen, your clients, your service strengths, the barriers.
You have: a strategic plan, engaged stakeholders, staff.

Slide 14: Detail a Theory of Change
- Mission
- Knowledge
- If-then statement (or reverse)
- Staff and stakeholder input
- Rooted in strategic plan
"If we do this, in these ways, then the impact will be..."

Slide 15: SAMPLE: Plummer Home Intervention Model

Slide 16: OTHER TOC SAMPLES I

Slide 17: OTHER TOC SAMPLES II

Slide 18: Step 2: Program Goals & Logic Models
Logic model chain: Need (problem in the community) -> Inputs (resources used) -> Activities* (actions associated with service delivery) -> Outputs (amounts related to actions) -> Outcomes (change in knowledge, belief, attitudes) -> Impact (long-term consequence)
Evaluation types along the chain: needs assessment, developmental evaluation, cost-benefit analysis, process evaluation, formative evaluation, quality improvement, outcomes evaluation, summative evaluation, impact evaluation, research (RCT study)
* Assign by accountability, responsibility & support

Slide 19: Steps 1 & 2: Staff Perspective
Step 1 (Vision, Theory & Priorities) and Step 2 (Apply Model Across Programs)
Strengths: staff are passionate about the kids and want to have a noticeable impact; want to be included in strategic activities; reinforces the intervention model across programs; assists with clinical supervision; internal educational tool; visualizing long-term success for youth in crisis.
Challenges: finding time to meet; recognizing the link between organizational strategy and daily activities; often going from crisis to crisis, so it is hard to focus interventions; requires continual reinforcement; inconsistent across programs due to lack of time and funds.

Slide 20: Building a Data Culture: Stages (Focus: Operationalize Stage)
1. DESIGN: Align Mission & Resources; Program Goals & Logic Models
2. OPERATIONALIZE: Indicators; Outcome Targets; Tools
3. IMPLEMENT: Train & Support; Analyze, Report, Use
4. LEARN & ADAPT!

Slide 21: Step 3: Identify Key Indicators (Behavioral Evidence)
- All-inclusive process (all staff, all levels)
- Resources dedicated to the work (inputs)
- What people do every day (activities)
- What changes in attitude, knowledge and behavior you hope to engender (indicators)
- What you already count (outputs)
- Results of change in knowledge, attitudes, beliefs (outcomes)

Slide 22: Step 4: Establish Outcome Targets
- Funders seek measurable return on investment
- Managers want to know how we spend our resources
- Directors want to know if it has an impact
- Staff want to see results that matter to youth
- Participants want to have a say in their future
Outcomes help us define and verify success so we can have more of it with those we serve.

Slide 23: Step 5: Build the Tools
- Balance organizational goals with compliance and funding requirements
- Research what others are using
- Build evaluative tools (assessments, surveys, pre/post) or select evidence-based measures
- Select and build the technical tools
- Remove as much duplication as possible (workflow)
- Customize to meet your desired impact
- Identify the resources needed and communicate with the executive board
- Embed within the organization: budget impact over time, human resource requirements, training and support

Slide 24: How We Collect Data
- Cloud-based software for case management, client tracking and outcomes management; converted required documentation to Apricot
- Daily Progress Note: tracks interventions & behaviors
- Plummer Home Goal Sheet with Matrix: tracks goal attainment and outcomes (individual and aggregate)
- Plummer Home Permanency Assessment: measures advancement in key domains
- Discharge Form: tracks legal status, location & relationship
- Youth & Parent Adjustment and Satisfaction Survey: post-discharge survey to measure impact

Slide 25: Plummer Measurement Matrix: SAMPLE
Outcome Area: PERMANENCY
Rating scale: 1 = At Risk, Dependent & Unaware; 2 = Stable, Dependent & Contemplative; 3 = Safe, Appropriately Independent & Aware; 4 = Thriving, Interdependent & Aware
Domain 1: Family & Parenting
Domain 2: Safe & Stable Living w/ Family
Plus program-specific indicators.
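
To illustrate how ratings on a matrix like this can roll up, here is a minimal sketch in Python; the scale labels and domain names come from the slide, but the ratings and the simple averaging are hypothetical, not Plummer Home's actual scoring method:

```python
# The 1-4 scale from the slide.
SCALE = {
    1: "At Risk, Dependent & Unaware",
    2: "Stable, Dependent & Contemplative",
    3: "Safe, Appropriately Independent & Aware",
    4: "Thriving, Interdependent & Aware",
}

# One youth's ratings in the Permanency outcome area (illustrative values).
ratings = {
    "Family & Parenting": 2,
    "Safe & Stable Living w/ Family": 3,
}

# A simple average across domains (a hypothetical roll-up, for illustration).
average = sum(ratings.values()) / len(ratings)
print(f"Permanency average: {average:.1f}")
for domain, score in ratings.items():
    print(f"  {domain}: {score} = {SCALE[score]}")
```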

Slide 26: Goal Sheet w/ Matrix
- Reinforces treatment planning
- Uses PH rating scale
- Encourages SMART goals
- Allows for task assignment & updates
- Tracks individual progress and outcomes progress
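
Since the goal sheet rates goals on the same scale at successive reviews, individual progress can be read as movement along that scale. A minimal sketch with hypothetical dates and ratings:

```python
from datetime import date

# One SMART goal's matrix rating at successive reviews (hypothetical data).
goal_history = [
    (date(2017, 1, 15), 1),
    (date(2017, 2, 15), 2),
    (date(2017, 3, 15), 2),
    (date(2017, 4, 15), 3),
]

first_rating = goal_history[0][1]
latest_rating = goal_history[-1][1]
print(f"Progress: {first_rating} -> {latest_rating} "
      f"({latest_rating - first_rating:+d} on the 1-4 scale)")
```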

Slide 27: Daily Progress Note: Residential (screenshot, annotated with Steps 1-3)

Slide 28: Steps 3 (Indicators) & 4 (Targets): Staff Perspective
Step 3: Indicators (Behavioral Evidence)
- Strengths: connects activities to goals; shifts focus to youth behavior vs. staff activities; improves over time
- Challenges: shifting priorities; training, re-training & supervision; adequate technology
Step 4: Outcome Targets
- Strengths: focuses efforts; allows for the natural ups and downs of treatment
- Challenges: takes more time to develop and document; establishes a commitment ("What if we can't meet it?!"); we do not have ultimate authority over the kids

Slide 29: Step 5 (Tools): Staff Perspective
Strengths: improved documentation; quick access to information; visualizes progress; verifies the skill-building model; professionalizes less clinically trained staff.
Challenges: "This is too much!"; don't have resources (tech., staff, time, IT support, etc.); often in crisis mode; TIME.

Slide 30: Building a Data Culture: Stages (Focus: Implement Stage)
1. DESIGN: Align Mission & Resources; Program Goals & Logic Models
2. OPERATIONALIZE: Indicators; Outcome Targets; Tools
3. IMPLEMENT: Train & Support; Analyze, Report, Use
4. LEARN & ADAPT!

Slide 31: Step 6: Train & Support (BUY-IN)
- Relationships are key: be available, flexible, responsive; be humble
- Train staff on the technical and evaluative tools; use internal content experts, find your allies
- Provide ongoing support for the documentation system/database
- Link to HR once resources align: competencies, supervision, accountability, etc.
- Use data to build your case (& that of others)

Slide 32: Step 6 (Train & Support): Staff Perspective
Strengths: all held to the same accountability; able to see results of work with kids; able to see evidence of clinical training.
Challenges: don't have time; don't have resources (tech., staff, time, IT support, etc.); new staff are hired all the time, so constant training is required.

Slide 33: Step 7: Analyze, Report, & Use Data
Data processes: quality assurance; data validation; reporting cycles; systems and structures; ongoing support (technical, training).
Program processes: Plan-Do-Study-Act (PDSA) cycles at the program level (treatment or activities), the organization level (TOC & logic model) and the individual level (supervision & competencies); learn and adapt.
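
The data-validation piece of a quality-assurance cycle often amounts to simple rule checks run over exported records. A minimal sketch with hypothetical field names and rules (the deck does not show the actual Apricot exports or QA rules):

```python
def validate_record(record):
    """Return a list of problems found in one progress-note record."""
    problems = []
    if not record.get("youth_id"):
        problems.append("missing youth_id")
    rating = record.get("rating")
    if rating not in (1, 2, 3, 4):
        problems.append(f"rating {rating!r} is outside the 1-4 scale")
    if not record.get("intervention"):
        problems.append("no intervention documented")
    return problems

# Illustrative records: one clean, one with every problem.
records = [
    {"youth_id": "Y-101", "rating": 3, "intervention": "skill-building session"},
    {"youth_id": "", "rating": 5, "intervention": ""},
]
for i, rec in enumerate(records):
    for problem in validate_record(rec):
        print(f"record {i}: {problem}")
```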

Slide 34: Sample Reports
Audiences include direct care staff, supervisors, program managers, the board of directors, funders and the state; each report type is matched to its audience.
Report types/themes: treatment goals, fidelity to treatment model, quality assurance, advocacy, research, individual outcomes, program outcomes.

Slide 35: Sample Clinical & Quality Assurance Report

Slide 36: Sample Supervisory Report

Slide 37: Treatment Planning (Popular with Clinical Staff)

Slide 38: Goal Attainment (Popular with Supervisors)

Slide 39: Global Outcomes Report (Popular with Program Directors)

Slide 40: Program Specific Newsletters
- How well did we do, both for individuals and as programs? (OUTCOMES)
- How satisfied are youth & families once they have worked with us? (SATISFACTION)
- How much did we do? (OUTPUTS)
- What difference did it make? (IMPACT)
- How true to the model are our practices? (FIDELITY)
- How do we compare? (COMPARISON)

Slide 41: Step 7 (Data Use): Staff Perspective
Strengths: reinforces individual goals; shifts focus to youth behavior vs. staff activities; identifies the link between interventions and behaviors; takes our beliefs, tracks them as actions and provides evidence.
Challenges: takes more time; training, re-training and supervision are required; adequate technology is required.

Slide 42: Model Implementation: Where We Are
Knowledge: staff trained in the intervention model & clinical components; periodic mandatory refresher trainings are offered.
Data Collection: Apricot fully implemented in one program area, testing theory of change and fidelity to the clinical model; forms built in Apricot for all programs; developing program-specific indicators in others (matrices); building survey capacity (pre/post).
Data Governance: building a data-informed program; data expectations, data agreements; Quality Assurance systems.

Slide 43: So where do you go from here? Turn & Talk
- Talk about TWO things you will do immediately to work toward using outcomes to drive a clinical culture of learning.
- Create an action plan for yourself.
- Share out highlights.

Slide 44: Questions?

Slide 45: Thank you for joining us!
Sarah Morrill, MSW, Director of Outcomes and Evaluation, Plummer Home
Joshua Metcalfe, Group Home Program Director, Plummer Home