CTSA Program Common Metrics Initiative


1 CTSA Program Common Metrics Initiative. PI Overview: Maximizing CTSA Program Impact. Philip L. Lee, Clear Impact. Based on the principles of Results-Based Accountability from Trying Hard Is Not Good Enough: How to Produce Measurable Improvements for Customers and Communities by Mark Friedman (Trafford, 2005)

2 GoToWebinar Overview: Control panel (opening/closing; muting; joining with audio). Submitting questions/comments via the questions panel. Raising your hand for verbal questions. Session recording.

3 Common Metrics Initiative PI Overview Objectives. PIs will be able to: 1. Explain how the Common Metrics will be used with Results-Based Accountability (RBA) and the Scorecard software in the CTSA Program. 2. Describe the steps of Turn-the-Curve Thinking. 3. Explain the principle of first asking "Are we doing the right things?" and how it will be applied with the Common Metrics and RBA. (c) Results Leadership Group, LLC

4 Common Metrics Initiative PI Overview Agenda. 1. How the Common Metrics will (and will not) be used 2. Results-Based Accountability a. Turn-the-Curve Thinking b. Ask first, Are we doing the right things? c. Performance measures for strategies d. Transparency: the good, the bad, and the ugly 3. Scorecard Software

5 How will the Common Metrics be used (with RBA and the Scorecard software)? 1. To define the desired impact of the CTSA Program: a skilled CT research workforce; improved translation (more and better treatments and cures get to more patients, faster and at less cost). 2. To gauge the extent to which that impact is occurring: % of K and T scholars who become CT researchers; improved translation speed, cost, quantity, and quality; rates of innovation (develop, demonstrate, disseminate, and uptake) in CT research workforce development and translation.

6 3. To focus decision making at all levels on maximizing the impact of the CTSA Program. 4. To increase collaboration in management of the Program: Within each hub Between hub leadership and the hub program officer Across the Network among the hubs and between the hubs and NCATS 5. To increase the rigorous use of data and analysis in the management of the Program. 6. To have a common framework and software platform for collaboration and communication in the management of the Program.

7 The Common Metrics will not be used to: 1. Answer questions of contribution/attribution: Why scholars did (or did not) become successful researchers Why translation did (or did not) improve 2. Define, assess or compare performance: Performance can only be defined, assessed or compared when all of the factors that influence a measure are known and considered. The Common Metrics will be inherently insufficient to define, assess, or compare performance.

8 Common Metrics Initiative PI Overview 1. How the Common Metrics will (and will not) be used 2. Results-Based Accountability a. Turn-the-Curve Thinking b. Ask first, Are we doing the right things? c. Performance measures for strategies d. Transparency: the good, the bad, and the ugly 3. Scorecard Software

9 Apply to the designated Common Metric Performance Measures: 1. Median IRB review duration 2. Pilot funding publications 3. Careers in clinical and translational research (KL2 or TL1; underrepresented populations; women)

10 Turn-the-Curve Thinking: answer these questions. Ends: 1. How are we doing? Where are we headed? (historic baseline and forecast) 2. What is the story behind the curve? 3. Who are partners that might have a role to play in turning the curve? Means: 4. What would work to turn the curve? 5. What are our strategies to turn the curve?

11 Turn-the-Curve Plan 1. How are we doing? 2. What is the story behind the curve? 3. What are our strategies to turn the curve?

12 PM % of studies that achieved accrual goal within time specified in study design

13 PM: % of studies that achieved accrual goal within time specified in study design. Current Value: Q %. How are we doing (historic baseline)?

14 PM: % of studies that achieved accrual goal within time specified in study design. Current Value: Q %. What is the story behind the curve? 1. Non-investigator clinicians not screening patients for potential study participants during clinic consultations. (See: CTSI report) 2. Lack of systems for data-driven cohort discovery in planning clinical studies. 3. Inadequate knowledge/skills in recruitment planning/implementation. 4. Study designs that impede participation. 5. Sites inconveniently located.

15 The Story Behind the Curve. Root causes (ask "Why?" five times): positive and negative; current and anticipated; internal and external. Prioritize: which are the most important to address to turn the curve? Do we need additional data/analysis?

16 PM: % of studies that achieved accrual goal within time specified in study design. Current Value: Q %. Forecast (if we do nothing different): OK?

17 PM: % of studies that achieved accrual goal within time specified in study design. Current Value: Q 50 %. Partners? University/Medical Center leadership; university communications school; IT staff; other hubs; education and training department. What would work? 1. Develop strategies to motivate non-investigator clinicians to screen patients. (Research: German Heart Center Munich) 2. Social media. (See: erecruitmentwhitepaper.pdf) 3. IT solutions: enterprise cohort discovery database; cohort discovery tools. 4. Provide education to inexperienced investigators in how to develop and implement recruitment strategies.

18 PM: % of studies that achieved accrual goal within time specified in study design. Current Value: Q %. Strategies to turn the curve? (What we are going to do to turn the curve.) 1. Research, develop, and institute targeted strategies to motivate non-investigator clinicians to screen patients; engage institutional leadership. 2. Develop and implement social media strategies; partner with the communications school.

19 PM: % of studies that achieved accrual goal within time specified in study design. Current Value: Q %

20 PM: % of studies that achieved accrual goal within time specified in study design. Current Value: Q %. What is the story behind the curve? Targeted clinician motivation program (see EvaluationofTargetedClinicianMotivationProgram.docx). Social media campaigns (see Evaluationof2016RecruitmentSocialMediaCampaigns.docx).

21 PM: % of studies that achieved accrual goal within time specified in study design. Current Value: Q %. Strategies to turn the curve? 1. Expand clinician motivation program to all clinicians. 2. Increase number of trials using social media campaigns.

22 CTSA Network/Hub/Division/Core/Team Agenda 1. New data: reasons to change strategies? 2. New story behind the curve: reasons to change strategies? 3. New partners: reasons to change strategies? 4. New information on what works: reasons to change strategies? 5. Changes to strategies 6. Adjourn. Rigorous and nimble; not rote, not a cycle that must be followed step-by-step.

23 Turn-the-Curve Plan Updated 1. How are we doing? 2. What is the story behind the curve? 3. What are our strategies to turn the curve?

24 Common Metrics Initiative PI Overview 1. How the Common Metrics will (and will not) be used 2. Results-Based Accountability a. Turn-the-Curve Thinking b. Ask first, Are we doing the right things? c. Performance measures for strategies d. Transparency: the good, the bad, and the ugly 3. Scorecard Software

25 Applying Results-Based Accountability. "People and their managers are working so hard to be sure things are done right, that they hardly have time to decide if they are doing the right things." (Stephen R. Covey) First, decide the right things to do. Second, decide how to do each of those things right. And do both with Turn-the-Curve Thinking.

26 I sure am glad we don't have that problem!!

27 Sole Accountability: When performance measures are only used for individual units (e.g., cores, departments), they can lead to a fragmented or stove-piped culture.

28 Joint Accountability: Use the Common Metrics + turn-the-curve thinking to help foster a hub-wide perspective and joint accountability.

29 Joint accountability for the whole... sole accountability for the parts? Use the Common Metrics to first ask "Are we doing the right things?" so we don't end up doing the wrong things right.

30 Hub Accrual Rates. Doing the right things? What is the story behind the curve? 1. Non-investigator clinicians not screening patients for potential study participants during clinic consultations. 2. Lack of systems for data-driven cohort discovery in planning clinical studies. 3. Inadequate knowledge/skills in recruitment planning/implementation. 4. Study designs that impede participation. 5. Sites inconveniently located. What are our strategies to turn the curve? (Priority) 1. Research, develop, and institute targeted strategies to motivate non-investigator clinicians to screen patients. 2. Develop and implement social media strategies.

31 [Diagram] Common Metric Performance Measure: Turn-the-Curve Plan (Turn-the-Curve Thinking: 1. How are we doing? 2. Story behind the curve? 3. Partners? 4. Options? 5. Strategies/Actions?) leads to Strategy 1 / Strategy 2 / Strategy 3 and Action 1 / Action 2, each with its own Performance Measures.

32 [Diagram] Common Metric Performance Measure: Strategy 1 / Strategy 2 / Strategy 3, each with its own TTC Plan; Action 1 / Action 2, each implemented.

33 Common Metrics Initiative PI Overview 1. How the Common Metrics will (and will not) be used 2. Results-Based Accountability a. Turn-the-Curve Thinking b. Ask first, Are we doing the right things? c. Performance measures for strategies d. Transparency: the good, the bad, and the ugly 3. Scorecard Software

34 Definitions. POPULATION: Result, a condition of well-being for children, adults, families, or communities (Babies Born Healthy; Safe Communities; Clean Environment; Free from Death and Suffering Due to [disease]). Indicator, a measure which helps quantify the achievement of a result (rate of low birth weight babies; crime rate; air quality index; mortality and morbidity rates for [disease]). PERFORMANCE: Performance Measure, a measure of how well a program, agency, or service system is working. Three types (language discipline): 1. How much did we do? 2. How well did we do it? 3. Is anyone better off? (= Customer Results)

35 Selecting Headline Performance Measures for Strategies. Effort: Quantity (How much service did we deliver?) and Quality (How well did we deliver it?). Effect: Quantity (How much change/effect did we produce?) and Quality (What quality of change/effect did we produce?).

36 Three Kinds of Performance Measures (to manage strategies). Effort: Quantity (How much did we do? #) and Quality (How well did we do it? %). Effect: Is anyone better off? (# and %)

37 Hub Classroom Programs. Effort, Quantity (how much did we do?): # students; # class hours. Effort, Quality (how well did we do it?): % courses/classes rated high on presentation quality; % high ratings for accessibility of courses in the program. Effect (is anyone better off?): #/% students with demonstrated mastery of topic/skill; #/% students rating class as providing useful knowledge/skills.

38 Consulting Services. Effort, Quantity (how much did we do?): # persons served; # hours consulting provided. Effort, Quality (how well did we do it?): % staff with PhD or equivalent; % researchers rating consultation high on timely/accessible. Effect (is anyone better off?): #/% researchers rating consultation high on added value / met my need; #/% research projects with highly rated consultation that were impactful in their field.

39 Innovation in Workforce Development or Translation: Develop, Demonstrate, Disseminate (D, D & D). Effort, Quantity (how much did we do?): # novel concepts developed; $ budget for innovation; # FTEs involved in innovation. Effort, Quality (how well did we do it?): % concepts demonstrated; % concepts disseminated; % innovation with significant collaboration beyond (1) department and (2) hub. Effect (is anyone better off?): #/% concepts implemented beyond hub (e.g., in other hubs); #/% innovations resulting in significant demonstrated impact on workforce development or research translation.

40 Common Metrics Initiative PI Overview 1. How the Common Metrics will (and will not) be used 2. Results-Based Accountability a. Turn-the-Curve Thinking b. Ask first, Are we doing the right things? c. Performance measures for strategies d. Transparency: the good, the bad, and the ugly 3. Scorecard Software

41 Google: Project Aristotle* New research reveals surprising truths about why some teams thrive and others falter. 1. Equality of conversational turn-taking 2. Psychological safety * New York Times Magazine, February 28, 2016

42 Together, build a culture that encourages, rather than discourages, the use of measures/data and transparency in the management of the CTSA Program.

43 [Chart: Accrual Rates / % of Job Training Trainees Placed in Jobs, 0 to 100%, Yr1 through Yr7] What's the story behind the curve?

44 Getting to Our Best Thinking: Story behind the curve? Partners? What works?

45 Implicit Assumptions: how to (1) identify them and (2) break with them?

46 Discussion vs. Dialogue. Discussion: to tell, sell, persuade, decide; to justify/defend assumptions ("I wonder which of these is the right one?"). Dialogue: to inquire, to learn; to uncover and examine assumptions ("I wonder how these pieces combine to create a whole?"). In dialogue, individuals gain insight that simply could not be achieved individually.

47

48 Use performance measures, turn-the-curve thinking, and transparency to ask and answer the most important questions for maximizing the impact of the CTSA Program.

49 Common Metrics Initiative PI Overview 1. How the Common Metrics will (and will not) be used 2. Results-Based Accountability a. Turn-the-Curve Thinking b. Ask first, Are we doing the right things? c. Performance measures for strategies d. Transparency: the good, the bad, and the ugly 3. Scorecard Software

50 RBA and Scorecard. [Diagram] Hub: Common Metric Performance Measure (CMPM), with a Turn-the-Curve Plan for the CMPM and its Strategies; each Strategy has a Strategy Performance Measure, with a Turn-the-Curve Plan for that performance measure and its own Strategies.

51 1. Common Metrics Scorecard List of all of the Common Metrics and associated Common Metric Performance Measures

52 2. Common Metric Page

53 2. Common Metric Page: Operational Guidelines, Data Collection Notes, Associated measures. 3. Turn-the-Curve Page

54 3. Turn-the-Curve Page

55 3. Turn-the-Curve Page: for a Common Metric Performance Measure, enter your Turn-the-Curve Plan.

56

57 A communication tool: share, engage, learn. Allow access (read-only, comment, edit). Export (reports, PDF, Excel...). Embed (Web pages, e-mails, ...). Before meetings, during meetings, after meetings, between meetings, instead of meetings.

58 Questions?