Data as the Foundation of Wisdom
How Data Supports the Creation of New Ways of Doing Business
10/3/17




Objectives
This session will help participants:
- Build their data vocabulary and background knowledge
- Explore ways to strengthen data analysis skills
- Understand the critical concepts of a strong data management plan

Technological Trends
- Increasing computational power
- Decreasing costs for data storage
- Increasing data collection

Head Start Management Systems

Resources

The Changing Landscape
Head Start programs should:
- Shift from compliance-only thinking to a culture of continuous improvement
- Move from being simply good stewards to demonstrating the impact of the investment in our communities

Organizational Self-Assessment
Where is your program on the following features:
- The language of data (data literacy)
- Data analysis (opportunities, skills, and techniques)
- Data management (four elements)
- Storytelling with data (data visualization, effective tools)

The Language of Data
"Everything should be made as simple as possible, but no simpler." (Albert Einstein)

The Language of Data
- Data quality
- Data-driven decision-making
- Data management
- Data visualization
- Data privacy
- Data analysis
- Storytelling with data

The Culture Shift to Continuous Improvement
- Curiosity
- Reflection
- Tolerance for vulnerability
- Use of feedback
- Systems thinking
Source: Moving Beyond a Culture of Compliance to a Culture of Continuous Improvement, OPRE, January

Core Competencies of Organizations with a Culture of Continuous Improvement
- Our organization measures outcomes (changes in participant condition, behavior, or knowledge), not just efforts (quantifiable activities or services delivered).
- Our organization can identify which indicators are appropriate for measuring how we work.
- Our organization has clarity about what we want to accomplish in the short term (e.g., one to five years) and what success will look like.
- Our organization ensures that staff have the information and skills they need to successfully engage with data for program improvement (e.g., access to resources and training).
- Our organization has staff who are experienced in data collection, data use, and different stakeholders' information needs.
- Our organization has staff who know how to analyze data and interpret what the data mean.
- Our organization values learning. This is demonstrated by staff actively asking questions, gathering information, and thinking critically about how to improve their work.
- Leaders in our organization support data use to identify areas of improvement.
- Our organization is capable of effectively communicating about data and results (both positive and negative) within and outside the organization.
- Our organization promotes and facilitates internal staff members' learning and reflection in meaningful ways regarding data use, planning, implementation, and discussion of findings ("learning by doing").
- Our organization modifies its course of action based on findings from program data.
- Managers look at program data as an important input to help them improve staff performance and manage for results.
- Findings from program data are integrated into decision-making when deciding which policy options and strategies to pursue.
Source: Evaluation Capacity Diagnostic Tool, Informing Change, accessed November 5, 2014.

pmfo@ecetta.info Tel:

Change Leads to Change
- Change in external environment
- Change in management and organizational design
- Change in competitive strategy

Commitment of Resources
- Commit leadership time
- Commit staff time
- Finance and sustain technology

Linking Planning, Ongoing Monitoring (OGM), and Self-Assessment

Program Planning
- Decide on goals: review and analyze the community assessment and other relevant data; review recommendations from the self-assessment report; develop long-term program goals
- Develop objectives: set short-term program and fiscal objectives
- Develop a plan of action (work plan): develop action steps for objectives; identify measures to monitor (prepare for data collection); plan for regular progress reports to staff, the governing body, and the Policy Council; develop service plans, ensuring they reflect new goals and objectives

Ongoing Monitoring
- Collect: collect data (PIR, child outcomes data, results of OGM for all systems, services, goals, and objectives)
- Analyze: review and analyze data with managers
- Act: make course corrections; determine new data measures
- Ensure: evaluate and follow up on course corrections; verify the accuracy of, and summarize, OGM data for review by the self-assessment team; request that the self-assessment team analyze persistent systems issues

Self-Assessment
- Prepare: design the self-assessment process; orient and train self-assessment participants
- Analyze: analyze the information presented (OGM summaries, OHS monitoring results, other information as needed); determine whether further information is needed and request it
- Recommend: identify strengths and make recommendations for improvement and enhancement

Analyzing Data
- What we want to achieve with the data is very important.
- Do we recognize the biases built into our analysis?
- How do we allow the data to inform our decision-making?
- What questions do the data lead us to?

Integrating Data into Planning Systems
- Community Assessment: analyze data; share conclusions; communicate results to internal and external audiences
- Decide on Goals / Communicate to Stakeholders: ensure goals reflect conclusions from key data sources (e.g., community assessment, child records)
- Develop Plan of Action and Budget that Reflect Goals: determine evidence; determine data collection methodologies; adjust recordkeeping and reporting systems
- Implement Plan of Action: use the recordkeeping and reporting system to collect data; check the integrity of data (e.g., supervisors do spot checks of reports); discuss reports at regularly scheduled intervals
- Evaluate Compliance and Progress Through Ongoing Monitoring: aggregate data and review for overall trends in attendance monthly; analyze data monthly (e.g., by center, option, position); draw conclusions; communicate findings
- Continually Respond with Mid-Course Corrections: use the recordkeeping and reporting system to collect data; check the integrity of data; communicate data findings and next steps to internal audiences
- Evaluate Progress Through Self-Assessment: assess annual progress in achieving goals and objectives; assess the effectiveness of systems and services; examine trends and patterns; communicate results to internal and external audiences

The Data Wall
Each small group reviews the data and answers the following questions:
- What stands out for you?
- What questions arise for you?
- What is clear? What is confusing?
- Does the data provide a clear sense of program strengths?
- Are there clear areas that need attention?

Phases of Self-Assessment
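The monthly aggregation step described above (attendance trends by center) can be sketched in a few lines of code. This is an illustrative example only; the record fields (`center`, `month`, `present`, `enrolled`) are hypothetical placeholders, not fields from any actual Head Start data system.

```python
from collections import defaultdict

# Hypothetical attendance records; field names are illustrative only.
records = [
    {"center": "North", "month": "2017-09", "present": 18, "enrolled": 20},
    {"center": "North", "month": "2017-10", "present": 15, "enrolled": 20},
    {"center": "South", "month": "2017-09", "present": 19, "enrolled": 20},
    {"center": "South", "month": "2017-10", "present": 20, "enrolled": 20},
]

def attendance_rates(records):
    """Aggregate attendance by (center, month) and return a rate per group."""
    totals = defaultdict(lambda: [0, 0])  # (center, month) -> [present, enrolled]
    for r in records:
        key = (r["center"], r["month"])
        totals[key][0] += r["present"]
        totals[key][1] += r["enrolled"]
    return {key: present / enrolled for key, (present, enrolled) in totals.items()}

rates = attendance_rates(records)
print(rates[("North", "2017-10")])  # 0.75
```

A monthly review like the one described in the planning cycle would compare these rates across centers and months to spot the trends that warrant mid-course corrections.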

HSPPS Management of Program Data
Program data should be:
- Available
- Usable
- Honest and accurate
- Secure

Management of Program Data
Consider the balance of power.

Guidance for Data Plan Development
The Head Start Program Performance Standards (HSPPS) require programs to implement a coordinated approach to management of program data that effectively supports the availability, usability, integrity, and security of data. A program must establish procedures on data management, and have them approved by the governing body and Policy Council, in areas such as quality of data and effective use and sharing of data, while protecting the privacy of children's records ( (b)(4)). Guiding questions in each of the four areas of data management identify strengths and considerations for improvement, and narrative points suggest content for the written data management plan. The guiding questions and narrative points for all four areas appear in the handout "Guidance for Management of Program Data."

Telling Your Story Well
- Use data as an asset
- Use existing tools for reporting
- Stories bring us together and move us to action
- Know your audience

A Tool Belt of Techniques
- Personal stories
- Data sculptures
- Participatory games
- Creative maps
- Charts and graphs

Effective Reports Follow the Four A's
- Appealing
- Accessible
- Accurate
- Audience-specific

What Are the Next Steps in Your Data Journey?
Knowing that you don't know something: does that mean that you know something?

Contact PMFO
pmfo@ecetta.info
hslc/tta-system/operations
Call us:

CONCEPTUAL ELEMENTS of Continuous Quality Improvement
- Leadership in Data Management: be transformational; adopt and lead a change strategy; communicate clearly; motivate for innovation and creativity; distribute responsibilities; be a role model
- Commitment of Resources: commit, finance, and sustain technology; commit leadership time; commit staff time
- Culture of Collaborative Inquiry: promote systems thinking; share learning; engage partners; create safe space
- Professional Development: understand data systems; develop analytic capacity; integrate knowledge and beliefs
- Analytic Capacity: assess technological capital; assess human capital; assess data capital
- Management of Program Data: availability; usability; integrity; security of data
- Organizational Characteristics: size; structure; program characteristics; history of improvements
- Environment: government mandates; nongovernmental funders; accreditation, licensing, and professional systems; time
- Quality Child & Family Outcomes

Source: Quality Improvement: What the Head Start Field Can Learn From Other Disciplines, A Literature Review and Conceptual Framework. U.S. Department of Health and Human Services.

Organizational Readiness: A Program Assessment

As Head Start and Early Head Start programs explore the shift to becoming more systems-focused and using data for continuous improvement, the chart below provides a tool for reflecting on proficiency in four related areas: A) The Language of Data, or Data Literacy; B) Data Analysis; C) Goals and Objectives; D) Telling Our Story. Proficiency levels are measured along a five-point continuum. The chart also provides explanatory notes for each of the four areas. The second part of the handout allows leaders to summarize the results and consider next steps in enhancing their organization's proficiency levels. Use the chart to identify your organization's level of proficiency in each of the areas.

Proficiency Level Scale
1 = In the early stages of development
2 = Limited capacity and experience
3 = Organization is fairly skilled
4 = Organization is adept in this area and is able to give some good examples
5 = Very effective; high level of capacity; intentional and invests resources, including time

THE LANGUAGE OF DATA (Data Literacy)
- We communicate internally and externally about the data collected and specific findings.
- We provide regular staff training on data systems, including terminology.
- We recognize which data are best for measuring the efficacy of the services delivered.
- Staff are able to think critically about the meaning of data.

DATA ANALYSIS
- We thoughtfully review our data and reflect on how it informs us.
- Staff have the capacity to analyze and understand what the data say.
- Staff understand (1) what appropriate data are, (2) how to analyze and make meaning from the data, and (3) how to use that information to improve the quality of their work.
- Data are used to make decisions on improving the quality of work.
- Staff are able to use data to forecast expected outcomes and expected challenges.

GOALS AND OBJECTIVES
- Goals are BROAD and responsive to community needs.
- Parents and the governing body are involved in the development of the goals.
- Objectives are SMART (specific, measurable, attainable, realistic, and timely).
- Objectives are subparts of the goals and are achievable in the short term.

TELLING OUR STORY
- Data is represented in visually appealing ways to provide information to others.
- We use electronic tools to visually depict our data.
- We consider our data an asset in telling our story (trust in the data).
- We effectively blend qualitative and quantitative data.
- Data is used to demonstrate the impact made in the lives of children and families.

1) In which area is your organization most proficient? Why?
2) Which area is the most challenging for your organization? Why?

Action Plan for the Four Areas
After reflecting on the first two questions, consider the next steps your organization can take to increase its capacity in each area. For each area, note the action, the person responsible, and the timeframe:
- The Language of Data
- Data Analysis
- Goals and Objectives
- Telling Our Story

Guidance for Management of Program Data

The Head Start Program Performance Standards (HSPPS) require programs to implement a coordinated approach to management of program data to effectively support the availability, usability, integrity, and security of data. A program must establish procedures on data management, and have them approved by the governing body and Policy Council, in areas such as quality of data and effective use and sharing of data, while protecting the privacy of children's records ( (b)(4)). This resource provides guidance for development of these procedures. Guiding questions in each area provide a mechanism for identifying strengths and considerations for improvement in each of the four identified areas of data management. The narrative points provide suggestions for the writing of data management procedures.

Availability: data is present and ready for use

Guiding Questions
- Has the program identified the data needed to monitor HSPPS compliance and progress on five-year goals?
- Does management have access to data in a timely manner?
- Is data made available to the governing body and Policy Council for use in decision making?
- Is data shared across program areas and among job classifications to enable staff to understand program operations and client needs?
- Are systems in place to ensure that hardware is up to date and functioning properly?

Narrative Points
- How data is prioritized and tracked
- How data is shared with leadership, across programs, and with staff
- Processes for obtaining and maintaining software, making timely reports, and backing up data

Usability: the extent to which data can be used effectively and efficiently
Usability also refers to the ease with which software and web applications can be used to achieve desired goals.

Guiding Questions
- Are the software and web applications tracking the data programs need?
- Are staff able to use the software and web applications to meet their job responsibilities?
- Are child and family intake forms, surveys, and other recordkeeping documents coordinated with the software and web applications?
- Is data gathered and shared in a timely manner?
- Are reports accurate, appealing, accessible, and audience-specific?

Narrative Points
- Software and web applications used for tracking information
- How staff are trained
- How the program ensures that staff properly use data collection systems
- Timelines and responsibilities for gathering and compiling data

Integrity: the accuracy and consistency of data over its entire life cycle

Guiding Questions
- Does the organization have policies and procedures in place to ensure that data are accurate, complete, timely, and relevant to the program and stakeholders?
- Are there strategies to minimize data entry errors?
- Do data elements share the same meaning at all locations? For example, is the age of children served reported consistently across all centers for the PIR?
- Are systems in place to guard data from accidental or malicious access, including breaches or attacks?

Narrative Points
- Policies and procedures for ensuring that data is accurate and of high quality (data entry, user requirements, and procedures)

Security: protecting data, such as a database, from destructive forces and from unwanted actions

Guiding Questions
- Are there policies and procedures that ensure that personally identifiable information (PII) and other data are only accessible to authorized personnel?
- Do policies and procedures address the secure transfer of data?
- Are there varying levels of access to data based on job responsibilities?
- Are data security technologies in use, such as disk encryption, backup systems, data masking, and data erasure?

Narrative Points
- Policies and procedures that ensure that PII and other data are only accessible to authorized personnel
- Management responsibilities for data security
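Two of the techniques named above, data masking and minimizing data-entry errors, can be made concrete with a short sketch. The code below is purely illustrative and assumes hypothetical record fields; it is not drawn from any actual Head Start data system. It masks a sample PII field and flags an out-of-range age before the record would enter a database.

```python
import re

def mask_ssn(ssn: str) -> str:
    """Mask all but the last four digits of a hypothetical SSN field."""
    digits = re.sub(r"\D", "", ssn)  # strip dashes and other non-digits
    return "***-**-" + digits[-4:]

def validate_child_age(age_years: float) -> bool:
    """Flag likely data-entry errors: Head Start serves children from birth to roughly age five."""
    return 0 <= age_years <= 5

print(mask_ssn("123-45-6789"))  # ***-**-6789
print(validate_child_age(12))   # False (likely a data-entry error)
```

Checks like these support both integrity (catching bad entries before they spread across reports) and security (ensuring PII is masked wherever full values are not required).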

Understanding the Conceptual Elements of Continuous Quality Improvement

The Office of Planning, Research, and Evaluation (OPRE) LEADS study conducted a multidisciplinary review of the literature on the processes, facilitators, and impediments to data use for continuous quality improvement (Derrick-Mills et al. 2014). The key principles that emerged helped to identify the elements that facilitate or impede the process of data use for continuous quality improvement: leadership, commitment of resources, analytic capacity, professional development, a culture of collaborative inquiry, a cycle of continuous quality improvement, organizational characteristics, and the environment. The Head Start National Center on Program Management and Fiscal Operations has created the Continuous Quality Improvement Conceptual Elements graphic to help program leadership identify and work toward the creation of a culture of continuous quality improvement. The Conceptual Elements graphic also incorporates the requirement for Head Start grantees to establish procedures on data management.

IMPLEMENTATION OF THE CONCEPTUAL ELEMENTS

1. Leaders must be strong, committed, inclusive, and participatory.
The most important element identified was strong leadership. Specifically, leaders must be transformational, act as change agents, communicate expectations around data use, motivate innovation and creativity, and be willing and able to distribute responsibilities. Leaders must act as role models and communicate clear expectations around data use. Strong leadership ensures the organization has the required resources, analytic capacity, and professional development to use data. The absence of any of these elements is likely to reduce an organization's ability to successfully use data for continuous quality improvement.

2. Leaders must prioritize and commit time and resources to the data-use effort.
Leaders must not only possess certain characteristics but also demonstrate their commitment to data use for continuous quality improvement by:
- channeling resources to support and sustain technology
- devoting their time to these efforts
- allocating staff time to support data collection, analysis, and use

3. Leaders invest in professional development.
Leaders also must invest in professional development for staff to build their capacity to:
- analyze and understand the meaning of the data
- understand the data systems in which the data are stored
- integrate new knowledge to effectively use the information

4. Analytic capacity is necessary and should not be assumed.
Analytic capacity includes appropriate data, appropriate technology, and staff capacity. Appropriate data include quality observations, information, and numbers that may be sorted and aggregated to provide meaningful insights. Appropriate technology supports efficient data collection, secure data storage, data sorting and aggregating, and appropriate data analyses. Human capacity refers to the extent to which staff understand (1) what appropriate data are, (2) how to analyze and make meaning from the data, and (3) how to use that information to improve the quality of their work.

5. An organizational culture of learning facilitates continuous data use.
Leadership must foster and support a culture conducive to collaborative inquiry, where staff learn together in an environment that fosters joint problem solving, creativity, and innovation. A learning culture is evidenced by a safe space where staff can openly discuss whatever the data might reveal about program operations and outcomes (good or bad) without fear of reprisal. Learning cultures also create opportunities for shared learning, where staff can discuss data together to determine what the data mean and what to do about it. Learning cultures work to involve both staff and stakeholders, typically clients, in making sense of the data and determining where to focus improvement efforts.

6. The environment matters. It, too, is complex and dynamic.
The processes and foundational factors occur within an organization but are influenced by the surrounding context, which includes both organizational characteristics and an organization's environment. An organization's environmental characteristics refer to all of the external factors that influence it, such as governmental mandates and regulations at the federal, state, and local levels; licensing, accreditation, and professional systems; nongovernmental funders (such as foundations); and time (the continual changing of the regulations, expectations, and other elements of the environment to which the organization must respond). It is important for organizations to recognize and plan for the environmental conditions that can help or hinder their ability to use data.

7. Organizational characteristics influence processes and systems.
Organizational characteristics include size; governance structure; the types of programs the organization operates; and history, including the history of related data, planning, and continuous improvement efforts. The size of the organization and the history of quality improvement efforts are two of the main characteristics affecting efforts to implement or sustain data use for continuous improvement. The literature suggests that the history of previous data-use efforts can have either positive or negative effects on current data-use efforts.

8. Implement sound data management procedures.
In accordance with the HSPPS ( (b)(4)), programs must have data management procedures to effectively support the availability, usability, integrity, and security of data. A program must establish procedures on data management, and have them approved by the governing body and Policy Council, in areas such as quality of data and effective use and sharing of data, while protecting the privacy of child records. (See the PMFO resource Guidance for Management of Program Data.)

DATA USE FOR QUALITY IMPROVEMENT IS A CONTINUOUS PROCESS

Effective data use to improve quality involves a continuous, cyclical process of goal-setting, data collection, data examination, and data-informed action. Reflecting on organizational and program goals, data users identify key data elements and the questions they want to address. They collaboratively analyze the data and interpret the findings. Through the expertise and experience of data users, the information becomes knowledge. That knowledge tells users how the program is performing: it identifies areas of strength, enabling the program to replicate those strategies, and it identifies areas of the program that need improvement. Areas needing improvement are prioritized to create concrete actions. During implementation, observations and data are fed back into the continuous improvement loop so progress toward goals and performance objectives can be monitored. Progress and quality are evaluated against internal goals or external benchmarks. The end of every cycle is the beginning of a new cycle.

The Head Start Program Planning Cycle (depicted below) demonstrates an ongoing cycle of planning, implementation, and evaluation in Head Start, driven by data that promotes continuous quality improvement and allows programs to work toward the achievement of positive outcomes for children and families. Understanding the conceptual elements that support continuous quality improvement can help programs work toward quality outcomes for children and families.

Source: Derrick-Mills, Teresa, Heather Sandstrom, Sarah Pettijohn, Saunji Fyffe, and Jeremy Koulish. (2014). Data Use for Continuous Quality Improvement: What the Head Start Field Can Learn from Other Disciplines, A Literature Review and Conceptual Framework. OPRE Report #. Washington, DC: Office of Planning, Research and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services.