EMT Associates, Inc. Approach to Conducting Evaluation Projects

EMT has been a leading small business in the evaluation field for over 30 years. In that time, we have developed expertise in conducting scientific studies and policy research that applies to evaluation projects regardless of the subject matter domain. Our evaluation experience includes:

- Ten large, national multi-site evaluation studies (including an award-winning 48-site evaluation of substance abuse interventions for high-risk youth)
- More than 150 policy and program evaluations at the state or local level

For every evaluation engagement, we believe that the policy problem and client needs drive the choice of data collection, measurement, and analysis. We are committed to implementing the most rigorous research possible consistent with resource constraints and client information needs. Interpreting results and communicating lessons with clarity and relevance are priorities in our information products. Accordingly, EMT draws from a full toolbox of research and analysis methods in these projects. Our research skills are both qualitative and quantitative.

In this paper, we present our approach to conducting evaluation and policy analysis projects by addressing:

- Guiding principles for conducting evaluation projects
- Types of evaluation projects: policy, program, and performance
- Evaluation design and methods
- Multi-site research projects
- Interpreting and communicating evaluation results
- Technology resources supporting evaluation projects

Guiding Principles for Conducting Evaluation Projects

Through our three decades of evaluation work, EMT has developed an evaluation philosophy that embraces the following principles.

1. Our evaluations emphasize relevance to real-world problems. Useful evaluation depends upon clear articulation of researchable problems as they emerge from social, cultural, organizational, and policy environments. Our studies begin with collaborative development of a conceptual framework that articulates study issues and objectives, and relates them clearly to the context in which findings will be used. Clear grounding in reality requires collaboration between the decision makers who will use the results, the evaluation team that will produce the results, and diverse stakeholders.

2. Our evaluation approach emphasizes the appropriate application of rigorous research tools. The credibility and value of evaluation products as an input to decision making depend on the quality of the measurement, data collection, analysis, interpretation, and reporting used. Moreover, the utility of information for audiences depends on a) using the specific tool that best fits the empirical context of the problem being addressed, and b) using methods that accurately identify and communicate the most relevant information for the purposes of the study. EMT carefully fits particular evaluation methods and techniques to achieve an optimum balance between analytic complexity and rigor, information needs, and utility for users.

3. Our evaluation products have a clear utility for guiding action. EMT is committed to excellence in conceptual and empirical analysis to achieve important social objectives. We have successfully put this principle into practice through a collaborative evaluation approach and a commitment to translating evaluation results into clear guidelines for practice. Many of our evaluation projects involve recommendations for improved procedures and capacity for the client.

4. We are committed to collaborative support in the utilization of evaluation results. We apply our evaluation results through participation in decision making, development of evidence-based products, and, for many projects, involvement in follow-up training and technical assistance (TA) to ensure utilization of study results.

We believe adherence to these principles provides the appropriate balance of cultural and contextual sensitivity, collaborative project design and management, quality of evidence, and relevance of interpretation and application that is necessary to the success of an evaluation project.

Types of Evaluation Projects: Policy, Program, and Performance

In approaching any evaluation project, it is useful to understand at the outset what type of evaluation work the client is seeking. We can distinguish between three major types of evaluation: policy, program, and performance. To be sure, many evaluation projects incorporate a combination of these evaluation types.

Policy evaluation is characterized by its focus on supporting decision makers who will make active use of evaluation products. Briefly stated, this category places a premium on a) understanding the motivating problem conditions being addressed, b) understanding the decision environment, including stakeholder needs for information and the context of the decision process (e.g., timelines are critical), and c) having the capacity to find and generate data sources and conduct analyses that meet priority decision needs on time.

The distinguishing characteristic of program evaluation is a focus on assessment of specific program activities and the implementation of specific initiatives within the client organization. The challenge in program evaluation is to thoroughly document both the processes and the outcomes associated with the program or initiatives under review, and then construct evaluation products that effectively inform the client organization about program/initiative strengths and weaknesses. In many program evaluation projects, we are asked to provide recommendations as to opportunities for improvement.

Program evaluation designs vary in their degree of focus on process issues or outcomes. Designs may also vary in the degree to which they examine whether objectives have been achieved or whether program procedures and planned activities have been implemented. Greater emphasis on outcomes is, not surprisingly, traditionally termed outcome evaluation. The emphasis in outcome evaluation is often on associating observed outcomes with the specific program activities that appear to be connected to these outcomes. In contrast, process evaluation focuses on measuring implementation strength (e.g., amount of service, quality of service) and fidelity (e.g., delivering services as planned). EMT generally advocates an approach to program evaluation that is balanced by incorporating both process and outcome analysis.

Performance evaluation is yet another type of evaluation project that can be distinguished from both policy and program evaluation. Increasingly, performance evaluation is being accomplished through processes incorporated into management information systems, creating what is typically referred to as a performance monitoring system. Performance monitoring is an important development in evaluation because a) it clearly links program design (e.g., program theory, logic models) to data collection by specifying implementation target indicators (e.g., outputs, service recipient profiles), quality indicators (e.g., fidelity measures, consumer satisfaction, utilization of service), and outcome indicators (e.g., knowledge, attitude, and behavior change; attainment of performance targets such as increases in test scores); b) it provides continuous monitoring of the relation of program effort to outcome change; c) it provides continuous management information concerning failure and success points that may explain outcome deficiencies; and d) it provides a management tool for continuous quality improvement.
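
To illustrate how a performance monitoring system can link logic-model elements to indicators and targets, consider the minimal Python sketch below. It is an illustration only, not a description of any specific EMT system; the component names, indicator names, and target values are hypothetical.

    # Illustrative sketch only: a hypothetical structure linking logic-model
    # components to monitoring indicators with a simple target check.
    from dataclasses import dataclass

    @dataclass
    class Indicator:
        name: str        # e.g., "sessions_delivered" (hypothetical)
        category: str    # "implementation", "quality", or "outcome"
        target: float    # performance target from the program plan
        observed: float  # value reported by the monitoring system

        def met_target(self) -> bool:
            return self.observed >= self.target

    # A miniature logic model: each program component carries its own indicators.
    logic_model = {
        "mentoring_component": [
            Indicator("sessions_delivered", "implementation", target=40, observed=36),
            Indicator("fidelity_rating", "quality", target=0.80, observed=0.85),
            Indicator("attitude_change_score", "outcome", target=0.25, observed=0.31),
        ],
    }

    # Continuous monitoring: flag failure points that may explain outcome deficits.
    for component, indicators in logic_model.items():
        for ind in indicators:
            status = "on target" if ind.met_target() else "below target"
            print(f"{component}/{ind.name} ({ind.category}): {status}")

In a production monitoring system, such indicator definitions would live in the management information system itself and be refreshed continuously rather than hard-coded.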

Evaluation Design and Methods

All three types of evaluation (policy, program, and performance) have in common the need to define study objectives, collect and analyze data, and present results and implications. To support these fundamental steps of evaluation, it is necessary to develop a detailed evaluation study design and to select appropriate evaluation methods.

EMT has experience with a full range of evaluation designs, and an understanding of the conditions under which each a) best meets the specific information needs of the request, and b) is constrained or enhanced by the particular field circumstances of the study. The desirability of a specific design is not simply determined by its rigor; for example, randomized trials are the best design for a relatively limited set of studies that focus on the internal validity of a well-defined intervention. While EMT understands the value of randomized designs in their several variations, and uses them when appropriate, we have developed expertise in a variety of alternative designs that fit other circumstances. EMT has been a leader in the application of some of these design alternatives, and we provide an extended example here from our work with natural variation designs.

Natural variation designs are increasingly recognized as the optimal design standard for large multi-site evaluations (MSEs), particularly when they involve complex systems (for example, community coalitions with significant discretion in implementing strategies in diverse community settings). The natural variation design is a recognized method for comparative systems studies in which the multiplicity of potentially relevant factors makes experimental comparison impractical, error-prone, and very weak with respect to external validity (generalizability). When generating lessons for application in different communities or schools is a goal, the lack of external validity inherent in experimental (similar system) designs is a serious limitation. In sum, efficacy trials focusing on experimental design lack the contextual relevance and complexity necessary to provide meaningful guidance for practitioners in real-world situations.

Natural variation designs use the inevitable contextual variation and complexity in large MSEs to analytic advantage. Rather than trying to control this variation to maximize internal validity, natural variation studies test for the robustness of the relationships between key organizational, intervention, and implementation variables and effectiveness (e.g., effect sizes or other comparable measures of effectiveness) over a large number of settings. The emphasis is on measurement of important characteristics of operation, intervention, and implementation so that they can be correlated with effectiveness measures. External validity (generalizability) is thereby maximized.
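
The analytic core of a natural variation design can be sketched briefly. The following Python fragment uses fabricated data (no real study data are reproduced here) to regress hypothetical site-level effect sizes on implementation measures across many sites; the variable names and coefficient values are invented for illustration.

    # Illustrative sketch only: a natural variation analysis in miniature,
    # using fabricated site-level data.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n_sites = 48

    # Hypothetical site-level measures of operation and implementation.
    implementation_strength = rng.uniform(0, 1, n_sites)  # e.g., dosage/quality index
    fidelity = rng.uniform(0, 1, n_sites)                 # delivered as planned?

    # Hypothetical per-site effect sizes (the effectiveness measure).
    effect_size = (0.1 + 0.4 * implementation_strength + 0.2 * fidelity
                   + rng.normal(0, 0.1, n_sites))

    # Rather than controlling variation away, test whether the implementation-
    # to-effectiveness relationship is robust across the full range of settings.
    X = sm.add_constant(np.column_stack([implementation_strength, fidelity]))
    result = sm.OLS(effect_size, X).fit()
    print(result.summary(xname=["const", "implementation", "fidelity"]))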

EMT also has experience with a variety of mixed-method and multi-method designs. Mixed-method designs incorporate a variety of methods into a single measurement process or analysis approach. As an example, EMT has developed a site visit measurement procedure that mixes closed-ended judgments, ratings, and calculated measures with open-ended commentary (e.g., key informant interviews, ambient observation by field visitors) in a single measurement protocol. This process gives us a unique ability to develop standard, comparable measures across sites while providing contextual information that supports the validity of the measures within each site, provides flexibility in modifying measures as the study proceeds, and aids in the interpretation of analysis. A simplified sketch of such a record appears below.

In summary, EMT places a premium on developing particular designs that maximize the quality of information for meeting specific information needs in actual study environments.
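
As noted above, a mixed-method site visit record might be represented as in the following Python sketch; the site identifier, rating items, and notes are hypothetical and are not drawn from an actual EMT protocol.

    # Illustrative sketch only: one hypothetical site visit record combining
    # closed-ended ratings, a calculated measure, and open-ended commentary.
    from statistics import mean

    site_visit_record = {
        "site_id": "site_017",  # hypothetical
        # Closed-ended ratings: standard and comparable across sites (1-5 scale).
        "ratings": {
            "leadership_engagement": 4,
            "service_quality": 3,
            "staff_capacity": 5,
        },
        # Open-ended commentary preserved for context and interpretation.
        "field_notes": [
            "Key informant interview: director reports recent staff turnover.",
            "Ambient observation: waiting room crowded during mid-day visit.",
        ],
    }

    # Calculated measure derived from the structured items.
    site_visit_record["implementation_index"] = mean(
        site_visit_record["ratings"].values()
    )
    print(site_visit_record["implementation_index"])

The structured ratings support cross-site comparison, while the open-ended notes travel with the record to aid interpretation, mirroring the mixed-method protocol described above.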

Conceptual frameworks (e.g., logic models, process maps) are an effective tool for many evaluation projects. Creating a close logical connection between data, analysis procedures, and planned and actual program activities is at the heart of useful evaluation, and EMT has developed a variety of mapping, flow chart, and logic model formats to meet the needs of different programs. Indeed, in a recent project with the Tennessee Department of Mental Health, we helped the state develop a needs assessment process closely tied to its contracting and performance monitoring processes. The success of this linking of policy logic, program design, and data was central to the Department winning one of only four state awards in the highly competitive Partnerships for Success program.

For useful evaluation, it is critical to develop measures that a) are valid indicators (both face and empirical) of specific concepts in policy or program logic, b) have sufficient variance to reflect meaningful differences in the real-world conditions of interest, c) are reliable, precise, and accurate enough to support meaningful analysis (e.g., not be swamped by measurement error), and d) are reasonable with respect to burden and resource needs. A strong emphasis on measures' relevance and variance characteristics is essential to solid evaluation. While this is critical in all applications, it is particularly important in performance monitoring because of the explicit link to continuous decisions.

Multi-Site Research Projects

EMT has designed and implemented nine national multi-site studies. These projects have involved anywhere from 12 to 48 sites, budgets ranging from $1 million to $12 million, and durations of three to six years. Large, multi-site research projects are especially significant because they generate science-based evidence with clear utility for policy and professional practice. These studies have given us strong capacity in a) collecting and managing large, multilevel, multi-site data sets, including the development of data scrubbing techniques and methods of refining measures in multi-context environments; b) collecting multi-method data from program records and through site visits to develop measures of program context, procedures, and implementation; and c) facility with a variety of statistical analysis approaches, including structural equation modeling (SEM), hierarchical linear modeling (HLM), and adaptations of meta-analytic techniques that meet the requirements of multi-site field research.
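
To indicate what such a multilevel analysis looks like in practice, the sketch below fits a random-intercept model (HLM-style) to fabricated multi-site data using the statsmodels library; the sample sizes and effect values are invented for illustration and do not reflect any EMT study.

    # Illustrative sketch only: a multilevel (HLM-style) analysis of fabricated
    # multi-site data, with participants nested within sites.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)
    n_sites, n_per_site = 20, 50
    site = np.repeat(np.arange(n_sites), n_per_site)

    site_effect = rng.normal(0, 0.5, n_sites)[site]  # between-site variation
    treatment = rng.integers(0, 2, n_sites)[site]    # site-level intervention flag
    outcome = 1.0 + 0.3 * treatment + site_effect + rng.normal(0, 1, len(site))

    df = pd.DataFrame({"site": site, "treatment": treatment, "outcome": outcome})

    # A random intercept for each site separates within-site from between-site
    # variance, as multilevel modeling requires for clustered multi-site data.
    model = smf.mixedlm("outcome ~ treatment", data=df, groups=df["site"]).fit()
    print(model.summary())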

Interpreting and Communicating Evaluation Results

Accurate interpretation and effective communication of results are critical to the utility of evaluation findings. Throughout this discussion, we have reiterated EMT's perspective on ensuring that evaluation measures, study design, and analysis are formulated with strong consideration of how these decisions will affect the ultimate usefulness of the information for decision making and for improvement of programs, processes, or outcomes. We are just as careful about the presentation and dissemination of information suitable for various audiences. We recognize, for example, that providing useful feedback (e.g., properly de-identified data and findings) to sites participating in studies is a benefit that increases study cooperation and quality. We use a variety of dissemination and presentation techniques, including in-person presentations; written reports, briefs, and summaries; and, in appropriate circumstances, videos. We also collaborate in disseminating information in professional and research journals, co-authoring with agency personnel and co-presenting at appropriate professional conferences.

Technology Resources Supporting Evaluation Projects

In addition to human expertise, the successful management of an evaluation project requires specific technology resources and equipment. EMT utilizes the following technological capacities:

- Statistical software. EMT uses SPSS as its standard statistical software, making sure that we have the full and latest SPSS version. We also have additional software (e.g., Mplus) for special applications.
- Online survey software and services. EMT frequently uses Survey Monkey, an online survey administration tool.
- Listserv and discussion group software. EMT extensively uses email listserv and discussion group software (LearnLinc) to support its external work with partners, consultants, and grantees, and internally with its staff.
- Conferencing. EMT has a state-of-the-art digital phone system with built-in conferencing capabilities. EMT Live is EMT's videoconferencing capability, utilizing iLinc Web and video conferencing.
- An Ethernet-based local area network (LAN). Our network serves all staff, as well as conference rooms, with a minimum of 100-megabit access. The system has a comprehensive back-up process and off-site storage.

Conclusion

The intent of this paper has been to describe EMT's approach to conducting evaluation projects. Our approach emphasizes collaboration with the client, the selection of evaluation design elements that balance methodological rigor with practicality and resource constraints, and a focus on producing results that have relevance and utility in organizational operations. EMT has the experience and resources to conduct evaluation projects in a way that ensures positive results for our clients.

For additional information, please contact EMT at (916) 983-6680 or email us at info@emt.org.