Environmental Program Evaluation in Government: A Perspective from the US


Presentation to the Environmental Evaluators Network-Canada, Ottawa, Canada, September 2009. Katherine Dawes, Evaluation Support Division, National Center for Environmental Innovation, US Environmental Protection Agency.

Observations: Government, Evaluation and Environmental Organizations

- Evidence-based policy is today's good-government slogan.
- Program evaluation in the federal government has been evolving dramatically since the late 1990s, strongly influencing US investments in public health, education and international development.
- Most evaluation policy in the US has been driven by think tanks, policy-makers, and policy advocates, but rarely by evaluators themselves, including federal evaluators.
- Environmental organizations (government/foundations/NGOs) face strong drivers for showing evidence-based results, but they:
  - Crash into culture-change issues when promoting evaluation
  - Do not have formal evaluation policies or normative practices
  - Typically lack strategic investment or evaluation planning
  - Have a slim bench of accessible, high-quality program evaluation consultants
  - Often focus their evaluation energy on reporting measures

Program Evaluation Has a Key Role in Performance Management

Performance management includes activities to ensure that goals are consistently being met in an effective and efficient manner. Performance management tools include logic models, performance measurement and program evaluation.

- Logic Model: a tool/framework that helps identify the program/project resources, activities, outputs, customers, and outcomes.
- Performance Measurement: helps you understand what level of performance is achieved by the program/project.
- Program Evaluation: helps you understand and explain why you're seeing the program/project results.

Orientation/Approaches to Evaluation

Audit/Inspection Shops:
- Orientation: accountability
- Audience: external
- Questions: What is the level of performance? Did the program meet requirements? Should the program continue?

Evaluation Shops:
- Orientation: learning and program improvement
- Audiences: internal and external
- Questions: What is the level of performance? What outcomes have been achieved and why? What aspects of the program and/or intervention lead to these outcomes? What role did context play in the outcomes? How can the program improve? Should the intervention scale up?

The Program Evaluation Field

- American Evaluation Association (AEA), a leading international organization
  - Over 5,500 members representing all 50 US states and over 60 foreign countries
  - Guiding Principles for Evaluators (1994, revised 2004): approved by AEA membership to guide professional practice
  - Evaluation Policy Task Force: started in 2007 to identify policies critically important to the practice
  - AEA's history, and its connections to nearly 100 professional societies worldwide, exemplify the evolution and professionalization of the field, especially in the past 15 years
- Canadian Evaluation Society
  - Groundbreaking Professional Designations project
  - Close partnership with AEA: quadrennial joint conference, joint membership
- Peer-reviewed journals: American Journal of Evaluation, Canadian Journal of Program Evaluation
  - 2009 special issues on environmental program evaluation: Evaluation and Program Planning and New Directions for Evaluation
- Universities offer graduate programs or certificate programs in evaluation, predominately in education and public health departments, with some in public administration
- Considered a multi-disciplinary social science
- Practice has a significant degree of specialization by topic and/or method, with many subfields
  - Education and public health are the oldest, biggest and strongest (1960s)
  - Environmental program evaluation is one of the newer, smaller and weaker (2000s)
  - Stronger subfields have a greater degree of alignment on data collection, performance measurement and evaluation practice among key players

Environmental Program Evaluation Subfield

- Environmental program evaluation has some practice, little theory.
- Environmental education, energy efficiency, and R&D evaluation are on the forward edge of the curve, but have specialized needs.
- Biodiversity conservation has a history of project-level adaptive management.
- AEA has thousands of experts (academics, consultants, practitioners) who focus on practice and theory for evaluation in education and/or public health, but the environmental field has few.
- Weak partnerships on common performance measures are connected to weak (or non-existent) partnerships on program evaluation.
- Many of the subfield's problems are not unique to the subfield, though relative inexperience makes the challenges more acute.

History of Program Evaluation at EPA

1980-1995
- Program evaluation was housed in the policy office, with a close connection to strategic planning.
- Mostly process evaluations for internal audiences, with heavy reliance on qualitative interview methods.
- No networking with academics or other federal staff in the broader program evaluation field or discipline.
- In-house consulting staff hit its peak in the early 1980s (25-30 FTE), then shrank during the mid-1980s reduction of evaluation resources across the federal government.
- The division was dispersed in 1995 due to lack of management support.

2000-2008
- During the 2000 reorganization into a policy and innovation office, Agency senior managers asked that the program evaluation function be reestablished.
- Original expectation: support innovation projects, build the capacity of other offices.
- Evolved expectation: advocate and provide training for the full range of performance management; maintain centralized and sophisticated program evaluation expertise.
- Broader portfolio of evaluation types (process, outcome, impact, design) and methods (qualitative and quantitative).
- EPA's experience is representative of the environmental program evaluation subfield: growing networks with external experts, other federal agencies, and professional societies.
- EPA's Deputy Administrator (in the Chief Operating Officer role) was a champion of evaluation's role in performance management.

US EPA's Evaluation Support Division (circa 2009)

ESD's dual mission:
- Innovation analysis: support Agency innovation and system change by evaluating and communicating the results and lessons learned of innovations.
- Capacity building: build the capacity of EPA to conduct program evaluation throughout the Agency.

A small staff with expertise in various technical and cultural aspects of the program evaluation field, and limited resources, support a leveraging and networking approach:
- Develop tools that relay basic information and guidance
- Run EPA's annual program evaluation competition
- Manage 3rd-party evaluations of innovations
- Deliver training and technical assistance
- Leverage networks of evaluation experts and clients

Innovation analysis and capacity building often overlap; the balance in the portfolio shifts as needed.

US Federal Evaluation Drivers, 1993-2007

- Government Performance and Results Act (1993)
- President's Management Agenda (2002)
- Office of Management and Budget Program Assessment Rating Tool calls for independent, objective program evaluations (2002)
  - Includes guidance that describes randomized control trials as the federal program evaluation gold standard
- President's Executive Order 13450, Improving Government Program Performance (2007)
  - Performance Improvement Council sponsors OMB's Evaluation Working Group

US Federal Evaluation Policies: What's Next?

"I will also go through the federal budget line by line, eliminating programs that no longer work and making the ones that do work better and cost less, because we cannot meet 21st-century challenges with a 20th-century bureaucracy."

Then-Senator Barack Obama, September 2008

"...the Budget I am sending to you includes a separate volume of terminations, reductions, and savings that my Administration has identified... In it, we identify programs that do not accomplish the goals set for them, do not do so efficiently, or do a job already done by another initiative. Overall, we have targeted more than 100 programs that should be ended or substantially changed, moves that will save nearly $17 billion next year alone."

President Barack Obama, Budget Transmittal Letter to Congress, May 2009

"The Obama Administration will work with [career federal executives] to fundamentally reconfigure how the Federal Government assesses program performance... A reformed performance improvement and analysis framework also would emphasize program evaluation. Just as the Administration is proposing historic investments in comparative effectiveness research so that our health care services will produce better results, the Administration will conduct quality research evaluating the effectiveness of government spending in order to produce better results."

President's Budget, May 2009

"Rigorous ways to evaluate whether programs are working exist. But too often such evaluations don't happen. They are typically an afterthought when programs are designed, and once programs have been in place for a while, evaluating them rigorously becomes difficult from a political economy perspective. This has to change... Wherever possible, we should design new initiatives to build rigorous data about what works, and then act on evidence that emerges... By instilling a culture of learning into federal programs, we can build knowledge so that spending decisions are based not only on good intentions, but also on strong evidence that carefully targeted investments will produce results."

Peter Orszag, Director, Office of Management and Budget, OMB blog entry, 6/8/2009

Federal Evaluation Policies: Leadership Questions

Key leadership at the Office of Management and Budget is asking:
- Where do we really need program evaluation?
- What federal institutional support is needed?
- What needs to change in the federal evaluation culture to improve the discipline across the board?
- Should the US adopt policies similar to the Standard on Evaluation for the Government of Canada?

Program Evaluation at US EPA: Leadership Questions

What direction? The policy office is undergoing a reorganization: how does program evaluation fit in? How will we configure EPA's program evaluation function in light of:
- Obama Administration management policies (e.g., evidence-based decision-making, increasing productivity and data transparency)?
- EPA Administrator Lisa Jackson's priorities for the Agency (environmental justice, children's health and climate change)?
- The state of EPA's evaluation culture?

Evaluation Culture in Environmental Organizations: Observations

Evaluation Appreciation:
- Program improvement perspective
- Prospective focus
- Learning perspective
- Lessons learned, best practices
- "Helps me fix problems"
- Done with me
- I decide to opt in
- Good government: "We should be doing it"
- Added value
- Training and capacity-building efforts

Evaluation Apprehension:
- Accountability perspective
- Retrospective focus
- Threatening: "Gotcha!" mentality
- Identifies problems
- I lose control
- Done to me
- Someone else says we have to do it
- Luxury item; not enough $
- Inadequate staffing within programs

Where Is the Environmental Program Evaluation Body of Work on the Evaluation Spectrum?

[Diagram: a logic-model chain running from Resources/Inputs to Activities to Outputs to Customers, then to short-term, intermediate, and longer-term outcomes (the strategic aim). Design evaluation addresses the HOW and WHY at the front of the chain; process evaluation, outcome evaluation and impact evaluation map to successive stages along it.]

Adapted from Evaluation Dialogue Between OMB and Federal Evaluation Leaders, April 2005

Overarching Recommendations from the Peer-reviewed Literature

EPA/ESD has been leading a meta-analysis of peer-reviewed literature with findings related to improving the evaluation of environmental programs. Over 300 articles have been reviewed to date, and the article list is updated periodically. Findings have been shared and discussed with academics writing about environmental management and natural resource conservation.

Overarching recommendations:
- Commit to an evidence-based culture: like other disciplines such as medicine and public health, the environmental community must commit to an "effectiveness revolution" where decisions are based on evidence of effectiveness.
- Embrace collaboration: collaboration across organizations, disciplines and stakeholders is necessary to efficiently improve the effectiveness of environmental programs.
- Clearly define common terms: clearly defining the terms and approaches used in evaluation will enable collaboration that leads to better environmental practices.
- Integrate evaluation into program design: building evaluation into programs lets them develop measures that enable more efficient improvement and more clearly demonstrate program effectiveness.
- Clearly define program goals and objectives: collaboratively developing goals that clearly communicate the purpose of activities will result in more meaningful measures.
- Develop clear and diverse measures of success: measures should be clearly connected to program goals and activities, and should be interdisciplinary and developed for different parts of the program cycle (i.e., activity, output, outcome).
- Adopt state-of-the-art evaluation methods: use a diversity of evaluation methods aimed at determining program effectiveness and impact.
- Use evaluations: use evaluations to improve programs, determine their effectiveness and improve decision-making.

How Do We Get to Where We Want to Go?

Rebalance Our (Collective) Evaluation Portfolios

We need more process evaluation, but we need even more impact evaluation. How many times have our evaluations found that no data are available to determine environmental impact?

Grapple with our data sources and protocols:
- Available data are often neither obviously nor easily useful in answering questions about a program's impact, because of issues related to data collection: proximity, frequency of collection, and differences in collection protocols across locations and monitoring programs.
- We need to assess the availability of data, document why it is or isn't useful to the program, and document gaps in data that hinder the program's ability to determine whether it has an impact on priority environmental characteristics.

Reframe monitoring programs:
- Many were established when the main goal was to determine the status of the environment (a baseline) and begin to document trends in environmental quality.
- Several are controlled by individual academic institutions: closed access and no universal protocols, with little connection to determining program effectiveness in affecting environmental change.

Integrate evaluation into program design:
- This represents an opportunity to improve our questions about the environmental impact of programs.
- Ask questions about the effectiveness of the program or policy prior to implementation.
- Hone in on the specific questions of interest, and the data requirements and availability, during the design stages.
- Identify data gaps and opportunities or needs to collect new data.
- Help programs set environmental goals and objectives that are more realistic and measurable.

Leverage and Build Networks

- Develop working relationships with expert practitioners and academics through our major associations (AEA, CES):
  - Environmental Program Evaluation Topical Interest Group
  - Evaluation Policy Task Force
  - Others?
- Collaboration for Environmental Evidence: supports development of a systematic review process for environmental interventions, modeled after the medical field.
  - Centre for Evidence-based Conservation at Bangor University (UK); Georgia State is exploring establishing a US center.
  - Opportunities in Canada?
- Environmental Evaluators Network: builds connections between environmental policy academics, evaluation practitioners, and evaluation funders; increases the sophistication of the environmental evaluation subfield.