Evaluating your educational initiative


Evaluating your educational initiative. Murari Suvedi, Michigan State University Extension, June 2002.

Realities of Today: fewer dollars, increased competition, greater expectations for effectiveness, increasing scrutiny, and a greater need for collaboration.

Some Accountability Questions: We gave you $500,000 last year--what did your agency do with it? We have supported your program for 10 years; why should we continue this support? What are you doing to improve or terminate ineffective programs? What new programs need to be developed to meet the needs of the people you intend to serve?

Accountability involves evaluation and reporting. Evaluation is a continual and systematic process of assessing the value or potential value of programs to guide decision-making for the program's future. Reporting covers inputs and activities, and it covers impacts: what difference do we make?

Why evaluate? Planning purposes. Analysis of program effectiveness or quality. Direct decision-making. Maintaining accountability. Project impact assessment. Advocacy.

When we evaluate a project, we examine the context of the project, study its goals and objectives, collect information about the project's inputs and outcomes, compare findings to some pre-set standards, make a value judgment about the project, and report findings to stakeholders.

Documenting Impact. Impact is a clear description of the value of a program to people and society. Generally, these are the longer-term benefits to clients or society. It could be: increased knowledge, improved attitudes, financial gain, production efficiencies, preservation of environmental resources, modified behavior, or an improved condition.

Types of evaluation. Formative or process evaluation: focuses on information for program improvement, modification, and management. Summative or impact evaluation: focuses on determining program results and effectiveness (merit and worth); it serves the purpose of making major decisions about program continuation, expansion, reduction, and funding.

When to evaluate? The timing of program evaluation: at the project design stage, at the project start-up stage, while the program is in progress (formative evaluation), at program wrap-up (summative evaluation), and in follow-up studies.

Some Evaluation Models: the Targeting Outcomes of Programs (TOP) Model and the Program Logic Model.

Targeting Outcomes of Programs (TOP) Evaluation Model (Bennett & Rockwell, 1995). The model pairs program development (assessing needs and planning resources, activities, and participation) with program performance (evaluating processes and outcomes) along a single hierarchy of levels: Resources, Activities, Participation, Reactions, KOSA (K = Knowledge, O = Opinions, S = Skills, A = Aspirations), Practices, and SEEC (S = Social, E = Economic, E = Environmental, C = Conditions).

The Logic Model: Situation, then Inputs, then Outputs, then Outcomes-Impact (short term, intermediate term, long term).

Logic Evaluation Framework
INPUTS: staff, volunteers, time, money, materials, equipment, technology, partners.
OUTPUTS (Activities): workshops, meetings, camps, curriculum, publications, media, web site, projects, test plots, field days, research.
OUTPUTS (Participation): who needs to participate, be involved, or be reached? Number, characteristics, reactions.
OUTCOMES-IMPACT (short and medium term): LEARNING (awareness, knowledge, attitudes, skills, aspirations) and ACTION (behavior, practice, decisions, policies, social action).
OUTCOMES-IMPACT (long term): IMPACT (social, economic, environmental, ecological, technological).
Context: influential factors.
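One low-effort way to keep this framework handy during planning is to jot it down as a simple fill-in-the-blanks structure. The minimal sketch below does this in Python; every name and entry is a hypothetical placeholder, not an item from any particular program.

```python
# Illustrative sketch only: recording a program's logic model as a plain
# Python dictionary. The category names follow the framework above; the
# entries are hypothetical placeholders to be replaced with your own items.
logic_model = {
    "situation": "need or problem the program addresses",
    "inputs": ["staff", "volunteers", "time", "money", "materials", "partners"],
    "outputs": {
        "activities": ["workshops", "field days", "publications", "web site"],
        "participation": {"who": "area producers", "number_reached": 120},
    },
    "outcomes": {
        "short_term_learning": ["awareness", "knowledge", "skills"],
        "medium_term_action": ["practice change", "decisions"],
        "long_term_conditions": ["economic", "environmental"],
    },
    "context": ["influential external factors"],
}

# Print a one-line summary of each component as a quick completeness check.
for component, content in logic_model.items():
    print(f"{component}: {content}")
```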

What is an output? Activities we need to conduct to ensure that the project/program goals are met. These could be: workshops, publications, field days, test plots, or a web site.

What Is an Outcome? The end result or effect linked to the program. It answers the "so what?" question. It can be expected or unexpected, positive or negative. Outcomes fall along a continuum from short-term to final and are often used synonymously with impact.

OUTCOMES. Short-term (learning): awareness, knowledge, attitudes, skills, opinions, aspirations, motivations. Medium-term (action): behavior, practice, decisions, policies, social action. Long-term (conditions): human, economic, civic, environmental. Adapted from Ellen Taylor-Powell, University of Wisconsin, 2000.

What is an Indicator? A marker that can be observed to show that something has changed. It can help people notice changes at an early stage of a project's impact. Examples: youth attendance in programs, crime rate, home ownership.

Characteristics of Good Indicators: relevant to the objectives; understandable by stakeholders; realizable, given time, dollars, and resources; conceptually well founded; limited in number; easy to use and interpret; and providing a representative picture.

Characteristics of Outcomes: derived from stakeholders and program participants (client-focused); measurable (remember -- measurable does not necessarily refer to quantitative measurement!); specific and clear (each target group may have a different outcome); logically linked to program action; and attainable.

Outcome Examples: change in awareness; change in knowledge; change in attitudes, motivations, or aspirations; change in skills; change in behavior; change in practices or decisions; change in policies; change in circumstances and/or systems; change in human, economic, civic, or environmental conditions.

Outcome Hierarchy: from awareness and attitude changes, through knowledge, skills, and behavior, up to system/circumstance change. Adapted from Claude Bennett, 1976.

Outcome Chain Example. The chain runs from learning, through behavior change, to systems-level/ultimate benefits:
1. Farmers learn about alternative marketing practices.
2. Farmers include alternative marketing options in planning.
3. Farmers adopt alternative marketing practices.
4. Farmers increase income.
5. Overall farm profitability increased.
6. Farm profits used to support rural economy.
7. Rural economic activity (tax base) increased.
8. Community infrastructure enhanced.
9. Profitability of local business increased.
10. More jobs available in rural community.
11. Population loss in rural community slowed.
12. Rural community is sustained.

Examples of Outcome Indicators: Nutrient Use and Management. Nitrogen fertilizer use: amount of decrease/increase (lbs/acre). Use of cover crops: amount of decrease/increase (acres). Well water quality: change in nitrate/pesticide levels (ppm).

Examples of Outcome Indicators: Ag Chemical, Pest, and Weed Management. Herbicide use: lbs of active ingredient per acre. Use of IPM techniques: number of acres. Weed incidence: % of field crop or % of herd affected. Insect pest incidence: % of field crop or % of herd affected. Disease incidence: % of field crop or % of herd affected.

Examples of Outcome Indicators: Quality of Life / Social Benefits. Work hours per day: hrs/acre or per head. Time for community activities: hrs/week. Marketing of farm produce locally: % of total. Personal and family health: number of sick days per year.
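Indicators like these are usually reported as the change from a baseline to a follow-up measurement. The short sketch below illustrates that arithmetic with hypothetical nitrogen-use values.

```python
# Illustrative sketch with hypothetical numbers: computing a change indicator
# (nitrogen fertilizer use, lbs/acre) from baseline and follow-up measurements.
baseline_lbs_per_acre = 160.0   # hypothetical pre-program average
followup_lbs_per_acre = 138.0   # hypothetical post-program average

change = followup_lbs_per_acre - baseline_lbs_per_acre
percent_change = 100.0 * change / baseline_lbs_per_acre

print(f"Change in nitrogen use: {change:+.1f} lbs/acre ({percent_change:+.1f}%)")
# Prints: Change in nitrogen use: -22.0 lbs/acre (-13.8%)
```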

Evaluation Data Collection. Several methods are available; whichever you use, you have to be objective.

Be precise about what you actually need to know. Don't be vague, biased, or uncritical. Think in terms of results.

Focus Groups Organized discussions led by a moderator. Involve 8-10 people. Stimulate thinking and elicit ideas about a specific topic. Seek ideas from those who will use the results. Used to generate ideas or assess needs.

Choosing A Survey Method Mail surveys Telephone interviews Face-to-face interviews Drop-off surveys

Factors to Consider in Choosing the Method: resources available (people, time, money); experience and expertise; facilities at your disposal; and the sensitivity of the method to various kinds of errors.

Mail Surveys: Strengths. Require the least amount of resources and are the easiest to do. Sampling error can be minimized at low cost. They provide a sense of privacy to respondents and are less sensitive to biases introduced by researchers/interviewers.
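The point that sampling error can be reduced at low cost follows from the standard margin-of-error formula for a simple random sample. The sketch below illustrates it with hypothetical sample sizes.

```python
# Illustrative sketch: approximate 95% margin of error for an estimated
# proportion from a simple random sample, z * sqrt(p * (1 - p) / n).
# Sample sizes are hypothetical; p = 0.5 gives the most conservative value.
import math

def margin_of_error(n, p=0.5, z=1.96):
    return z * math.sqrt(p * (1 - p) / n)

for n in (100, 200, 400, 800):
    print(f"n = {n:4d}: margin of error = +/-{100 * margin_of_error(n):.1f} percentage points")
# Doubling the number of completed questionnaires shrinks the margin of error
# by a factor of about 1.4 -- relatively cheap to do with a mail survey.
```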

Mail Surveys: Weaknesses. Sensitivity to non-coverage error. Non-response error tends to be high. Not appropriate for less-educated or illiterate populations. The researcher has little control over the quality of responses. Potential for item non-response problems.

Telephone Interviews: Strengths. Ability to produce results quickly. Easy to deal with problems that may arise. Cost falls between face-to-face and mail surveys.

Telephone Interviews: Weaknesses. Not all people have telephones. Telephone directories are incomplete. They depend on what can be communicated orally. The interviewer could introduce bias. Respondents tend to give socially acceptable answers.

A Telephone Interview Is Appropriate When: members of the population have telephones, questions are relatively straightforward, experienced help is available, and quick turnaround is important.

Face-to-face interviews. Popular before the 1970s. A credible source of information. High response rates. Accurate data.

Face-to-face interviews: strengths. Suited when: no population lists are available; people are not likely to respond by phone or mail; education levels are low; the questionnaire is complex; and the project is well funded, with experienced interviewers and professional help available.

Drop-off surveys. People deliver the questionnaire by hand to households or businesses. Respondents complete it on their own and return it by mail or leave it out to be collected. Well suited for: small community/neighborhood surveys, short and simple questionnaires, and projects with a small staff but a relatively large sample size.

Evaluation Planning (Group Exercise). What are the outcomes of the project? What indicators will be used? What is the method of gathering data? When will data be collected? Who will collect the data? How will the data be analyzed? How do you plan to share evaluation findings with stakeholders?

Writing Impact Statements. Consider: Who is your audience? How many participated in what program? What was the change (reaction? knowledge? skills/behavior? profits?)? What proportion of participants will have what type of change? What is the time frame for that change?

Example of an impact statement: Half of the 120 beef producers in the East Central region who participated in a record-keeping workshop in winter 2001 adopted computer software within a year to keep farm records. Members of the Pork Producers Association received weekly updates on hog markets, and 25% of them indicated that this information was very helpful in their pork marketing decisions.
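Figures like "half of the 120 producers" come directly from follow-up counts. The short sketch below shows the arithmetic, using numbers that mirror the example above (the counts are otherwise hypothetical).

```python
# Illustrative sketch: deriving the figures in an impact statement from
# follow-up counts. The numbers mirror the beef-producer example above
# and are otherwise hypothetical.
participants = 120   # producers who attended the record-keeping workshop
adopters = 60        # reported adopting farm-record software within a year

adoption_rate = adopters / participants
print(f"{adoption_rate:.0%} of the {participants} participating producers "
      f"adopted record-keeping software within one year.")
# Prints: 50% of the 120 participating producers adopted record-keeping
# software within one year.
```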

Success Stories. Mary is now 19 years old and about to enter the local university on a full scholarship. When she was referred to our program 3 years ago, her future looked bleak. She had quit school, was hanging out in the streets, and was drinking heavily. She participated actively in group counseling, bonded with one of our staff, gave up drinking and smoking, delivered a healthy baby, completed her high school education, got a job, moved into her own apartment, and competed successfully for the scholarship.

Evaluate Your Program's Outcomes. Refer to the "How to Conduct Evaluation of Extension Programs" guide we have provided. Contact us if you need help planning your evaluation or with data analysis and report writing.