Evaluating Your Educational Initiative. Murari Suvedi, Michigan State University Extension, June 2002
Realities of Today Fewer Dollars Increased competition Greater expectations for effectiveness Increasing scrutiny Greater need for collaboration
Some Accountability Questions: We gave you $500,000 last year; what did your agency do with it? We have supported your program for 10 years; why should we continue this support? What are you doing to improve or terminate ineffective programs? What new programs need to be developed to meet the needs of the people you intend to serve?
Accountability involves evaluation and reporting. Evaluation is a continual and systematic process of assessing the value or potential value of programs to guide decision-making for the program's future. Reporting of inputs and activities. Reporting of impacts: what difference do we make?
Why evaluate? Planning purposes Analysis of program effectiveness or quality Direct decision-making Maintain accountability Project impact assessment Advocacy
When we evaluate a project We examine the context of the project. Study its goals and objectives. Collect information about the project's inputs and outcomes. Compare findings to some pre-set standards. Make a value judgment about the project. Report findings to stakeholders.
Documenting Impact Impact is a clear description of the value of a program to people and society. Generally, these are the longer-term benefits to clients or society. It could be: Increased knowledge Improved attitudes Financial gain Production efficiencies Preservation of environmental resources Modified behavior Improved condition
Types of evaluation Formative or process evaluation: Focuses on information for program improvement, modification, and management. Summative or impact evaluation: Focuses on determining program results and effectiveness (merit and worth). Serves the purpose of making major decisions about program continuation, expansion, reduction, and funding.
When to evaluate? The timing of program evaluation: Project design stage Project start-up stage In-progress or formative evaluation Program wrap-up or summative evaluation Follow-up studies
Some Evaluation Models Targeting Outcomes of Programs (TOP) Model Program Logic Model
Targeting Outcomes of Programs (TOP) Evaluation Model (Bennett & Rockwell, 1995)
Program development (assess needs, from the top down): SEEC conditions, Practices, KOSA, Reactions, Participation, Activities, Resources.
Program performance (evaluate processes and outcomes, from the bottom up): Resources, Activities, Participation, Reactions, KOSA, Practices, SEEC conditions.
SEEC = Social, Economic, Environmental Conditions. KOSA = Knowledge, Opinions, Skills, Aspirations.
The Logic Model: Situation → Inputs → Outputs → Outcomes-Impact (short term, intermediate term, long term)
Logic Evaluation Framework
INPUTS: staff, volunteers, time, money, materials, equipment, technology, partners.
OUTPUTS (Activities): workshops, meetings, camps, curriculum, publications, media, web site, projects, test plots, field days, research.
OUTPUTS (Participation): who needs to participate, be involved, be reached; number, characteristics, reactions.
OUTCOMES-IMPACT (short and medium term): LEARNING (awareness, knowledge, attitudes, skills, aspirations); ACTION (behavior, practice, decisions, policies, social action).
OUTCOMES-IMPACT (long term): IMPACT (social, economic, environmental, ecological, technological).
Context: influential factors.
What is an output? Activities we need to conduct to ensure that the project/program goals are met. These could be: Workshops Publications Field days Test plots Web site
What Is an Outcome? End result or effect linked to the program Answers the "so what?" question Can be expected or unexpected Can be positive or negative Falls along a continuum from short-term to final Often used synonymously with impact
OUTCOMES
SHORT-TERM (Learning): awareness, knowledge, attitudes, skills, opinions, aspirations, motivations.
MEDIUM-TERM (Action): behavior, practice, decisions, policies, social action.
LONG-TERM (Conditions): human, economic, civic, environmental.
Adapted from Ellen Taylor-Powell, University of Wisconsin, 2000
What is an Indicator? A marker that can be observed to show that something has changed. It can help people notice changes at an early stage of a project's impact. Examples: Youth attendance in programs, crime rate, home ownership
Characteristics of Good Indicators Relevant to the objectives Understandable by stakeholders Realizable, given time, dollars, and resources Conceptually well founded Limited in number Easy to use and interpret Provide a representative picture
Characteristics of Outcomes Derived from stakeholders and program participants (client-focused) Measurable (remember -- measurable does not necessarily refer to quantitative measurement!) Specific and clear (each target group may have a different outcome) Logically linked to program action Attainable
Outcome Examples Change in awareness Change in knowledge Change in attitude, motivations or aspirations Change in skills Change in behavior Change in practice or decisions Change in policies Change in circumstance and/or system Change in human, economic, civic or environmental condition
Outcome Hierarchy (highest to lowest): system/circumstance changes, behavior, skills, knowledge, awareness, attitude changes. Adapted from Claude Bennett, 1976
Outcome Chain Example (the chain runs from Learning, through Behavior, to Systems/Ultimate Benefit):
1. Farmers learn about alternative marketing practices
2. Farmers include alternative marketing options in planning
3. Farmers adopt alternative marketing practices
4. Farmers increase income
5. Overall farm profitability increased
6. Farm profits used to support rural economy
7. Rural economic activity (tax base) increased
8. Community infrastructure enhanced
9. Profitability of local businesses increased
10. More jobs available in rural community
11. Population loss in rural community slowed
12. Rural community is sustained
Examples of Outcome Indicators
Nutrient use and management: nitrogen fertilizer use (amount of decrease/increase, lbs/acre); use of cover crops (amount of decrease/increase, acres); well water quality (change in nitrate/pesticide levels, ppm).
Ag chemical, pest, and weed management: herbicide use (lbs active ingredient/acre); use of IPM techniques (no. of acres); weed, insect pest, and disease incidence (% of field crop or % of herd affected).
Quality of life/social benefits: work hours per day (hrs/acre or per head); time for community activities (hrs/week); farm produce marketed locally (% of total); personal and family health (no. of sick days/yr).
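Each indicator above ultimately reports a change between a baseline and a follow-up measurement. As a minimal sketch of that calculation (the function name and the figures are hypothetical illustrations, not from the workshop materials):

```python
# Hypothetical sketch: absolute and percent change in an outcome
# indicator, e.g. nitrogen fertilizer use measured in lbs/acre.
def indicator_change(baseline, follow_up):
    """Return (absolute change, percent change) between two measurements."""
    change = follow_up - baseline
    percent = 100.0 * change / baseline
    return change, percent

# Example: fertilizer use drops from 150 to 120 lbs/acre.
change, percent = indicator_change(150.0, 120.0)
print(f"Change: {change:+.0f} lbs/acre ({percent:+.0f}%)")  # Change: -30 lbs/acre (-20%)
```

Reporting both the absolute and the percent change keeps the indicator interpretable for stakeholders regardless of farm size.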
Evaluation Data Collection Many data collection methods are available; whichever you choose, be objective.
Be Precise About what you actually need to know. Don't be vague, biased, or uncritical. Think in terms of results.
Focus Groups Organized discussions led by a moderator. Involve 8-10 people. Stimulate thinking and elicit ideas about a specific topic. Seek ideas from those who will use the results. Used to generate ideas or assess needs.
Choosing A Survey Method Mail surveys Telephone interviews Face-to-face interviews Drop-off surveys
Factors to Consider in Choosing the Method Resources available (people, time, $) Experience and expertise Facilities at your disposal Sensitivity of the method to various kinds of errors
Mail Surveys Require the least resources and are the easiest to conduct. Sampling error can be minimized at low cost. Provide a sense of privacy to respondents. Less sensitive to biases introduced by researchers/interviewers.
Mail Surveys: Weaknesses Sensitive to non-coverage error. Non-response error tends to be high. Not appropriate for less-educated or illiterate populations. Researcher has little control over the quality of responses. Potential for item non-response problems.
Telephone Interviews: Strengths Ability to produce results quickly. Easy to deal with problems that may arise. Cost falls between face-to-face and mail surveys.
Telephone Interviews: Weaknesses Not all people have telephones. Telephone directories are incomplete. Depend on what can be communicated orally. Interviewers can introduce bias. Respondents tend to give socially acceptable answers.
Telephone Interview is Appropriate when: Members of population have telephones Questions are relatively straightforward Experienced help is available Quick turnaround is important
Face-to-face interviews Popular before the 1970s Credible source of information High response rate Accurate data
Face-to-face interviews: strengths Suited when: No population lists are available. People are not likely to respond by phone or mail. Education level is low. The questionnaire is complex. The project is well funded, and experienced interviewers and professional help are available.
Drop-off surveys People deliver questionnaires by hand to households/businesses. Respondents complete them on their own and return them by mail or leave them out to be collected. Well-suited for: Small community/neighborhood surveys Short and simple questionnaires Projects with a small staff but a relatively large sample size
Evaluation Planning (Group Exercise) Outcomes of the project Indicators to be used Method of gathering data When will data be collected? Who will collect data? How will they be analyzed? How do you plan to share evaluation findings with stakeholders?
Writing Impact Statements Consider: Who is your audience? How many participated in what program? What was the change (Reaction? Knowledge? Skill/Behavior? Profits?) What proportion of participants will have what type of change? What is the time frame for that change?
Example of impact statement Half of the 120 beef producers in the East Central region who participated in a record-keeping workshop in Winter 2001 adopted computer software within a year to keep farm records. Members of the Pork Producers Association received weekly updates on hog markets, and 25% of them indicated that this information was very helpful in their pork marketing decisions.
Success Stories Mary is now 19 years old and about to enter the local university on a full scholarship. When she was referred to our program 3 years ago, her future looked bleak. She had quit school, was hanging out in the streets, and was drinking heavily. She participated actively in group counseling, bonded with one of our staff, gave up drinking and smoking, delivered a healthy baby, completed her high school education, got a job, moved into her own apartment, and competed successfully for the scholarship.
Evaluate Your Program's Outcomes Refer to "How to Conduct Evaluation of Extension Programs," which we have provided. Contact us if you need help with: Planning your evaluation. Data analysis and report writing.