Photo: Laura Berman Tips, Tools & Telling The Story: Evaluating Community Food Initiatives Meredith Davis Research & Evaluation Manager Community Food Centres Canada September 13, 2012
CFCC's Evaluation Strategy
High burden of proof.
Proving: measuring the collective impact of CFCs (healthy food access, skills & eating, social inclusion, civic engagement).
Improving: gathering regular feedback to maximize our impact and staying on top of emerging best-practice evidence.
Centralized evaluation: easy tools; training & support; consistent measurement.
Annual Program Survey
Agency-wide: 150 in-person interviews.
Information gathered: demographics, outcomes, impacts and feedback.
Self-reporting: avoid positive bias.
Capture quotes. Eliminate barriers to participation. Report back.
Theory of Change
A road map to get us from here to there. It is very difficult to evaluate the success of an initiative unless you are clear about what you are ultimately trying to achieve. Requires a shared understanding among key stakeholders. Expressed as "if ... then" statements.
Program Logic Model
A visualization of a program's theory of change.
INPUTS: Resources that we invest (e.g. program staff & volunteers, garden space & tools, gardening knowledge, participants).
ACTIVITIES: What we do (e.g. share harvested food between garden members and emergency food programs).
OUTPUTS: Numbers we count in our programs that tell us about the level of activity taking place (e.g. kilograms of food harvested).
OUTCOMES: Short-term changes in knowledge, skills, awareness, etc., and medium-term changes in behaviour, practice, etc. (e.g. increased consumption of fruits and vegetables).
INDICATORS: How we measure success (e.g. # of daily servings of fruit and vegetables).
IMPACTS: Hoped-for long-term changes in systems or conditions (e.g. improved physical health).
Some Evaluation Principles
Clarify the objectives of your evaluations. Simplify evaluations as much as possible. Some questions are best answered at an agency-wide level and others at a program-specific level.
Developing Evaluation Tools
Questions to consider:
1. What do I want to know, and why?
2. What are the barriers to participation, and how can I remove them?
3. Written or verbal?
Tool examples: file review process, questionnaires, focus groups, interviews, pre-post tests.
Indicator Selection
OUTPUT INDICATORS: Numbers we count in our programs that tell us about the level of activity taking place.
OUTCOME INDICATORS: How we measure short- and medium-term program success.
IMPACT INDICATORS: How we measure the long-term success of an initiative.
Indicator criteria: appropriate, measurable, comparable, compelling, participatory, sensitive to change.
Developmental Evaluation (DE)
Capture lessons and adapt in real time.
Uses: innovation & exploration.
Methods: individual written reflections and group discussions.
Role of the DE evaluator: a critical friend (source: McConnell DE Primer).
Needs buy-in from all involved, and a plugged-in, strong evaluator.
Social Return on Investment (SROI) Analysis
The Pathways to Education example (source: Boston Consulting Group): 1 charitable dollar invested = $25 of social value created!
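The headline ratio above comes from dividing estimated social value by dollars invested. A minimal sketch of that arithmetic, using illustrative placeholder figures (NOT the actual Pathways to Education / BCG numbers):

```python
# Minimal sketch of an SROI ratio calculation.
# All dollar figures below are hypothetical placeholders for illustration.

def sroi_ratio(social_value, investment):
    """Social value created per dollar invested."""
    return social_value / investment

# Hypothetical example: $500,000 invested is estimated to yield
# $12.5M in social value (e.g. increased lifetime earnings,
# reduced social-assistance costs).
ratio = sroi_ratio(social_value=12_500_000, investment=500_000)
print(f"${ratio:.0f} of social value per $1 invested")  # $25 of social value per $1 invested
```

In a real SROI analysis, the social-value figure is itself the hard part: future benefits are monetized with financial proxies and discounted to present value before the ratio is taken.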
Communicating Evaluation Findings
Make sure your evaluation doesn't sit on a shelf! Consider your audience. Make your results accessible, interesting & engaging. Example: The Stop's Program Infographics.
Evaluating Respectfully
Filters:
How is this going to be received by program participants and other stakeholders?
Is there any way that participating in this evaluation could negatively impact them?
How can we evaluate in a way that will always benefit our participants and maintain and foster their sense of dignity above all else?
Building an Evaluative Culture Requires a commitment from the top to the bottom. Photo: Laura Berman
Questions?
Thank You! Community Food Centres Canada Website: www.cfccanada.ca Meredith Davis Research & Evaluation Manager t: 416-531-8826 x. 224 e: meredith@cfccanada.ca