
Mertens Model of Transformative Evaluation
Frank Cookingham, September 2017

Using a framework to describe an evaluation model

Note: This section is taken from Cookingham, Evaluation model as a concept, 2012; retrieve from www.evalfrank.com, Evaluation models or approaches.

The concepts of framework, approach (Stufflebeam, 2001) or model (Madaus, Scriven and Stufflebeam, 1983), and design are hierarchical. Different evaluators may describe them differently; I find it helpful to see the hierarchy moving from an abstract framework, to a generic model, to a specific design (evaluation plan).

Framework. A framework is a conceptual structure for organizing ideas. The five fundamental issues that undergird practical program evaluation (Shadish, Cook and Leviton, 1991) are an example of a framework. The five issues are (p. 35):

- Social programming: What are the important problems this program could address? Can the program be improved? Is it worth doing so? [If it is not worth improving], what is worth doing from an evaluation perspective?
- Knowledge use: How can I make sure my results get used quickly to help this program? Do I want to focus on rapid use of results? If not, can my evaluation be useful in other ways?
- Valuing: Is this a good program? By which notion of good? What justifies the conclusion about the goodness of the program?
- Knowledge construction: How do I know all this? What counts as a confident answer to such a question? What causes that confidence?
- Evaluation practice: Given my [limits], and given the seemingly unlimited possibilities, how can I narrow my options to do a feasible evaluation? What is my role [worth] as educator, methodological expert, or judge of program? What questions should I ask, and what methods should I use [to answer them, or to discover other important questions]?

Other frameworks can be structured around approaches to inquiry (case study, experimental, survey, focus groups, etc.) or around the purpose of evaluation (adding to a body of knowledge, determining attribution, demonstrating accountability for achieving goals, etc.).

Model. A model is a detailed description of an evaluation approach to dealing with the five issues. See Stufflebeam (2001) for examples of a variety of models, and then describe those of interest in terms of these five fundamental issues.

Design. This is a description of what data will be collected, how it will be collected, and how it will be analyzed as a particular model, or combination of models, is applied to the evaluand.

Application. I have organized the evaluation framework described above into a graphic that shows at a glance the defining features of an evaluation model.

Mertens Model for Evaluation Page 1 of 6

[Figure: General Framework for an Evaluation Model. Evaluation Activities, Methodology sits at the center, surrounded by Guiding Values, Assumptions about Reality, and Use of Findings (with nine numbered aspects, three per component); the Surrounding Context (Obstacles to Change beyond Program Control; Assets for Change); and the Theory of Social Change: Interventions for a Better Future.]

The General Framework simply reorganizes the five components described above to show that evaluation activities (the center of the framework) are determined by the evaluator's understanding of:

- The surrounding context within which the program is embedded, and the underlying assumptions that support the implementation of the program (theory of social change). Two clusters of factors in the context are the obstacles to change (risks) and the assets for change in the program setting. This is the domain of the social programming component.
- The values that guide interpreting evidence to conclude to what extent the program is good or bad, helpful to some stakeholders and harmful to others, worth the cost, etc.
- Assumptions about reality, and about how you know what is real, which are the essence of the knowledge construction component. Different sets of assumptions in this area define very different evaluation activities.
- Utilization factors that influence how different groups will interpret and use conclusions and recommendations.

The purpose of having three aspects each for guiding values, assumptions about reality, and use of findings is to provide a visual overview of the defining features of a model. There is nothing magical about three aspects per supporting component: in a particular case seven aspects may be sufficient, while in another twelve may be necessary. The intellectual effort to identify the nine aspects that influence evaluation activities may help one understand what differentiates one model from others, beyond the particular terminology that advocates use.

Mertens Transformative Research and Evaluation as a model. I have prepared the following description of Mertens' (2009) model of evaluation based on the features that distinguish it as a model for me. Tarsilla's (2010) article helped me organize the content in the model.

[Figure: the General Framework filled in for the Mertens Transformative Research and Evaluation model:
- Guiding values: the lives and experiences of those who live at the margins are most important.
- Evaluation activities, methodology: crystallization via participatory mixed methods, by widely diverse evaluators and participants who trust each other.
- Assumptions about reality: truth is determined by privilege and power; valid knowledge requires descriptions of multiple social constructions.
- Use of findings: use findings to promote human rights and social justice.]
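Purely as an organizational aid, and not anything drawn from Mertens or Cookingham, the framework-as-a-structure idea can be sketched in code. All class, field, and variable names below are my own illustration:

```python
from dataclasses import dataclass

@dataclass
class EvaluationModel:
    """A model described via the General Framework: evaluation activities
    at the center, with aspects listed for each supporting component."""
    name: str
    social_programming: list[str]   # surrounding context: obstacles and assets
    guiding_values: list[str]
    assumptions_about_reality: list[str]
    use_of_findings: list[str]
    methodology: list[str]          # evaluation activities (the center)

    def aspect_count(self) -> int:
        # The text suggests roughly nine aspects across the three
        # supporting components, but seven or twelve may fit a given case.
        return (len(self.guiding_values)
                + len(self.assumptions_about_reality)
                + len(self.use_of_findings))

# The Mertens model, abbreviated to the defining features named above.
mertens = EvaluationModel(
    name="Mertens Transformative Research and Evaluation",
    social_programming=["social transformation to counteract inequality"],
    guiding_values=["lives and experiences of those at the margins matter most"],
    assumptions_about_reality=[
        "truth is determined by privilege and power",
        "valid knowledge requires multiple social constructions",
    ],
    use_of_findings=["promote human rights and social justice"],
    methodology=["crystallization via participatory mixed methods"],
)
print(mertens.aspect_count())  # 4 aspects in this abbreviated sketch
```

Describing a different model, say one of Stufflebeam's (2001), would then be a matter of constructing another instance and comparing its aspects field by field.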

Theory of social programming. The main objective of transformative evaluators' work is to bring about not only social change but also social transformation (i.e., a more radical and structural modification of attitudes, behaviors, and mentality in society) so as to counteract inequalities existing in today's world. The evaluator should focus on furthering human rights. The evaluators' primary task should be to contribute to the solution of intransigent social problems and, in doing that, they should also challenge the status quo as necessary (Tarsilla, 2010, p. 108). Emphasis is placed on community strengths and on obstacles to communities becoming effective change agents. Communities gain strength when their rights are honored and respected.

Guiding values.
- The evaluation needs to be imbued [permeated] with the values of the communities where the evaluation is done (Tarsilla, 2010, p. 103). These same values should guide the entire exercise, from planning to reporting findings.
- Central importance is placed on the lives and experiences of those who live at the margins.
- Every person must be treated with dignity and respect, and avoidance of harm must be the primary principle.
- Whatever benefits come to the evaluator (including royalties from publications) should be shared with those who participated in the evaluation.
- Authentic representation of diversity in understanding; presentation of a balanced view of perspectives, values, and beliefs.
- The evaluator should show what he or she did to be trusted by the participants in the evaluation, and to develop a sense of mutuality.

Assumptions about reality and true knowledge.
- All knowledge reflects power, privilege, and social relationships in society (Tarsilla, p. 105). Privilege influences accepting one version of reality over another.
- Critical subjectivity is the path toward the ideal of true objective knowledge.
- Evaluators need to question their preconceived notions about how the evaluation will produce sound knowledge for the setting that provides context for the evaluand. This is done through ongoing self-reflection and stakeholder participation throughout the exercise.
- Cultural competence is required, but it is not attainable by an evaluator working outside his or her culture. To allow a better understanding of the complexities of the local culture, include local evaluators on the evaluation team (Tarsilla, pp. 105, 107).
- Appreciative inquiry reveals important aspects of reality that are often overlooked.
- Within a culture there are multiple realities that are socially constructed. Participatory approaches are important for revealing views of reality from a variety of perspectives.
- Constellations of values shape perceptions of reality.

Use of evaluation findings.
- Use of findings to promote human rights and further social justice is the most critical factor in transformative work. This involves developing, critiquing, and refining policy, as well as advocating actions that support changes in policy (Tarsilla, p. 109).

- Community members who provide information in an evaluation exercise have some sort of ownership of that information. This means that their ideas about using the evaluation findings must be honored.
- Use a multitude of options for reporting. This is important for addressing power issues between academic stakeholders and community stakeholders, which is an important sub-objective in an evaluation exercise.
- Findings should extend participants' knowledge of program operations and social relationships, especially regarding privilege and power.

Evaluation activities, methodology.
- Prior to collecting data, learn as much as you can about the culture of the communities: study relevant literature, talk with evaluators who have done evaluation in the culture, and read evaluation reports from that culture.
- Throughout the exercise, check your understanding of relevant aspects of the culture with local partners who are conversant in both the local language and English.
- Methods of inquiry give priority to inclusive stakeholder involvement in the exercise. Evaluators should seek out those who have not been heard, either because they do not want to speak up or because they are out of sight (marginalized).
- Mixed qualitative and quantitative methods are the norm for transformative evaluation. Think of the evaluation methodology as a crystal with many facets, each facet representing a voice. Crystallization, rather than triangulation, characterizes the methodology.
- Dialogical methods are critical: interaction between evaluator and participants identifies the focus and the information to be collected.
- Cultural complexity is accommodated. Context and history are examined, especially regarding discrimination and oppression.
- Deconstruct the concept of validity.
- The evaluator must have a heightened self-awareness and a cultivated critical subjectivity; he or she should be open to being transformed by the evaluation exercise as it unfolds.

References

Madaus, George F., Scriven, Michael, & Stufflebeam, Daniel L. (Eds.). (1983). Evaluation models: Viewpoints on educational and human services evaluation. Boston, MA: Kluwer-Nijhoff.

Mertens, Donna M. (2009). Transformative research and evaluation. New York, NY: The Guilford Press.

Shadish, William R., Cook, Thomas D., & Leviton, Laura C. (1991). Foundations of program evaluation: Theories of practice. Thousand Oaks, CA: Sage Publications.

Stufflebeam, Daniel L. (2001). Evaluation models. New Directions for Evaluation, 89. San Francisco, CA: Jossey-Bass.

Tarsilla, Michele. (2010). Can the transformative agenda really alter the status quo? A conversation with Donna M. Mertens. Journal of MultiDisciplinary Evaluation, 6(14), 102-113.