Social Psychology Theory and Methodology for Programme Evaluation


Federal State Autonomous Educational Institution of Higher Education
National Research University Higher School of Economics
Faculty of Social Sciences
School of Political Science

Course Syllabus

Social Psychology Theory and Methodology for Programme Evaluation

For the Bachelor's Degree Program "Political Science"

Author: Elena Sautkina, PhD, Professor, elena.v.sautkina@gmail.com

Approved at the meeting of the School of Political Science, 2018
Head of the School: Andrei Melville, 2018

Recommended by the Academic Council of the Educational Program, 2018, Document No.
Approved 2018
Academic Supervisor of the Educational Program: Ilya Lokshin, 2018

Moscow, 2018

This syllabus cannot be used by other departments of the university or other institutes of higher education without the permission of the faculty that developed it.

1. Scope of Application and Reference to Regulatory Documents

The present syllabus establishes requirements and course objectives for students, defines the content of lectures and seminars, and defines criteria for the assessment of students' knowledge. The syllabus is designed for professors teaching this course, teaching assistants, and students enrolled in the Bachelor's Degree Program "Political Science" and studying the course entitled "Social Psychology Theory and Methodology for Programme Evaluation".

The syllabus has been designed to meet:
- The Educational Standard developed by the National Research University Higher School of Economics for the Bachelor's Degree Program "Political Science";
- The Educational Program developed by the National Research University Higher School of Economics for the Bachelor's Degree Program "Political Science";
- The Curriculum of the Bachelor's Degree Program "Political Science", approved in .

2. Course Objectives

The course has the following main objectives:
- Develop knowledge and skills in the theory and methodology of social psychology that are commonly used in the evaluation of social programmes and interventions.
- Develop knowledge and skills in the design and running of evaluation studies.
- Gain insight into the 'evaluation loop': from identifying policy needs for evidence to commissioning evaluation research, and from conducting research to feeding new evidence back into policies and programmes.

3. Students' Competencies upon Completion

Students will develop competences in the following main areas:
- Theory and methods of social psychology that are used in the evaluation of social programmes and interventions.
- Quality of evaluation research.
- Best practice and international standards in evaluation.
- Design and running of evaluation research.
- The 'evaluation loop': linking evaluation research, policy and practice.
- Evidence-based policy.
- Collaborations between the various actors of the evaluation loop.
- International developments and debates in the field of evaluation of social programmes.

The course will contribute to developing the following competences:

UC-2: Ability to identify the scientific essence of given problems in one's professional area of focus.
UC-3: Ability to solve problems in one's professional area of focus on the basis of analysis and synthesis.

UC-5: Ability to work with information: find, evaluate and use information from different sources, as required for solving research and professional tasks (e.g., on the basis of a systemic approach).
UC-6: Ability to engage in research, including analysis of problems, setting goals and objectives, identifying the subject and focus of a given study, selecting research approaches and methods, and evaluating research quality.
UC-9: Ability to think critically and interpret one's experience and practices, as well as analyse one's own professional and social experience.
UC-10: Ability to engage in productive and/or applied activities in an international context.
PC-1: Ability to identify and establish an issue/problem for the analysis of political phenomena and processes, as well as determine research objectives and put together a research plan.
PC-2: Ability to select and apply research methods adequate to the defined goals.
PC-4: Ability to search for, gather, process, analyze and store information required for the achievement of given objectives.
PC-9: Ability to document research and analytical outcomes, based on the results of scientific and applied research, in various academic papers (e.g., reviews, analytical reports, publications on social and political topics, etc.), depending on the target audience.

In order to develop these competencies, students will engage in:
- Listening to lectures and seminars and reading literature in English;
- Reading and analysing evaluation studies published in high-impact journals;
- Holding group discussions (in English, where possible);
- Designing evaluation studies;
- Developing, writing up and presenting projects in groups.

4. The Course in the Structure of the Educational Program

The course belongs to a series of elective professional courses. Teaching, communication and examination will be done in English, with the possibility to make the necessary clarifications in Russian.

The course is related to the following professional courses:
- Psychology
- Political Management
- Formation of the State Policy in Russia and Foreign Countries
- Data analysis and methodology courses.

The course will be of use for studying the following courses:
- Political Management (year 3)
- Formation of the State Policy in Russia and Foreign Countries (year 3)

- Institutional Economics (year 3)
- Systems of Public Service (year 4)
- Social Policy of Russia (year 4)
- Quantitative Methods and Models of State Effectiveness Evaluation (year 4)
- Project Proposal (year 4).

5. Course Structure

Topics (with class hours for lectures and seminars, and independent work hours):
1. Introduction to Evaluation Research
2. Behaviour Change Theories and Beyond
3. Complexity in Evaluation
4. Social Psychology and Social Science Methods for Evaluation of Programmes and Interventions
5. Designing and Conducting an Evaluation Study
6. The Evaluation Loop

6. Knowledge Assessment

Students' knowledge will be assessed using a combination of: 1) formative (continuous) assessment and 2) summative (final) assessment. Assessment will focus on gradual learning (knowledge and skill acquisition) rather than control and competition.

6.1. Continuous Individual Assessment

Every week students will engage in various forms of short assessment tasks: problem solving, quizzes, journals, mini-essays or mini-presentations based on the contents of the previous week (both in-class and independent work). Students' work will be assessed individually, and scores communicated on a weekly basis.

6.2. Final Small Group Assessment

Throughout the course, students will work in small groups in order to produce their own evaluation projects. At the end of the course, students will present their projects in written form, as well as orally. The project report will normally be pages long. It will consist of an introduction, a short literature review, a definition of research aims and questions, hypotheses, outcomes and impact, research tools, and a conclusion. The oral presentation of the project will take the form of a role game: team members will divide into evaluators, policy-makers and practitioners. Students will be asked to produce a short (10-15 minutes)

visual presentation of the project. They will also prepare a critical discussion of this project from the point of view of policy-makers and practitioners.

6.3. Conditions of Knowledge Assessment

Knowledge assessment will be done in English, with the possibility to make the necessary clarifications in Russian. Students' knowledge will not be assessed on the basis of their English language skills.

Knowledge will be assessed on a 10-point scale, with 5-point equivalents as follows:
1 - very unsatisfactory, 2 - very bad, 3 - bad (5-point: 2 - unsatisfactory)
4 - satisfactory, 5 - very satisfactory (5-point: 3 - satisfactory)
6 - good, 7 - very good (5-point: 4 - good)
8 - nearly excellent, 9 - excellent, 10 - brilliant (5-point: 5 - excellent)

The final grade will consist of the results of the Continuous Individual Assessment and the Final Small Group Assessment. It will be calculated using the following formula:

Final grade = 0.6 x Continuous Individual Average + 0.4 x Final Small Group Score (0.2 Project Report + 0.2 Presentation).

7. Course Contents and Reading

7.1. Introduction to Evaluation Research

Evaluation: definitions and history. Specificity of evaluation research compared to other social research. Evidence-based policy. Main actors in evaluation. Programme outcomes and programme effects. Research questions in evaluation. Programme theory. Logic models. Target populations. Types and forms of an evaluation. Requirements for an evidence-based evaluation. Quality of evaluation.

Fitzpatrick, J. L. (2004). Program evaluation: Alternative approaches and practical guidelines. Boston: Pearson Education.
Greve, B. (2017). Handbook of social policy evaluation. Cheltenham; Northampton: Edward Elgar.
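As a worked example of the grade weighting described in section 6, the calculation can be sketched as follows (the function name and sample scores are illustrative only, and the 0.2/0.2 split of the group score between report and presentation reflects the formula given above):

```python
# Illustrative sketch, not part of the syllabus: computes the final grade
# from 10-point component scores using the stated weights.
def final_grade(continuous_avg, report, presentation):
    # 60% continuous individual work; 20% project report; 20% oral presentation
    raw = 0.6 * continuous_avg + 0.2 * report + 0.2 * presentation
    return round(raw, 1)  # one decimal place, before any faculty rounding rules

print(final_grade(8, 7, 9))  # 0.6*8 + 0.2*7 + 0.2*9 = 8.0
```

For instance, a continuous average of 8 combined with a report score of 7 and a presentation score of 9 yields a final grade of 8.0 on the 10-point scale.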

Pawson, R., & Tilley, N. (2006). Realistic evaluation. London: SAGE Publications.
Rossi, P. H., Lipsey, M. W., & Freeman, H. E. (2004). Evaluation: A systematic approach. Thousand Oaks, CA: Sage.
Wholey, J. S., Hatry, H. P., & Newcomer, K. E. (2010). Handbook of practical program evaluation. San Francisco: Jossey-Bass; John Wiley & Sons. [Available as PDF]

7.2. Behaviour Change Theories and Beyond

Behaviour change: definitions, foundations, main theories. Domains of behaviour change in modern society. The behaviour change wheel. Behaviour spillover. The use of incentives and disincentives in behaviour change programmes and interventions. Planning behaviour change at different levels (theory, policy, practice). Evaluation of behaviour change programmes. Beyond behaviour change. Hierarchy of social thought and change in attitudes, beliefs, norms and identity. The role of values, norms, social representations, stereotypes and attitudes in behaviour change. Habit change and social practice change. Lifestyle change.

Pawson, R. (2013). The science of evaluation: A realist manifesto. Los Angeles: SAGE Publications.
Davis, R., Campbell, R., Hildon, Z., Hobbs, L., & Michie, S. (2015). Theories of behaviour and behaviour change across the social and behavioural sciences: A scoping review. Health Psychology Review, 9(3).
Michie, S., van Stralen, M. M., & West, R. (2011). The behaviour change wheel: A new method for characterising and designing behaviour change interventions. Implementation Science, 6.
Michie, S., Wood, C. E., Johnston, M., Abraham, C., Francis, J., & Hardeman, W. (2015). Behaviour change techniques: The development and evaluation of a taxonomic method for reporting and describing behaviour change interventions (a suite of five studies involving consensus methods, randomised controlled trials and analysis of qualitative data). Health Technology Assessment, 19(99). doi: /hta19990
Truelove, H., Carrico, A., Weber, E., Raimi, K., & Vandenbergh, M. (2014). Positive and negative spillover of pro-environmental behavior: An integrative review and theoretical framework. Global Environmental Change. doi: /j.gloenvcha
Nash, N., Whitmarsh, L., Capstick, S., Hargreaves, T., Poortinga, W., Thomas, G., Sautkina, E., & Xenias, D. (2017). Climate-relevant behavioural spillover and the potential contribution of social practice theory. WIREs Climate Change.
Kurz, T., Gardner, B., Verplanken, B., & Abraham, C. (2014). Habitual behaviors or patterns of practice? Explaining and changing repetitive climate-relevant actions. Wiley Interdisciplinary Reviews: Climate Change. doi: /wcc

Capstick, S. B., Lorenzoni, I., Corner, A., & Whitmarsh, L. E. (2015). Prospects for radical emissions reduction through behavior and lifestyle change. Carbon Management, 5(4).
Mark, M. M., Donaldson, S. I., & Campbell, B. (2011). Social psychology and evaluation. New York: Guilford Press.

7.3. Complexity in Evaluation

Definitions of complexity. The role of context (social, health, environmental, educational, economic, political, etc.) and levels (micro, meso, macro). Complex systems theory applied to evaluation research. Interventions as events in systems. Simple and complex social interventions. Evaluation of a complex social intervention; evaluation of a 'simple' intervention in a complex context. Realistic evaluation. Opportunities and pitfalls when considering complexity. Complexity and sustainability.

Fitzpatrick, J. L. (2004). Program evaluation: Alternative approaches and practical guidelines. Boston: Pearson Education.
Patton, M. Q. (2011). Developmental evaluation: Applying complexity concepts to enhance innovation and use. New York, NY: Guilford Press.
Pawson, R., & Tilley, N. (2006). Realistic evaluation. London: SAGE Publications.
Pawson, R. (2013). The science of evaluation: A realist manifesto. Los Angeles: SAGE Publications.
Shiell, A., Hawe, P., & Gold, L. (2008). Complex interventions or complex systems? Implications for health economic evaluation. BMJ, 336.
Hawe, P., Shiell, A., & Riley, T. (2009). Theorising interventions as events in systems. American Journal of Community Psychology, 43.
Petticrew, M., McKee, M., Lock, K., et al. (2013). In search of social equipoise. BMJ, 347.
Sautkina, E., Goodwin, D., Jones, A., Ogilvie, D., Petticrew, M., White, M., & Cummins, S. (2014). Lost in translation? Theory, policy and practice in systems-based environmental approaches to obesity prevention in the Healthy Towns programme in England. Health & Place, 29, 60-6. doi: /j.healthplace
Moore, G. F., Audrey, S., Barker, M., Bond, L., Bonell, C., Hardeman, W., et al. (2015). Process evaluation of complex interventions: Medical Research Council guidance. BMJ, 350, h1258.

7.4. Social Psychology and Social Science Methods for Evaluation of Programmes and Interventions

Where to start from when designing an evaluation study?

Types of evaluation research design. Experimental, correlational, qualitative, and mixed methods research. Longitudinal studies. RCTs. Action research. Research methodology. Routine and primary data. Big data. Surveys, interviews, focus groups, ethnography, diaries. Systematic reviewing and meta-analysis. Case studies. Laboratory and real-life research. Devolution labs. Natural experiments.

Breakwell, G. M. (2004). Doing social psychology research. Malden; Oxford: Blackwell.
Breakwell, G. M., Smith, J. A., & Wright, D. B. (2012). Research methods in psychology. London: SAGE Publications.
Sansone, C., Morf, C. C., & Panter, A. T. (2004). The Sage handbook of methods in social psychology. London; Thousand Oaks; New Delhi: SAGE Publications.
Creswell, J. W. (2014). Research design: Qualitative, quantitative, and mixed methods approaches. Los Angeles: Sage.
Greene, J. C. (2007). Mixed methods in social inquiry. San Francisco, CA: Jossey-Bass.
Patton, M. Q. (2008). Utilization-focused evaluation (4th ed.). Los Angeles: SAGE Publications.
Patton, M. Q. (2002). Qualitative research and evaluation methods. Thousand Oaks, CA: Sage Publications. [HSE Full Text]
Wholey, J. S., Hatry, H. P., & Newcomer, K. E. (2010). Handbook of practical program evaluation. San Francisco: Jossey-Bass; John Wiley & Sons. [Available as PDF]
Greve, B. (2017). Handbook of social policy evaluation. Cheltenham; Northampton: Edward Elgar.
Khandker, S. R. (2010). Handbook on impact evaluation: Quantitative methods and practices. Washington, D.C.: The World Bank. [Available as PDF]
Fitzpatrick, J. L. (2004). Program evaluation: Alternative approaches and practical guidelines. Boston: Pearson Education.
Petticrew, M., & Roberts, H. (2006). Systematic reviews in the social sciences: A practical guide. Oxford: Blackwell.
Sautkina, E., Bond, L., & Kearns, A. (2012). Mixed evidence on mixed tenure effects: Findings from a systematic review of UK studies. Housing Studies, 27(6).
Boland, A., Cherry, M. G., & Dickson, R. (2017). Doing a systematic review: A student's guide. Los Angeles: SAGE Publications.

7.5. Designing and Conducting an Evaluation Study

Putting the elements of evaluation research together. Aligning programme theory, logic model, type of evaluation, and methodology. Setting up for the study. Specificity of data collection and data analysis in evaluation.

Conducting an evaluation study: pilot studies; barriers and opportunities; the role of context; changes in the study design. Working with the data. Data collection and storage. Preliminary data analyses.

Wholey, J. S., Hatry, H. P., & Newcomer, K. E. (2010). Handbook of practical program evaluation. San Francisco: Jossey-Bass; John Wiley & Sons. [Available as PDF]
Greve, B. (2017). Handbook of social policy evaluation. Cheltenham; Northampton: Edward Elgar.
Fitzpatrick, J. L. (2004). Program evaluation: Alternative approaches and practical guidelines. Boston: Pearson Education.
Patton, M. Q. (2008). Utilization-focused evaluation (4th ed.). Los Angeles: SAGE Publications.
O'Sullivan, R. G. (2004). Practicing evaluation: A collaborative approach. London; Thousand Oaks; New Delhi: SAGE Publications.

7.6. The Evaluation Loop

The 'evaluation loop'. Partnerships between science, policy and practice. Identification of policy needs. Feeding new evidence back into policies and programmes. Policy learning and new policy design. Working in an evaluation team. Relations and communication within the evaluation loop. Economic evaluation. Cost-benefit analysis. Cost-effectiveness analysis. Evaluation ethics. Internal and external evaluations. Building evaluation capacity in organisations. Perspectives for future development in the field of evaluation.

Torres, R., Preskill, H., & Piontek, M. E. (2005). Evaluation strategies for communicating and reporting: Enhancing learning in organizations (2nd ed.). Sage.
Greener, I., & Greve, B. (2014). Evidence and evaluation in social policy. Chichester: Wiley-Blackwell.
Pawson, R. (2013). The science of evaluation: A realist manifesto. Los Angeles: SAGE Publications.
Rossi, P. H., Lipsey, M. W., & Freeman, H. E. (2004). Evaluation: A systematic approach. Thousand Oaks, CA: Sage.
Wholey, J. S., Hatry, H. P., & Newcomer, K. E. (2010). Handbook of practical program evaluation. San Francisco: Jossey-Bass; John Wiley & Sons. [Available as PDF]
Greve, B. (2017). Handbook of social policy evaluation. Cheltenham; Northampton: Edward Elgar.

Khandker, S. R. (2010). Handbook on impact evaluation: Quantitative methods and practices. Washington, D.C.: The World Bank. [Available as PDF]
O'Sullivan, R. G. (2004). Practicing evaluation: A collaborative approach. London; Thousand Oaks; New Delhi: SAGE Publications.
Fitzpatrick, J. L. (2004). Program evaluation: Alternative approaches and practical guidelines. Boston: Pearson Education.

8. Technical Support for the Course

In order to hold classes, the professor needs a laptop, a projector and speakers. Classrooms should allow work in small groups.