LCAP Evaluation workshop. Snigdha Mukherjee, PhD. Louisiana Public Health Institute October 28, 2010


1 LCAP Evaluation workshop Snigdha Mukherjee, PhD. Louisiana Public Health Institute October 28, 2010

2 Introductions

3 Agenda for the day Part I: Overview of Program Evaluation Part II: Types of Evaluation Part III: Develop an Evaluation Plan Part IV: Data Collection Closing and feedback

4 Part I: Overview of Program Evaluation

5 What is Evaluation? The systematic collection of information about the activities, characteristics, and outcomes of a program to assess its strengths and weaknesses, improve program effectiveness and/or inform decisions about future programming. - Michael Quinn Patton In essence, to evaluate anything means to assess the merit or worth of something against criteria or standards

6 Monitoring & Evaluation Monitoring: What are we doing? Tracking inputs and outputs to assess whether programs are performing according to plans (e.g., people trained, condoms distributed) Evaluation: What have we achieved? Assessment of the impact of the program on behavior or health outcomes (e.g., condom use at last risky sex, HIV prevalence) Surveillance: monitoring the spread of disease, such as HIV/STD (e.g., HIV prevalence among pregnant women)

7 Why are evaluations done? Understand and improve programs Tell a program's story Be accountable Inform the field Support fundraising efforts

8 Program Evaluation A typical program evaluation consists of the following activities: Posing questions about the program Setting standards of effectiveness Designing the evaluation and selecting participants Collecting data Analyzing data, and Reporting the results

9 The logic of evaluation Establishing criteria of merit Constructing standards Measuring performance Synthesizing and integrating data into a judgment of merit Rossi, Freeman & Lipsey, 2004, p. 70

10 Stakeholders & Audiences Stakeholders (those affected by the program): consumers, service providers, families, community members, clinics Audiences (those who see and/or use evaluation results): organization directors, State Department of Health, funders, administrators, potential advocates, taxpayers

11 Stakes or importance of a program Program cost: programs that are expensive need to be proven effective, if not improved or abandoned Importance of outcomes: some programs have serious implications for failure, e.g., participants in CPR courses are tested for proficiency because outcomes may mean the difference between life & death Perceived importance of program/outcomes by stakeholders and audiences: some programs are evaluated due to a request from stakeholders or audiences

12 Evaluation context What are the time & resource constraints? Are there any hidden or not so hidden political agendas? Has the program had the opportunity to be effective? Will the evaluation results be challenged? What is the historical context of the program? How would the context affect information collected?

13 Part II: Types of Evaluation

14 Designing the evaluation Consider the type of evaluation that fits your needs Types of evaluation Develop an evaluation calendar

15 Types of evaluation Formative Outcome Process

16 Types of Evaluation The activities: Formative evaluates a program and its process during development or formation; Process evaluates process fidelity, the implementation of the program compared to its design The effect: Outcome evaluates effectiveness in terms of programmatic outcomes; Impact evaluates the effect on the community and other institutions

17 HIV/AIDS Monitoring & Evaluation - levels of evaluation effort
MONITORING (Process Evaluation):
- Inputs (all projects): resources such as staff, funds, materials, facilities, supplies, training
- Outputs (most projects): condom availability, trained staff, quality of services (e.g. STI, VCT, care), knowledge of HIV transmission
EVALUATION (Effectiveness Evaluation):
- Outcomes (some projects): short-term and intermediate effects - behavior change, attitude change, changes in STI trends, increase in social support
- Impact (few projects): long-term effects - changes in HIV/AIDS trends, AIDS-related mortality, social norms, coping capacity in the community, economic impact

18 Formative Evaluation What, Why, When Helps to identify or clarify the needs the new program is meant to address Helps identify gaps in service Tests initial reaction to the program design Used to pre-test a design before full implementation

19 Formative evaluation questions Sample questions: - What is the most efficient way to recruit participants? - What types of program activities are desired? - What are the preferences of consumers? - What is the best way to deliver the program?

20 Process evaluation Used for program monitoring Assessment of how resources are used Answers questions about adherence to the design of the program Can help identify early on problems in recruitment, program delivery, and staff issues

21 Process (monitoring) Evaluation Answers the questions What, Why, When Looks at what activities, services or interventions are being implemented Accountability: determine alignment with the program's final design or purpose; for monitoring Program improvement: mid-course corrections, changes in outreach, recruitment, or data collection Replication: clarify the ingredients before replicating or taking to scale

22 Process evaluation questions Develop evaluation questions on: Reach, coverage: target population (characteristics, proportions served, outreach efforts) Dose, duration: services or intervention (what services, how often, by whom, cost) Context: other factors that influence how the program gets implemented (neighborhood, additional services) Fidelity: how well the program adheres to the plan Sample questions: - Who is the intended target population of the program? - What elements of the program have actually been implemented? - What barriers did clients experience in accessing the services? - Was the program delivered as planned? - What was the average cost per person who received the program?

23 Process Evaluation data Collect credible (quantifiable) evidence Client demographics: age, race, gender, socioeconomic status Client's prior status or behavior: previous alcohol abuse, exercise, condom usage Client outreach: method of contact, mode of transportation Staff: demographics, training, turnover rate Program intervention: number of training sessions, # condoms distributed, frequency and attendance of services

24 Outcome evaluation What, Why, When Measures the effect on clients, a population, or the community: changes in knowledge, attitude or behavior Improve the service delivery of the program by focusing on key tasks Identify effective practices within the program Usually conducted after the program has been implemented for enough time to anticipate results Used for decision-making (e.g., continue funding, increase funding, modify and try again, discontinue funding)

25 Outcome evaluation Questions - Are participants more knowledgeable about the subject after their training? - Has there been a change in behavior (increase in condom use, decrease in risk behaviors) since the intervention began? - To what extent has the project met its objectives? - How effective has the project been at producing changes? - Are there any factors outside of the project that have contributed to (or prevented) the desired change? - Has the project resulted in any unintended change?

26 Impact Evaluation What, Why, When Measures the effect on clients, a population, or the community: changes in knowledge, attitude, behavior or condition Very similar to outcome evaluation Sample questions: - What is the effect of the program on the long-term condition of a group or population? - What is the collective effect of similar programs? - How have these programs affected the system of services related to this need?

27 A typical example of the time-scale for a two-year evaluation

28 Part III: Develop an Evaluation Plan

29 Stages in a program evaluation Stage I: Planning the evaluation (Project Manager, stakeholders and/or organization) Stage II: Carrying out the evaluation (evaluation team) Stage III: Following up the evaluation (organization, funders, stakeholders)

30 PLANNING THE EVALUATION Financial resources Thinking through the focus of the evaluation Finding a suitable evaluation team & checking their suitability with stakeholders Organizing the logistics: training, hiring additional staff, etc. Briefing the evaluation team CARRYING OUT THE EVALUATION The evaluation team takes over and gets the job done, but the Project Manager keeps in touch Receiving the draft report USING THE EVALUATION Getting the management response Decisions and action based on the report Publishing the report

31 Why develop an evaluation plan? It provides a road-map Guides evaluation activities Explains what, when, how, why, who Focuses the evaluation Documents the evaluation process for all stakeholders Ensures implementation fidelity

32 Developing an Evaluation Plan

33 Step 1 Engage Stakeholders Who needs to be at the planning table? Who will use the evaluation information? What information do they want/need? What are their interests? How will you get these people to the planning table?

34 Step 2 Know your program well Background and description of program What problem does your program address? What are the causes and consequences of the problem? What is the magnitude of the problem? What changes or trends are occurring that affect the problem? Context What are the political, social or other factors that affect your program?

35 Step 3 Define your intervention What is your strategy? What is your goal? A broad statement that describes what you hope to achieve through your efforts What are your objectives? Specific, measurable steps that can be taken to meet the goal

36 Step 4 Focus Your Evaluation Define key questions for your evaluation. What do you want to know? What do other stakeholders want to know? What is the purpose of your evaluation?

37 Step 4 Focus Your Evaluation What types of evaluation data do you need? Formative What do I need to get started? Process What information/activity do I want to track or monitor? Outcome What will happen as a result of the service or intervention?

38 Step 4 Focus Your Evaluation Define your outcomes: What changes do you expect to see? Short-term Change in knowledge, attitudes, beliefs Intermediate Change in behaviors Long-term Change in conditions, status

39 Step 4 Focus Your Evaluation What are your indicators? Observable evidence of accomplishments, changes made, or progress achieved Measurements that help you answer whether or not you have been successful in achieving your outcomes

40 Step 4 Focus Your Evaluation Things to consider when selecting methods: 1) Indicators - what methods will best measure your outcomes? 2) Target population - what methods are most appropriate/feasible? 3) Resources - how much money do we have for data collection?

41 Targets and Benchmarks Target: a numerical objective for a program's level of achievement on an indicator (a projection) Benchmark: performance data used for comparison (past agency data or an industry standard)

42 Targets and Benchmark example
Outcome (initial): Parents learn what their children are capable of doing
Indicator: # of parents that demonstrate increased knowledge of child development through pre-post test on 5 key issues after attending workshops
Target: 200
Benchmark: Generally we see parents each year. We believe that 90% of parents will show some improvement on the pre-post test. Last year 95% of parents showed increased knowledge
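To make the target/benchmark comparison concrete, here is a minimal Python sketch; the participant scores, the 200-parent target, and the 95% benchmark rate are illustrative stand-ins, not actual program data.

```python
# Hypothetical sketch: checking pre-post workshop results against a target and a benchmark.
# Scores, target, and benchmark below are made-up illustrations, not program data.

pre_scores = {"p01": 2, "p02": 3, "p03": 1, "p04": 4}    # correct answers on 5 key issues, before
post_scores = {"p01": 4, "p02": 3, "p03": 3, "p04": 5}   # same parents, after the workshops

improved = [p for p in pre_scores if post_scores[p] > pre_scores[p]]
rate = len(improved) / len(pre_scores)

target_count = 200        # numerical objective (a projection)
benchmark_rate = 0.95     # e.g., last year 95% of parents showed increased knowledge

print(f"{len(improved)} of {len(pre_scores)} parents improved ({rate:.0%})")
print("Count target met:", len(improved) >= target_count)
print("Benchmark rate met:", rate >= benchmark_rate)
```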

43 Family Health International - FHI

44

45

46 Goals & Objectives Goals & objectives provide the basis to judge whether a program is a success or failure Goals: ultimate results of the project, sometimes unreachable in the short term A goal represents a general, big-picture statement of desired results, usually written in broad-based statements Objectives: measurable, time-specific results that you expect to accomplish, based on realistic expectations and more narrowly defined than goals

47 SMART Goals and Objectives S Specific M Measurable A Attainable or Achievable R Relevant T Time-bound

48 Goals When writing goals answer the following: What impact does the problem have? Does it have national, regional and/or local significance? What are the benefits to the community if the problem is solved? What will happen if the problem is not solved?

49 Objectives When writing objectives, address the following: State in quantifiable terms Describe the outcomes Include only one result per objective Clearly identify the group benefiting from the planned project Include statements that are realistic and capable of being accomplished Do not describe methods

50 Objectives Describe specific outcomes that, if achieved, enable one to see that the objective has been met A statement of the specific behaviors and accomplishments Specification of the success criteria for these behaviors

51 Writing useful objectives Four techniques are helpful for writing useful objectives: Using strong verbs: describe an observable or measurable behavior, e.g., to increase the use of educational materials Stating only one purpose or aim: most programs will have multiple objectives, but within each objective only one aim should be stated, e.g., start three prenatal classes for pregnant women Specifying a single end-product or result: e.g., to establish a subcontract with the City Hospital (a lower-order objective) Specifying the expected time for achievement: e.g., for the objective above, a specific time frame such as between March 1st and May 30th

52 Examples of objective-setting
Objective: Provide information on hypertension treatment to improve physicians' knowledge of treatment of hypertension. Operational indicator: A 25% improvement in post-test versus pretest exam scores.
Objective: Increase the percentage of recommended physician practices in the treatment of hypertension. Operational indicator: A 30% increase in the percentage of recommended physician practices performed, postprogram versus preprogram.
Objective: Lower the age-adjusted blood pressure of patients with hypertension. Operational indicator: Ninety percent of all treated patients should have blood pressures within age-adjusted normal levels within 12 months.

53 Programs and Evaluations A program is a theory and an evaluation is its test. In order to organize the evaluation to provide a responsible test, the evaluator needs to understand the theoretical premises on which the program is based (p. 55). Carol Weiss (1998)

54 Program theory Program theory describes what the intended intervention is expected to do and the underlying rationale (whether explicit or implicit) for achieving the expected results: a cause-effect relationship A theory-based approach allows a much more in-depth understanding of the workings of a program or activity (the program theory) Logic models flow from program theory

55 Logic Models A logic model is a diagram and text that describes/illustrates the logical (causal) relationships among program elements and the problem to be solved, thus defining measurements of success. We use these resources For these activities To produce these outputs So that these clients can change their ways Which leads to these outcomes Leading to these results! Your planned work Your intended results

56 Components of Logic Model Resources: human, financial, organizational, community resources for a program (also known as inputs) Program activities: processes, tools, events, technology, and actions Outputs: direct products of the activities; may include types, levels or targets of services delivered by the program Outcomes: specific changes in program participants' knowledge, skills, behaviors, functioning; short-term goals may be achieved within 1-3 years, long-term within 4 to 6 years Impact: fundamental intended or unintended changes occurring in organizations, communities or systems as a result of activities, in a span of 7 to 10 years
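As a rough illustration of how these components fit together, here is a minimal Python sketch of a logic model as a plain data structure; the field names and example entries are assumptions for illustration, not a standard schema from the workshop.

```python
# Hypothetical sketch: a logic model captured as a simple data structure.
# Field names and example entries are illustrative only.
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    resources: list[str] = field(default_factory=list)   # inputs: human, financial, organizational
    activities: list[str] = field(default_factory=list)  # processes, events, actions
    outputs: list[str] = field(default_factory=list)     # direct products of the activities
    outcomes: list[str] = field(default_factory=list)    # changes in knowledge, skills, behavior
    impact: list[str] = field(default_factory=list)      # longer-term changes in communities/systems

# Example loosely based on the training model shown later in the deck.
training = LogicModel(
    resources=["money", "staff", "volunteers"],
    activities=["training", "education", "counseling"],
    outputs=["total # of classes", "participants completing the course"],
    outcomes=["new knowledge", "increased skills", "changed attitudes"],
    impact=["trainees earn more over five years"],
)
print(training.outputs)
```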

57 Clarification - Outputs vs. outcomes Example: the number of patients discharged from a state mental hospital is an output; the percentage of discharged patients who are capable of living independently is an outcome Not how many worms the bird feeds its young, but how well the fledgling flies (United Way of America, 1999)

58 Why create logic models? Logic model development offers the following benefits: Clearly identify program goals, objectives, activities, and desired results Clarify assumptions and relationships between program efforts and expected outcomes Communicate key elements of the program Help specify what to measure in an evaluation Guide assessment of underlying project assumptions and promote self-correction

59 If-then relationships Underlying the logic model is a series of if-then relationships (IF... then... IF... then...) that express the program's theory of change Source: University of Wisconsin-Extension, Program Development and Evaluation

60 Logical chain of connections showing what the program is to accomplish
INPUTS - program investments: what we invest
OUTPUTS - activities: what we do; participation: who we reach
OUTCOMES - short-term, medium-term, long-term: what results
Source: University of Wisconsin-Extension, Program Development and Evaluation

61 Example of a Training Model
Inputs (resources): money, staff, volunteers, supplies, space
Activities (services): training, education, counseling
Outputs (products): total # of classes, hours of service, number of participants completing the course
Outcomes (benefits): new knowledge, increased skills, changed attitudes, new employment opportunities
Impacts (changes): trainees earn more over five years than those not receiving training; trainees have a higher standard of living than the control group

62 Components of a HIV programme / project
Program level:
- Inputs (resources): staff; drugs, FP supplies; equipment
- Processes (functions, activities): training, logistics, services
- Outputs: % of facilities offering the service, % of communities with outreach, # of trained staff; utilization: # of new clients, # of return clients
Population level:
- Outcomes (short-term): contraceptive use, # of HIV+ on ART
- Outcomes (long-term): infection rate, mortality, fertility

63 How will activities lead to desired outcomes? A series of if-then relationships - Tutoring Program Example: IF we invest time and money, THEN we can provide tutoring 3 hrs/week for 1 school year to 50 children; IF we provide that tutoring, THEN students struggling academically can be tutored; IF they are tutored, THEN they will learn and improve their skills; IF they improve their skills, THEN they will get better grades; IF they get better grades, THEN they will move to the next grade level on time Source: University of Wisconsin-Extension, Program Development and Evaluation

64 Basic Logic Model Outcomes and Impacts should be SMART: Specific Measurable Action-oriented Realistic Timed

65 Example Programs, Outputs and Outcomes (from Poister, 2003)
Crime control - Outputs: hours of patrol, # responses to calls, # crimes investigated, arrests made. Outcomes: reduction in crimes committed; reduction in deaths and injuries resulting from crime; less property damaged or lost due to crime
Highway construction - Outputs: project designs, highway miles constructed, highway miles reconstructed. Outcomes: capacity increases, improved traffic flow, reduced travel times, reduction in accidents and injuries
Source: University of Wisconsin-Extension, Program Development and Evaluation

66 Part IV: Data Collection

67 Data collection (overview diagram linking program or policy, evaluation questions, design, approaches, methods, and data collection)

68 Data collection Establish a clear timeline for data collection Be clear about your sampling strategy (everyone vs. a subsample) and think about the pros and cons at the beginning Develop clear protocols Train staff Periodically monitor the quality of the data Protect the confidentiality of participants
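If a subsample is used, one simple and documentable way to draw it is a seeded random sample; the sketch below is illustrative only (participant names, sample size, and seed are assumptions, not workshop material).

```python
# Hypothetical sketch: drawing a reproducible random subsample of participants
# when collecting data from everyone is not feasible.
import random

participants = [f"participant_{i:03d}" for i in range(1, 201)]  # e.g., 200 enrolled clients

random.seed(2010)                       # fixed seed so the draw can be documented and repeated
subsample = random.sample(participants, k=50)

print(f"Selected {len(subsample)} of {len(participants)} participants")
print(subsample[:5])
```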

69 Indicators Indicators are the specific pieces of information that track a program's success; they are how you know that something has changed. Effective indicators are: Measurable Meaningful Manageable Clear As unbiased as possible Sensitive to change Acceptable to stakeholders

70 Outcome indicators - Example
Initial outcome: Parents learn what their children are capable of doing. Indicator: # of parents that demonstrate increased knowledge of child development through pre-post test on 5 key issues after attending workshops
Intermediate outcome: Parents participate in their child's education. Indicator: # of parents that attend one school-based event in addition to PTA conferences
Long-term outcome: Children are ready for school. Indicator: # of children that are developmentally ready based on a standardized child development assessment tool

71 Steps to writing a good indicator Identify exactly who you hope to benefit (WHO?) Identify the specific, observable change or accomplishment (WHAT?) Determine when the outcome is expected to occur (BY WHEN?) Indicator example: WHO - # of parents WHAT - demonstrate increased knowledge of child development BY WHEN - after attending the workshop

72 The Evaluator's dilemma "Not everything that counts can be counted and not everything that can be counted, counts." - Albert Einstein

73 Generating Evaluation questions Meaningful questions are the heart of evaluation Qualities of good questions include: Relevant to the purpose of the evaluation and program goals so that they are useful for important decisions; Important to the identified audience(s); Comprehensive enough to provide adequate information about what is being evaluated; Constructed in ways that the information is balanced and not biased; Answerable with realistic means and at a reasonable cost.

74 Three kinds of questions Descriptive: seek to describe the program or process; answer questions such as who? what? where? when? how? and how much/many? Normative: compare the current situation against a specified target, goal or benchmark; there is a standard or criterion against which to compare achieved performance - they look at what is and compare it to what should be Cause-effect: seek to determine the effects of a program - is it the intervention that has caused the results, or something else?

75 Data sources & Data collection Data source: who or where will I get the information from? Data collection method: what is the tool or method for collecting the data? How is the tool administered? How often is information collected?

76 Data for Evaluation Can be produced using any of the standard social-science research techniques Surveys Focus groups In-depth interviews Client-provider observation Key informants Surveillance Plus, analysis of program data (administrative, financial, service statistics)

77 Data sources - pros & cons
Program records (yours or others) - Example: report cards, certificates, referrals. Advantages: available, accessible. Disadvantages: the value of the data depends on how carefully it is recorded (completeness, currency, etc.)
Specific individuals or trained observers - Example: teachers' reports on student behavior, case manager, client. Advantages: provides a first-hand account. Disadvantages: can be biased by interpretation or perceived pressure
Mechanical measures - Example: blood test, scale. Advantages: relatively objective. Disadvantages: findings affected by the accuracy of the device

78 General rules for data collection Use available data if you can If using available data, be sure to find out how earlier evaluators collected the data, defined the variables, and ensured accuracy of the data If you must collect original data: establish procedures & follow them; maintain accurate records of definitions & coding; pre-test, pre-test, pre-test; verify accuracy of coding and data input

79 Key issues about measurements Are the measures credible? Trustworthy or believable data collection Are the measures valid? Are the questions asked giving you information about the issues you want to know about? Are the measures measuring what counts, as opposed to what is easiest? Are the measures reliable? Stability of measures - they measure the same thing the same way in repeated tests Are the measures precise? How well the language in data collection matches the measure

80 Quantitative & qualitative data Quantitative approach: more structured; attempts to provide precise measures; reliable; easier to analyze Qualitative approach: less structured; provides rich data; challenging to analyze; labor-intensive to collect; usually generates longer reports

81 When to use quantitative vs. qualitative approaches
Use a quantitative approach if you: want to do statistical analysis; want to be precise; know exactly what you want to measure; want to cover a large group
Use a qualitative approach if you: want anecdotes or in-depth information; are not sure what you want to measure; do not need to quantify

82 Data Collection Methods The data collection method provides a description of the process for collecting the information Example A: annual review of program records of referrals sent for housing subsidy Example B: caseworkers rate the family each month during a home visit Example C: Tool = self-administered questionnaire; Distribution = sent via mail with a stamped return envelope; Frequency = sent 90 days after completion of the program

83 Data Collection Example
Outcome (initial): Parents increase knowledge
Indicator: # of parents that demonstrate increased knowledge of child development through pre-post test on 5 key issues after attending workshops
Target: 200
Data source: the parents who participate in at least 2 sessions
Data collection method: written or online survey distributed at the 1st class and again at the last class; parents who did not complete both tests are not included in the final results
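A minimal sketch of the matching rule in that example, i.e., counting only parents who completed both the pre-test and the post-test; the names and scores are invented for illustration.

```python
# Hypothetical sketch: only parents who completed BOTH the pre-test and the
# post-test are counted in the final results. Names and scores are illustrative.

pre = {"parent_a": 2, "parent_b": 3, "parent_c": 1}    # score out of 5 key issues, first class
post = {"parent_a": 4, "parent_c": 3, "parent_d": 5}   # last class; parent_b did not complete it

completed_both = set(pre) & set(post)                  # drop incomplete pre/post pairs
improved = [p for p in completed_both if post[p] > pre[p]]

print("Included in final results:", sorted(completed_both))
print("Parents demonstrating increased knowledge:", len(improved))
```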

84

85

86 Human Subjects Protection Vulnerable populations Confidentiality HIPAA Data security Data storage

87 Closing thoughts The only man who behaves sensibly is my tailor; he takes my measurements anew every time he sees me, while all the rest go on with their old measurements and expect me to fit them - George Bernard Shaw

88 Questions?
