EVALUATING THE IMPACT OF MEDIA INTERVENTIONS IN CONFLICT COUNTRIES
CAUX CONFERENCE CENTER, SWITZERLAND
DECEMBER 13-17, 2010


Executive Summary

The purpose of this paper is to start the conversation about methods of evaluating media interventions in conflict countries. What makes monitoring and evaluating the impact of media assistance programs difficult? There are many possible answers, but four related answers are of particular interest for this paper. First, evaluating media impact is difficult because of the long-term nature of media development. Second, the lack of accepted indicators of impact that go beyond merely counting activities makes it difficult to show the true outcome of programs in quarterly or monthly reports. Third, there are too few people trained in the data collection and analysis stages of M&E (monitoring and evaluation). A fourth, cross-cutting issue is that the media development sector appears to lack a process of inquiry that standardizes research methods.

This paper suggests one way forward to address the four issues noted above. It is premised on the observation that media assistance programs are difficult to evaluate because donors, media implementers, and media organizations themselves have yet to embrace the foundations of research inquiry and measurement. This gap is accentuated by the difficult field environments in which media assistance organizations operate. Measuring the impact of interventions is a challenge, but it is not an insurmountable one. One way to minimize the challenges is to take a holistic approach to M&E. Monitoring and evaluation needs to be part of all stages of the media assistance process. Donors should consider it when they write RFAs (Requests for Applications). M&E needs to be considered at the conception stage of a proposal, be written into the proposal, and be adequately resourced in the budget. From the first day of the grant/contract award, M&E needs to be enacted at all levels of the winning organization and in the relationship with the donor. The following pages explain a process of inquiry and planning that will help organizations to integrate M&E into their proposals and project start-up.

To help guide useful M&E, the paper proposes five social science research methodologies that can be used in monitoring and evaluating higher-level media assistance program outcomes. Staff members can use these methods to collect and analyze outcomes of media assistance programs:

Content Analysis provides evidence of long-term training impact that goes beyond merely reporting the number of journalists trained or the assistance given to non-state media organizations. It can provide a participatory method for making media organizations more aware of their overall performance in creating excellent news and information that is valuable to the public.

Delphi Panels provide participatory expert opinion, both qualitative and quantitative, on sector-wide issues. They can be used to identify problems, rank-order possible solutions, or evaluate progress toward higher-level outcome objectives such as legal environments, economic sustainability, and future directions of the media sector.

Focus Groups provide qualitative evidence of the impact of programs on the end users (the public). They are especially valuable when combined (triangulated) with a quantitative method that provides numeric data on public perceptions and behaviors.

Network Analysis provides quantitative and visual representations of the relationships among organizations that operate in the media sector. It can be used as a diagnostic tool at the baseline stage and then to track progress in a sector. It can answer questions about capacity, collaboration, and inter-organizational cooperation.

Survey Research provides quantitative evidence of the awareness, attitudes, and behaviors of the end users of media assistance programs (the public). Survey data can identify the most trusted media organizations and detect consumption patterns that can be used to develop content that draws viewers, listeners, and readers and fosters the basis of a vigorous advertising market in a country.

These empirical methods can be streamlined for use in the field. They can provide reliable data that help media development organizations plan, adjust, and evaluate impact.

Introduction

Conducting field research in a conflict environment is always difficult. Researchers have to deal with security concerns, cultural uncertainties, and political issues. The problem of field research is compounded when it is part of a monitoring and evaluation plan. Monitoring and evaluation requires that data be collected at certain times, and the neat timeline of the work plan is not always realistic for data collection. While there are many different challenges to methods of evaluation in conflict countries, the following 10 statements provide a starting point for discussions among donors and media development organizations:

1) Conflict environments present challenges to data collection, including security and the willingness of people to participate in research.
2) Researchers with academic training need to be flexible and adjust to the realities of field research. Control, a foundation of social science, is difficult to maintain in light of the complex factors that influence the execution of a research study in a conflict zone.
3) Culture influences how research is conducted and how, and if, citizens participate in research.
4) In-country field staff may not have the skills or experience to conduct reliable research. There are too few researchers who work in M&E.
5) There is too little time and too little money dedicated to monitoring. M&E requirements outlined in RFAs are often not fully funded.
6) Home office support often does not assist in monitoring and evaluation.
7) Different forms of media development projects require different types of data collection. No one method is sufficient for telling the story of a media program.
8) There is a missed connection between those who feed information into the M&E process and the use of M&E data for programmatic decision-making.
9) Baselines are often not collected in the rush to program start-up.
10) Causal arguments are difficult to make between a media development intervention and positive change in the media environment.

Many of these challenges can be overcome. Media development organizations need to take a holistic approach that involves home office support, training staff in the field, regular use of monitoring data in programmatic decision-making, and meaningful evaluations that hold partners and grantees accountable to the results generated by M&E.

The process of M&E is often viewed as a tactical effort. It is something that has to be done as part of the cooperative agreement or contract. As an afterthought, consultants are often called in at the midpoint or end of a project to identify its impact. A common request is: "Please tell us what kind of impact we are having on the media environment." Without baseline data and standardized data collection methods, it is difficult to write a report that does more than repeat information found in the quarterly reports: the number of people trained, the number and type of sub-grants, and so on. After 15 years of working as an evaluator, I posit that M&E might be more valued if it were viewed as a process of inquiry. The process of inquiry poses questions, outlines methods, and articulates what will count as evidence in a research study. There are a variety of social science methods and research questions that can help media development organizations to plan, monitor, evaluate, and adjust programmatic activities.

This paper will identify and explain the different empirical methods that can be adapted into easy-to-use, field-proven, and reliable M&E tools. The paper concludes with recommendations for donors and media implementers on how to holistically plan the process of M&E inquiry. The social science research methods that are most appropriate for evaluating media interventions in conflict countries include Content Analysis, Delphi Studies, Focus Groups, Network Analysis, and Survey Research. Other useful methods exist but are beyond the scope of this short discussion paper.

Content Analysis

Background on the Method

Content analysis is a research technique for the objective, systematic, and quantitative description of the manifest content of communication (Berelson, 1952, p. 18). It allows the researcher to make inferences by objectively and systematically identifying specified characteristics of messages (Holsti, 1969, p. 14). The method measures the impact of both mid-level and higher-level M&E objectives. The key to content analysis is to create a scientific way to identify and measure specific content features of news. The method should be participatory in that representatives from media organizations can help to develop the categories for analysis and participate in the development of a country-specific rating scale that reflects the level of development of that particular media environment. Additionally, media support organizations (NGOs) can be trained in content analysis as a way to increase their sustainability.

Research Questions and IRs/Components that Content Analysis Can Inform

Many media development programs identify improved professionalism in the media as an intermediate result (IR) or component. This mid-level result is premised on the idea that training creates improved content and that, in the long run, improved content helps citizens make better decisions. Content analysis answers questions about whether, and how, journalists or news organizations are improving their content. It can also show how news coverage changes over time.

Steps to Conduct a Content Analysis

1. Identify the universe of texts (what type of media).
2. Identify the unit of analysis (media organization, journalist, program, story).
3. Identify categories (deductive or inductive development).
4. Practice coding until coders achieve intercoder reliability (see the sketch below).
5. Code the units of analysis.
6. Analyze and use the results to improve training.
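Step 4 hinges on a quantitative check of intercoder reliability. Purely as an illustration (the paper does not prescribe a particular statistic or tool), the short Python sketch below computes percent agreement and Cohen's kappa for two coders who have scored the same sample of stories. The story codes and category labels are hypothetical.

# Illustrative sketch: percent agreement and Cohen's kappa for two coders.
# The codes below are hypothetical; categories might be story-quality labels
# such as "balanced", "single-source", or "opinionated".
from collections import Counter

coder_a = ["balanced", "single-source", "balanced", "opinionated", "balanced", "single-source"]
coder_b = ["balanced", "single-source", "opinionated", "opinionated", "balanced", "balanced"]

n = len(coder_a)

# Percent agreement: share of stories where both coders chose the same category.
agreement = sum(a == b for a, b in zip(coder_a, coder_b)) / n

# Cohen's kappa corrects agreement for chance, using each coder's category distribution.
freq_a = Counter(coder_a)
freq_b = Counter(coder_b)
expected = sum((freq_a[c] / n) * (freq_b[c] / n) for c in set(coder_a) | set(coder_b))
kappa = (agreement - expected) / (1 - expected)

print(f"Percent agreement: {agreement:.2f}")
print(f"Cohen's kappa:     {kappa:.2f}")

Rule-of-thumb thresholds vary; the point is simply that coder training continues until whatever reliability statistic the team has agreed on stabilizes at an acceptable level.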

Using Content Analysis to Evaluate M&E Outcomes

Media implementers are already reviewing the content of their partners and grantees. Someone in the field office reads, watches, or listens to the news and programming of partners. Content analysis merely systematizes the process. Content analysis addresses M&E indicators such as improved professionalism of targeted media, and its findings can be treated as an outcome indicator. In past M&E projects, content analysis has been used to examine specific features of excellent news generated by media partners, assist junior reporters in writing better stories, and evaluate long-term media training outcomes (pre- and post-training content analysis; a minimal example appears at the end of this section).

Issues in Content Analysis

Content analysis requires medium to high resources. Coders need to be trained (sometimes up to 40 hours). Media organizations have to commit to providing samples of their content, and coders have to provide reliable scores and useful qualitative comments to help the implementer make sense of the numbers. Organizations need to decide whether coding will be assigned to a member of the team or whether outside experts will be hired. Baselines need to be collected before significant trainings occur. Finally, content analysis findings are often disputed by the media organizations.

Lessons Learned in the Field

I have used content analysis in six different conflict or transitional countries to measure improved professionalism of the media. In my capacity as a consultant, I have supervised the analysis of tens of thousands of stories in print, online, television, and radio outlets. I have trained coders and written reports based on their findings. I would like to share two lessons learned:

1) Allow the media organizations that are being monitored to help shape the coding categories and operationalizations. The best way to ensure that media partners benefit from content analysis is to encourage them to buy in to the method as a business tool that will help them develop more marketable content for viewers and advertisers.

2) Front-load the training of the coders and devote resources to retraining every six months. Content analysis training needs to be long enough to ensure intercoder reliability. Online training has not been effective for me. It is a good idea to revisit the categories and definitions every six months to prompt useful discussions of what is working and what needs to be revised. Coders should be trained in the analysis and reporting of data so they can see the connections between their scores and programmatic planning.

Take Away

Content analysis provides a quantitative and qualitative tool to create a baseline measure of content, track changes in media professionalism, and plan future trainings. The field of media development should standardize content analysis as an accepted measure of media content. Donors should support it, and implementers should require it in any project that has a media professionalism component.
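The pre- and post-training comparison mentioned above can stay very simple. The Python sketch below is offered only as an illustration with made-up professionalism scores on a hypothetical 1-5 rating scale; it compares stories coded at baseline with stories coded after a training cycle.

# Illustrative pre/post comparison of coded professionalism scores (hypothetical data).
# Scores assume a country-specific 1-5 rating scale agreed with the media partners.
from statistics import mean, stdev

baseline_scores = [2.1, 2.4, 1.8, 2.6, 2.0, 2.3, 2.5, 1.9]   # stories coded before training
followup_scores = [2.9, 3.1, 2.7, 3.4, 2.8, 3.0, 3.3, 2.6]   # stories coded a year later

def summarize(label, scores):
    print(f"{label}: n={len(scores)}, mean={mean(scores):.2f}, sd={stdev(scores):.2f}")

summarize("Baseline ", baseline_scores)
summarize("Follow-up", followup_scores)
print(f"Change in mean score: {mean(followup_scores) - mean(baseline_scores):+.2f}")

Whether the change is reported as a difference in means or as the share of stories above an agreed threshold should follow whatever the relevant indicator in the monitoring plan specifies.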

The Delphi Method

Background on the Method

The Delphi method consists of questioning experts by means of multiple questionnaires in order to reveal consensus on a topic. Experts must have specific knowledge of the subject and be prepared to become involved in this type of procedure. The panel can range from 15 to 50 persons. A snowball or network sample can help to identify the experts. In the field, a smaller sample can be used depending on time and availability.

Research Questions and IRs/Components that the Delphi Method Can Inform

Delphi panels answer questions about expert perceptions of a sector or of progress in the sector. For instance, some media development programs work toward an improved regulatory environment or improved sustainability of media organizations. These types of IRs (intermediate results) are higher-level outcomes, and there are very few tools that actually capture progress toward them. The Delphi method provides expert opinion (both qualitative and quantitative) to assess improvements.

Steps to Conduct a Delphi

1) Identify experts.
2) Questionnaire #1 includes two or three semi-open and open questions and some Likert or quantitative questions.
3) Analysis of the first questionnaire determines the general tendencies of the group and the most extreme answers.
4) Questionnaire #2 shares a brief written report outlining the panel's perceptions and levels of agreement and disagreement. Each participant is asked to provide new answers to the questions and to justify why their answer differs from the general agreement of the group.
5) Questionnaire #3 (not always needed) is specifically for those who have extreme opinions; it is one last chance for participants to explain their position. This is a key step to avoid groupthink. Sometimes these answers are the most valuable.
6) The summary and final report highlight general areas of agreement and also identify areas of disagreement. The report can integrate quotes from the participants and present the percentages of agreement on topics.

Using the Delphi Method to Evaluate M&E Outcomes

Delphi panels solicit opinions from experts. The Delphi method can be used as an outcome indicator. It can be useful for gathering data on mid-level media assistance objectives such as an improved regulatory environment. The use of a Delphi panel at the beginning of a program can set an important qualitative and quantitative baseline that can be used to track progress toward program impact. Delphi panels can be convened each year to provide data for annual indicators. A simple tally of the first-round quantitative answers (see the sketch below) can feed the written report that goes back to the panel.
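The round-one analysis described in step 3 can be done with a few lines of code. The Python sketch below is only an illustration with hypothetical Likert responses (1 = strongly disagree, 5 = strongly agree) to a single statement; it reports the group tendency, the level of agreement, and the most extreme answers that round two would ask participants to justify.

# Illustrative first-round Delphi summary for one Likert statement (hypothetical data).
# Example statement: "The current broadcast law adequately protects independent media."
from statistics import mean, median

responses = {          # expert ID -> Likert score (1-5), hypothetical
    "E01": 2, "E02": 1, "E03": 2, "E04": 3, "E05": 2,
    "E06": 5, "E07": 2, "E08": 3, "E09": 1, "E10": 2,
}

scores = list(responses.values())
agree_share = sum(s >= 4 for s in scores) / len(scores)      # share who agree or strongly agree

print(f"Mean: {mean(scores):.2f}  Median: {median(scores)}  Agreement (4-5): {agree_share:.0%}")

# Flag the most extreme answers: these experts are asked in round two to explain their position.
med = median(scores)
outliers = [e for e, s in responses.items() if abs(s - med) >= 2]
print("Experts asked to justify their score in round two:", ", ".join(outliers))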

Issues to Consider When Using Delphi Panels for M&E

Delphi methods require a medium amount of time and organizational resources. At least one dedicated staff member should be responsible for communication with, and facilitation of, the panel. A Delphi panel may take up to three months to complete, making it difficult to create a baseline in the first quarter of a new project. Delphi panels require that experts weigh in on the topic under consideration. Experts are usually busy people with multiple obligations. The Delphi method needs to ensure that the same experts participate in all rounds of the panel. Participant attrition often occurs when members drop out or fail to return questionnaires. Small honorariums paid at the end of the panel can help to retain participants.

Lessons Learned in the Field

I have not conducted a Delphi panel myself but have participated in a few of them as an academic expert. A review of articles that have used the method suggests the following lesson learned:

1) The first summary is the most important part of securing long-term participation. It needs to be concise, use a variety of quotes, and capture the complexities of the situation. A poorly written analysis is the fastest way to lose participation in round two.

Take Away

Delphi panels can set an important qualitative and quantitative baseline that can be used to track progress toward higher-level program outcomes.

Focus Groups

Background on the Method

Focus groups are a useful tool to reveal underlying cognitive or ideological premises that structure arguments, and the ways in which various discourses rooted in particular contexts and given experiences are brought to bear on interpretations (Lunt & Livingstone, 1996, p. 96). As a research method, focus groups can provide a qualitative evaluation of the impact of a program or intervention (Morgan, 1997, p. 3).

Research Questions and IRs/Components that Focus Group Data Can Inform

Focus groups are useful for answering questions about how the end users of media messages, the public, perceive and use media. These data can directly address mission-level or larger objectives, such as "citizens have access to reliable information to make political and economic decisions."

Steps to Conduct Focus Groups

Focus groups that are homogeneous on demographic categories such as gender, age, ethnicity, and educational level are best for creating a comfortable environment. Local NGO partners can usually find focus group participants, and the field office must decide whether it wants to provide monetary support to the NGO or to the participants. Refreshments are always offered. As a capacity-building function, expert consultants can train local nationals in how to conduct focus groups.

Using Focus Groups to Evaluate M&E Outcomes

Focus groups can be used to address indicators that seek to measure public perceptions and experiences on certain topics (improved security, health messages, access to information). Focus group data can be treated as an outcome indicator if a focus group baseline has been set. In M&E, the same people do not have to be involved in the periodic focus groups convened throughout the project.

Issues to Consider When Using Focus Groups in M&E

Focus group methods can require medium to high amounts of time and organizational resources. A lot of attention must go into the set-up, facilitation, and synthesis of the findings of the focus group. A trained moderator may charge $500 to facilitate one group and then charge even more to provide a useful report. More than one focus group is usually needed to capture the perceptions of different groups (young, old, professional, urban, rural, different ethnicities). Focus group data are not generalizable: you cannot generalize the words or experiences of those who attended the focus group to the wider population. But focus groups are valuable because they provide the narratives and experiences of the people who are the end users of media development programs. Focus groups with certain groups are difficult to convene. Women may need to bring their husbands or children to the meeting. Fear of being recorded may make honest answers to questions more problematic. Finally, there is a tension in a conflict-prone environment between the use of private places to discuss experiences and allegations that private meetings are security risks.

Lessons Learned in the Field

After conducting focus groups in several different conflict environments, I have concluded the following lessons learned:

1) Focus group questions and secondary probes need to specifically ask for information that can be used to address the indicator of interest. If the moderator asks too many open questions ("tell me about..."), the answers will not be specific enough. A better way to ask about media usage is to ask, "Tell me how you used information from the media to..."

2) Homogeneity (gathering similar people) means different things to different people. I have worked with dozens of NGOs to organize focus groups. To recruit the correct types of people, they need to know as many specifics as possible. The researcher should require a list of the demographics of the people invited. In one situation, I asked for a group of demobilized soldiers and got a group of 17 men ranging from 17 to 70 years old. The NGO invited men who needed the money. For the NGO, getting people, any people, into the focus group was more important than getting the right people.

3) There should be two people who facilitate a focus group. At least one of them needs to be fluent in the language of the participants. The focus group moderator needs to be assisted by someone who can manage the people part of the focus group.

4) You need to gain permission to record the focus group. If the situation does not permit a recording, at least two people need to take notes.

5) Triangulation of focus group answers with another data set (survey, content analysis) is a best practice.

Take Away

Focus groups are useful for answering questions about how specific publics perceive and use media. The words of the focus group participants provide qualitative data and a narrative that contextualizes program outcomes.

Network Analysis

Background on the Method

Social network methodology studies a system of actors/organizations that have relationships. The structure of a network (the relationships among organizations) influences the ability of one organization, or a group of organizations, to function, solve problems, and enact change in a conflict environment. In many conflict environments, traditional relationships have been damaged and new relationships are required. Network analysis can help to build and strengthen relationships. Network analysis uses survey questions to ask members of a system to report on their relationships with other organizations. Through a software package such as UCINET, a picture of the network emerges. The data from network analysis provide a tool to measure the density or strength of a sector (such as media support organizations). More importantly for M&E, the data can diagnose where there are missed opportunities for resource sharing and identify the best and worst partners in a system.

Research Questions and IRs/Components that Network Analysis Can Answer

Network analysis provides answers to macro, sector-level questions about mobilizing and building cooperative relationships among organizations. Many media assistance programs identify increased capacity of a sector as a mid-level or intermediate result. To date, there have been no tools to measure this outcome. Network analysis can provide a quantitative measure of the strength of relationships in a sector (legal, media support). It can be used to identify baseline relationships and track changes throughout the course of a program, helping the implementer show how program activities are creating increased capacity.

Steps to Conduct a Network Analysis

The network can be identified in a few different ways. The implementer or donor can create a roster of partner or target organizations; this produces a list of organizations that, in theory, should be communicating and cooperating. In a larger system, the network can be identified using a name-generating methodology: participants are asked in interviews to identify the organizations they interact with or depend on for resources.

Once the network has been identified, a survey asks about the characteristics of the named organizations. Possible Likert-type questions may inquire about trust, how much information or how many resources they receive from other organizations, the cooperative or competitive nature of the other organizations, and the degree to which each organization in the network is viewed as important to the sector. The results describe the nature of the relationships among organizations.

Using Network Analysis to Evaluate M&E Outcomes

Network analysis can be used to address indicators that seek to measure the strength or evolution of a sector. For instance, many development projects seek to build up the capacity of a media law sector or develop a network of broadcast stations that share content and advertising revenue. It is difficult to quantitatively measure progress toward these objectives. Network analysis is valuable because it can show the impact of project activities that bring groups together, foster improved communication and cooperation, and improve the capacity of a sector. Network analysis data can be treated as an outcome indicator if a baseline has been set. A simple density calculation, sketched below, illustrates the kind of sector-strength number such an indicator can report.
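The paper points to UCINET for this kind of analysis; purely as an illustration, the sketch below does the same basic density calculation with the open-source Python library networkx instead. The organizations and reported ties are hypothetical; in practice the edge list would come from the network survey described above.

# Illustrative density calculation for a hypothetical media-support network.
# The paper's examples use UCINET; networkx is used here only for illustration.
import networkx as nx

# Directed ties: organization A reports receiving information/resources from organization B.
reported_ties = [
    ("Radio Assn", "Media Law NGO"),
    ("Media Law NGO", "Radio Assn"),
    ("Journalist Union", "Radio Assn"),
    ("Training Center", "Journalist Union"),
    ("Training Center", "Media Law NGO"),
]

G = nx.DiGraph()
G.add_nodes_from(["Radio Assn", "Media Law NGO", "Journalist Union", "Training Center", "Press Club"])
G.add_edges_from(reported_ties)

# Density = reported ties divided by all possible ties; one number to track against the baseline.
print(f"Organizations: {G.number_of_nodes()}, reported ties: {G.number_of_edges()}")
print(f"Network density: {nx.density(G):.2f}")

# Isolates are organizations no one reports working with -- candidates for outreach.
print("Isolated organizations:", list(nx.isolates(G)))

The same graph object can also be drawn (for example with nx.draw) to produce the kind of network image that the lessons learned below recommend including in reports.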

Lessons Learned in the Field

I have been involved in network analysis projects in three different conflict-prone societies. Lessons learned include:

1) It takes time to identify all of the members of a system. Interviews or a snowball sample are best for identifying the most appropriate organizations.
2) It is best when more than one person from each organization answers the network survey. The researcher can average the scores to create one organizational score for each of the other organizations in the system. This minimizes subjectivity.
3) Certain types of organizations have a policy against participating in surveys. Legal groups are particularly difficult to recruit. An interview that asks the same questions as the network survey is one way to get reluctant members of the system to participate.
4) Some organizations figure out that the network tool is being used to allocate resources, and they may try to give their close partners better scores to help them out.
5) If network interviews/surveys are not possible due to security issues, the researcher can create a paper-based network tool that tracks the joint activities of NGOs and organizations. In some places, Web site links have been used as a proxy measure of relationships: more links mean more relationships.
6) A picture is worth a thousand words. One image generated by a network analysis provides valuable insight into a network. Images make great additions to reports.

Take Away

There are no real tools to measure intermediate results for the improved capacity of key sectors of the media environment. Network analysis is valuable because it can show the impact of project activities that bring groups together, foster improved communication and cooperation, and improve the capacity of a sector.

Survey Research

Background on the Method

Survey research allows an organization to measure the awareness, attitudes, and behaviors of a target population. Survey research is most reliable when it uses random sampling. Random sampling means that every person in a target population has an equal chance of being selected to participate in the survey. Random selection means that a researcher can study a small group yet generalize the findings to the larger population. There are a variety of survey types. Polls are short, quick surveys that seek to provide a picture of people's opinions. Surveys are longer, more in-depth, and tap into awareness, attitudes, and behaviors. Most surveys include closed-ended questions that force respondents to select a specific answer. The value of closed-ended survey questions is that the researcher can tabulate the results and draw conclusions. Some surveys include open-ended questions that ask respondents why they believe or act a certain way. Surveys that include open-ended questions are difficult to analyze because it takes a lot of time and expertise to sort many different answers into manageable categories.

Research Questions and IRs/Components that Surveys Can Answer

Surveys provide data that can address intermediate results such as "the public has increased access to professional media content" or "members are satisfied with organizational leadership and association activities." Surveys can also provide data for outcome indicators such as public perceptions, reach and citizen access to media, media reputation and influence, and media consumption patterns. The data can be used to better target programming and increase advertising revenue.

Steps to Conduct a Survey in a Conflict Environment

Surveys can target people within an organization (members, volunteers, employees) or external publics (citizens, end users, political or civil society leaders). The first steps are to identify the target public, identify an adequate sample size, and identify a sampling scheme to reach those people. In many conflict environments, demographic data can be difficult to obtain, and the research team may need to use a less scientific method to draw a sample. Once the target public and sample size have been identified, the researcher needs to create the actual survey. Survey questions need to address the indicators in the PMP (performance monitoring plan). Extra questions may be useful, but shorter surveys are better than longer ones. Survey takers need to be trained, and the survey must be reviewed multiple times to ensure that it actually measures what it seeks to measure. A pilot test is highly desirable and allows the research team to fine-tune the survey questions and sampling.

Intercept surveys allow interviewers to randomly select people who are walking in high-traffic areas. House-to-house surveys allow interviewers to cover different neighborhoods and ensure that all demographics are included in the research. Surveys provide quantitative data about awareness levels, attitudes, and behaviors. Audience surveys can provide information about viewing, listening, and reading preferences that can help media organizations develop more targeted content. Member surveys, such as those for media support organizations, can tell the leadership what the members want and thus increase the chances that dues-paying members gain desired benefits. Surveys can be conducted face to face, by phone or fax, and now via the Internet.

Using Surveys to Evaluate M&E Outcomes

Surveys require a medium to high amount of time and organizational resources. The data address M&E indicators such as public perceptions or member satisfaction. The method can also be used to gather audience data (share, rating, patterns) that media organizations can use to develop more desirable content and thus bring in more advertising revenue. Survey findings can be treated as an outcome indicator in M&E. Baseline data are especially important for detecting changes in patterns.

Issues to Consider When Using Survey Research in M&E

The most important issue to recognize is that survey research is expensive. Even in the poorest economies, it costs a lot of money to hire trained survey takers. A constant question in planning survey research is: how many people do we need to survey to get reliable results? Most Western-administered surveys are based on responses from participants who are carefully selected to represent different demographic groups. In the field, a random sample of 1,500 people may be improbable. Thus, the purpose of the survey will dictate how many randomly selected people need to be sampled. According to survey researchers, the "magic number" for a sample is based on the target population and the desired confidence level. If you are not breaking the data down by ethnicity or gender, then a random sample of about 400 people provides useful data to detect general trends (see the sketch below). Can a sample of fewer people be useful? Yes, as long as the findings are interpreted cautiously.

There are some basic principles to consider when planning a survey. A survey will take one month to develop. In many conflict environments, a government office or an officer in the security forces may need to approve the survey and issue a permit. Survey takers need at least 15 hours of training to understand the intent behind the questions and the sampling rules. In places where literacy levels are low, surveys will need to be read out loud, which adds extra time per survey. Survey data need to be entered accurately into a database using Excel or SPSS software. The data need to be screened and cleaned, and unusable surveys thrown out, before any analysis occurs. Finally, the survey findings need to be analyzed in a way that actually addresses the indicators in the PMP. If the PMP indicator asks for percentages, then percentages need to be reported. If the indicator requires a mean or average, then that is what needs to be reported. My past experience suggests that a survey will take a minimum of three months to develop, pilot test, conduct, and analyze.
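The "about 400 people" rule of thumb follows from the standard margin-of-error formula. As an illustration only, the Python sketch below computes the sample size needed at a 95 percent confidence level for a chosen margin of error, assuming maximum variability (p = 0.5); the margins shown are examples, not figures from the paper.

# Illustrative sample-size calculation using the standard margin-of-error formula,
# assuming a 95% confidence level (z = 1.96) and maximum variability (p = 0.5).
import math

def required_sample(margin_of_error, z=1.96, p=0.5):
    """Minimum simple random sample for a large population."""
    return math.ceil((z ** 2) * p * (1 - p) / margin_of_error ** 2)

for moe in (0.03, 0.05, 0.10):   # example margins of error
    print(f"Margin of error +/-{moe:.0%}: need about {required_sample(moe)} respondents")

With a plus-or-minus 5 percent margin this returns roughly 385 respondents, which is where the "about 400" figure comes from; for small target populations a finite-population correction would shrink the number further.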

Lessons Learned in the Field

I have participated in a dozen field surveys in difficult environments. Each time, I have had to make adjustments to the method. The number of lessons learned could fill a book; I have selected a few for this paper.

1) Surveys will take twice as long as planned and cost twice as much as expected. Translation and back-translation are time consuming and expensive.
2) An expert needs to set up the Excel or SPSS data file. The data file should be used to input the pilot test results. Modifications will also need to be made to the file and the survey after the pilot test.
3) All survey takers need to be trained, and all training workshops need to include an actual field component. Teaching random sampling in the classroom is different from teaching it in the field.
4) A supervisor needs to be actively involved in the survey process. Multiple steps need to be taken to ensure the integrity of the survey process.
5) Entrepreneurial organizations and individuals will subcontract out parts of the survey process. There should be clear clauses in any contract that forbid the subcontracting of survey work.
6) Standard survey questions that work well in the West fail in the field. Likert-type questions need an explanation. You cannot create mean/average scores on certain types of questions.
7) Young people are more willing to answer surveys and thus will skew the findings. Rural populations are often under-sampled. In some cultures, women will outnumber men among respondents, and in other cultures males will dominate the sample.
8) In conflict zones, survey takers need to work in pairs and be prepped for "what if" scenarios.
9) In traditional cultures, females should survey females and males should survey males.
10) If a large-scale survey is not feasible, then a small survey sample can be supplemented by triangulating the survey findings with another method such as a focus group or in-depth interviews. The qualitative data can add layers of context to the survey data.
11) Many programs will try to piggyback onto another survey. These surveys often do not actually occur, or the quality of the data is low.

12) Phone surveys and online surveys are very good for membership surveys. They are not good for public opinion surveys.
13) SMS text message surveys are now possible. Short questionnaires can be administered to targeted groups (those who have the phone capability). See souktel.org.

Take Away

Conducting a survey is one of the best ways to understand how the public is benefiting from media assistance programs. Careful planning is required to ensure that surveys provide reliable results.

Reflections on Monitoring and Evaluation Research Methods in Conflict Environments

This paper has argued that the M&E process might be better conceptualized as a process of inquiry that clearly articulates the research questions, methods, and types of evidence required to prove the impact of media assistance programs. The five aforementioned methods are based on an empirical view of research. Each method has strengths and limitations, and every conflict environment is subject to unique factors that will influence the nature of the media development program and the evaluation method used to measure results.

What Types of Questions Guide Media Assistance Program Evaluations?

The process of inquiry suggests that useful research incorporates three steps. It begins with clear research questions, identifies appropriate methods, and states in advance what data will count as evidence to answer the research questions. Media impact questions have generally been low-level questions about the number of people trained and the like. It is now time to move on to larger impact questions that provide multiple layers of data that can be used in planning and evaluating media programs.

Donor Responsibilities and Future Steps

RFAs often state what types of M&E indicators are required. Macro-level indicators such as the IREX Media Sustainability Index (MSI) or Freedom House's Freedom of the Press scores are often integrated into RFAs and thus become part of the proposal process. Large, sector-wide scores from the MSI or Freedom House are one part of an assessment of media development in a nation. But they are collected only once a year and often do not reflect the expected results of a short-term (12-18 month) media program. Additionally, there is wide variance in how donors fund M&E. If a donor wants a public perception survey, the total cost and time required for the research permit, training survey takers, conducting the survey, analyzing the data, and then using the findings for programmatic revisions may exceed three to five months and cost several thousand dollars (at minimum). If the donor wants evidence of sector-wide development, a network analysis is necessary. Unfunded mandates for M&E mean that field staff may take shortcuts and thus miss the intention of the donor's M&E mandate: to use data to evaluate impact and inform future program activities.

1) Donors have the right to require certain M&E activities, but they also have a responsibility to fund them adequately.

2) Donors have a role to play in standardizing M&E. If they push for the evolution of M&E, it will happen.

Considerations for Media Implementers

Data collection for M&E should be an organization-wide activity. The home office needs to devote personnel or financial resources to help the field offices collect relevant and useful information. An organization-wide plan would allow home offices to aggregate country-level data and draw regional or even global conclusions about the impact of their assistance efforts. RFAs may only ask for the collection of basic data (training numbers), but implementing organizations can go beyond what is required and develop measures that provide useful information to adjust programming and provide evidence of impact.

1) As the media sector in a conflict environment becomes more developed (perhaps after several projects), the anticipated impact of media assistance programs needs to be set at a higher level (capacity building, public perception, sustainability of media and media support organizations). Baselines set in earlier projects can be revisited and become valuable tools for tracking progress toward macro-level impacts.

2) Implementers need to develop organization-wide indicators and tools that can be quickly adapted for field use. Home offices need to be involved in all parts of M&E and treat new projects as an opportunity to train all program staff in the field to support research. Having a dedicated M&E officer is only the first step to systematizing M&E in the field. It takes a village to collect, analyze, and use M&E data for programmatic improvements.

3) Implementers should develop relationships with researchers and universities to create opportunities for collaboration.

Concluding Thoughts on the Evolution of Evaluation

After nearly 15 years in the media research business, I have seen significant progress in monitoring and evaluation. Fifteen years ago, media projects had little M&E and scrambled at the end of a project to find ways to summarize their results. Today, results frameworks are a mainstay of media assistance programs. The next step is for the media development sector to jointly agree on which methods and indicators are the most valuable ways of measuring program impact. The process of inquiry, with its focus on guiding questions and research methods, may provide the next step in the evolution of evaluation. The methods outlined in this paper provide a foundation for strengthening RFAs and the proposals that respond to them.

The paper began with a general question: What makes monitoring and evaluating the impact of media assistance programs difficult? The answer is that the media development sector has not developed the tools that it needs to plan, monitor, and adjust its assistance efforts. Creating a process of inquiry that makes monitoring and evaluation a core activity will provide donors and implementers with the conceptual and methodological tools needed to truly pinpoint impact and adjust programs when necessary.

At the heart of most media assistance programs is the belief that people across the world have the potential to lead improved lives if they have access to useful, reliable information. That information will allow them to make decisions that empower their communities. Yet most media assistance programs are under-resourced and rarely measure the impact that their activities have on the public. The future of monitoring and evaluation is to develop a holistic process that treats media development as a multi-layered process that trains journalists, assists non-state media organizations to become economically viable entities, develops the capacity of media support organizations, supports a legal environment that protects free speech, and serves the citizens of that nation. Multiple methods will allow implementers and donors to truly gauge the impact of such assistance programs.

References

Berelson, B. (1952). Content analysis in communication research. New York: Free Press.

Holsti, O. R. (1969). Content analysis for the social sciences and humanities. Reading, MA: Addison-Wesley.

Lunt, P., & Livingstone, S. (1996). Rethinking the focus group in media and communications research. Journal of Communication, 46(2).

Morgan, D. L. (1997). Focus groups as qualitative research. London: Sage.


VIDEO-ON-DEMAND. aamp. and the Consumer Advertising Experience. Summary Findings to Date of the Advanced Advertising Media Project VIDEO-ON-DEMAND and the Consumer Advertising Experience Summary Findings to Date of the Advanced Advertising Media Project aamp aamp AAMP MISSION AND MEMBERS The Advanced Advertising Media Project (AAMP)

More information

Manager, Supervisor & CEMA Skill Set Model County of Santa Clara

Manager, Supervisor & CEMA Skill Set Model County of Santa Clara Leads Innovation Leadership Styles Manages Change Models Integrity, Trust & Transparency Strategic Thinking and Planning Manager, Supervisor & CEMA Skill Set Model County of Santa Clara Conflict Management

More information

A Guide to Competencies and Behavior Based Interviewing. HR Toolkit

A Guide to Competencies and Behavior Based Interviewing. HR Toolkit A Guide to Competencies and Behavior Based Interviewing HR Toolkit 2015 Competency models help make transparent the skills an agency needs to be successful. Start by identifying competencies that predict

More information

DEVELOPING AN EVALUATION PLAN

DEVELOPING AN EVALUATION PLAN CHAPTER DEVELOPING AN EVALUATION PLAN 4 As program staff start planning for program development and begin addressing the components of the logic model, they should keep in mind that as the logic model

More information

Search Committee Process

Search Committee Process Search Committee Process 1. Obtain the committee s charge from the hiring official. Clarify issues such as: Role of the committee: selection of candidate or recommending finalists Budget Timeframe 2. Review

More information

The Impact of Corporate Social Responsibility on Consumers Attitudes at Northwestern Mutual: A Case Study

The Impact of Corporate Social Responsibility on Consumers Attitudes at Northwestern Mutual: A Case Study The Impact of Corporate Social Responsibility on Consumers Attitudes at Northwestern Mutual: Researchers: Kuhlman, Laura Lett, Kate Vornhagen, Shellie December 6, 2013 Marketing Research Kuhlman_A7 Executive

More information

Impact Evaluation Matters: Enhanced Learning Through Involving Stakeholders in Oxfam s Impact Studies evaluation Matters

Impact Evaluation Matters: Enhanced Learning Through Involving Stakeholders in Oxfam s Impact Studies evaluation Matters Peter Huisman, Rik Linssen, Anne Oudes Impact Evaluation Matters: Enhanced Learning Through Involving Stakeholders in Oxfam s Impact Studies evaluation Matters The World Citizens Panel (WCP) has been developed

More information

1 P a g e MAKING IT STICK. A guide to embedding evaluation

1 P a g e MAKING IT STICK. A guide to embedding evaluation 1 P a g e MAKING IT STICK A guide to embedding evaluation Table of Contents Page 3 Page 4 Page 5 Page 5 Page 6 Page 7 Page 8 Page 9 Page 10 Page 12 Page 15 Page 16 About this guide Why embed? How to use

More information

HOW TO DEVELOP A STRONG PROJECT DESIGN. User Guide #9

HOW TO DEVELOP A STRONG PROJECT DESIGN. User Guide #9 HOW TO DEVELOP A STRONG PROJECT DESIGN User Guide #9 1 This User Guide is intended to help applicants develop a strong project design as applicants prepare AmplifyChange grant applications. Readers are

More information

TECHNICAL NOTE. The Logical Framework

TECHNICAL NOTE. The Logical Framework NUMBER 2 VERSION 1.0 DEC 2012 Planning Series This document describes the rationale, theory, and essential elements of the LogFrame as it relates to USAID s Program Cycle Technical Notes are published

More information

Learning from using the Volunteer Impact Assessment Toolkit in NHS Scotland Guidance and Tips for new users March 2009

Learning from using the Volunteer Impact Assessment Toolkit in NHS Scotland Guidance and Tips for new users March 2009 Learning from using the Volunteer Impact Assessment Toolkit in NHS Scotland Guidance and Tips for new users March 2009 Volunteer Development Scotland, 2008 Contents INTRODUCTION 1 WHY SHOULD WE ASSESS

More information

Designing Volunteer Role Descriptions

Designing Volunteer Role Descriptions Designing Volunteer Role Descriptions A guide for Volunteer Managers to help design quality volunteer experiences within your team About Volunteer Role Descriptions Defining and managing the work of volunteers

More information

Building government capacity to implement market-based WASH

Building government capacity to implement market-based WASH Tactic Report Building government capacity to implement market-based WASH An innovative approach to training delivers scalable results BACKGROUND Vietnam has made great progress increasing WASH access

More information

Saville Consulting Assessment Suite

Saville Consulting Assessment Suite Saville Consulting Assessment Suite www.peoplecentric.co.nz info@peoplecentric.co.nz +64 9 963 5020 Overview Swift Aptitude Assessments (IA& SA)... 3 Analysis Aptitudes (IA)... 4 Professional Aptitudes

More information

Deputy Manager (Complex Needs) Islington Mental Health Services. Frontline Staff, Volunteers & Peer Mentors

Deputy Manager (Complex Needs) Islington Mental Health Services. Frontline Staff, Volunteers & Peer Mentors Post: Deputy Manager (Complex Needs) Delegated Authority Level 6 Team: Responsible to: Responsible for: Islington Mental Health Services Service Manager Frontline Staff, Volunteers & Peer Mentors Job Purpose

More information

Page 1 of 5 The director's role in strategy development. Article from: Directors & Boards Article date: September 22, 2000 Author: STEINBERG, RICHARD M. ; BROMILOW, CATHERINE L. Despite inherent difficulties

More information

Contract Award Procures a Promise; Supplier Performance Reviews Help Deliver the Promise

Contract Award Procures a Promise; Supplier Performance Reviews Help Deliver the Promise White Paper Contract Award Procures a Promise; Supplier Performance Reviews Help Deliver the Promise Peter Murrison, Supplier Performance Consultant Commerce Decisions Limited Executive summary Why customer-supplier

More information

Dimensions of the Role: Budget/ Asset Management The position holder will manage the annual communications budget of approximately BDT... only.

Dimensions of the Role: Budget/ Asset Management The position holder will manage the annual communications budget of approximately BDT... only. Job Description Position Communications Specialist-Aparajita Project Grade D1 Department & Location Reports to (position): Purpose: How does this post support Plan s strategy and mission? Bangladesh Country

More information

Design, Monitoring, & Evaluation Module

Design, Monitoring, & Evaluation Module CASP: THE COMMON APPROACH TO SPONSORSHIP-FUNDED PROGRAMMING Design, Monitoring, & Evaluation Module {Revised April 2007} This module presents the framework and tools for designing, monitoring and evaluating

More information

2016 Staff Climate Survey Results. Division of Marketing and Communication Report

2016 Staff Climate Survey Results. Division of Marketing and Communication Report Staff Climate Survey Results Division of Marketing and Communication Report In May, all, staff members were invited to participate in a Staff Climate Survey by the s Human Resources Department. Usable

More information

IPLAC CONFLICT ANALYSIS GUIDE

IPLAC CONFLICT ANALYSIS GUIDE IPLAC CONFLICT ANALYSIS GUIDE 2007 IPLAC Conflict Analysis Guide 2007 1 of 10 Conflict analysis can be understood as the systematic study of the profile, causes, actors, and dynamics of conflict. (IDRC

More information

Public Disclosure Authorized. Public Disclosure Authorized. Public Disclosure Authorized. Public Disclosure Authorized

Public Disclosure Authorized. Public Disclosure Authorized. Public Disclosure Authorized. Public Disclosure Authorized Public Disclosure Authorized Public Disclosure Authorized Public Disclosure Authorized Public Disclosure Authorized Impact Assessment Report Fiscal Year 2003 37489 Introduction... 3 Methodology... 3 Figure

More information

Human Rights and Digitalization Project in Tanzania. Baseline Survey. Terms of Reference. August, 2017

Human Rights and Digitalization Project in Tanzania. Baseline Survey. Terms of Reference. August, 2017 Human Rights and Digitalization Project in Tanzania Baseline Survey Terms of Reference August, 2017 Introduction Oxfam Country Strategy, 2015 2019, has a vision of an inclusive, equitable and just Tanzanian

More information

Learning from using the Volunteer Impact Assessment Toolkit in NHS Scotland. Guidance and Tips for new users March 2009

Learning from using the Volunteer Impact Assessment Toolkit in NHS Scotland. Guidance and Tips for new users March 2009 Learning from using the Volunteer Impact Assessment Toolkit in NHS Scotland Guidance and Tips for new users March 2009 Contents INTRODUCTION 2 WHY SHOULD WE ASSESS THE IMPACT OF VOLUNTEERING? 3 WILL IT

More information

The practice of constantly assessing personal knowledge and skills and following paths for improvement. have been done by

The practice of constantly assessing personal knowledge and skills and following paths for improvement. have been done by Selection Process Rubric PERSONAL DOMAIN Personal competencies refer to the qualities, characteristics and attitudes necessary to achieve personal and system goals Continuous Growth The practice of constantly

More information

QUICK GUIDE TO INTEGRATING PUBLIC-PRIVATE DIALOGUE SCOPING MISSION. Investment Climate l World Bank Group. In partnership with

QUICK GUIDE TO INTEGRATING PUBLIC-PRIVATE DIALOGUE SCOPING MISSION. Investment Climate l World Bank Group. In partnership with QUICK GUIDE TO INTEGRATING PUBLIC-PRIVATE DIALOGUE SCOPING MISSION Investment Climate l World Bank Group In partnership with 1 SCOPING MISSION This Quick Guide will help with: Diagnosing the potential

More information

C-18: Checklist for Assessing USAID Evaluation Reports

C-18: Checklist for Assessing USAID Evaluation Reports C-18: Checklist for Assessing USAID Evaluation Reports Checklist for Reviewing Evaluation Reports High quality, evidence-based evaluation reports with a clear focus on decision-making for USAID and other

More information

Guidance Note 3 Introduction to Mixed Methods in Impact Evaluation. Michael Bamberger Independent Consultant

Guidance Note 3 Introduction to Mixed Methods in Impact Evaluation. Michael Bamberger Independent Consultant Guidance Note 3 Introduction to Mixed Methods in Impact Evaluation Michael Bamberger Independent Consultant Outline 2 A. Why mixed methods (MM)? B. Four decisions for designing a MM evaluation C. Using

More information

New Vision for Agriculture Country Partnership Guide (CPG) Toolkit Secretariat Structures

New Vision for Agriculture Country Partnership Guide (CPG) Toolkit Secretariat Structures New Vision for Agriculture Country Partnership Guide (CPG) Toolkit Secretariat Structures For more information please contact: Tania Strauss, Head, New Vision for Agriculture Initiative tania.strauss@weforum.org

More information

1 Survey of Cohort Mentors: Gender-Based Analyses August 2013

1 Survey of Cohort Mentors: Gender-Based Analyses August 2013 1 Survey of Cohort Mentors: Gender-Based Analyses August 2013 Sample Seventeen mentors completed the survey from an overall population sample of 32 mentors. Thus, this survey had a response rate of 53.1%.

More information

Education Liaison: The Performance Evaluation Process (PEP)

Education Liaison: The Performance Evaluation Process (PEP) Education Liaison: The Performance Evaluation Process (PEP) The Performance Evaluation Process (PEP) for an Education Liaison is intended to provide an employee with valuable insight into their job performance,

More information

2016 Staff Climate Survey Results. College of Agriculture and Life Sciences Report

2016 Staff Climate Survey Results. College of Agriculture and Life Sciences Report Staff Climate Survey Results College of Agriculture and Life Sciences Report In May, all, staff members were invited to participate in a Staff Climate Survey by the s Human Resources Department. Usable

More information

Positions required: A team of consultancy with one Lead Consultant and one Qualitative Research Assistant

Positions required: A team of consultancy with one Lead Consultant and one Qualitative Research Assistant Terms of Reference for Gender Transformative and Responsible Agribusiness Investments in South East Asia II (GRAISEA 2) Baseline Study in Vietnam: Farmer Survey and Focus Group Discussion Positions required:

More information

2016 Staff Climate Survey Results. VP of Research Report

2016 Staff Climate Survey Results. VP of Research Report Staff Climate Survey Results VP of Research Report In May, all, staff members were invited to participate in a Staff Climate Survey by the s Human Resources Department. Usable responses were gathered from,

More information

Doomed from the Start?

Doomed from the Start? Doomed from the Start? Why a Majority of Business and IT Teams Anticipate Their Software Development Projects Will Fail Winter 2010/2011 Industry Survey 2011. Geneca LLC. All Rights Reserved. Why a Majority

More information

International Program for Development Evaluation Training (IPDET)

International Program for Development Evaluation Training (IPDET) The World Bank Group Carleton University IOB/Ministry of Foreign Affairs, Netherlands International Program for Development Evaluation Training (IPDET) Building Skills to Evaluate Development Interventions

More information

COMMUNITY HEALTH SCIENCES

COMMUNITY HEALTH SCIENCES COMMUNITY HEALTH SCIENCES In addition to the school-wide competencies, for students pursuing the MPH degree in Community Health Sciences, the following competencies apply: Design and develop approaches

More information

Performance Skills Leader. Individual Feedback Report

Performance Skills Leader. Individual Feedback Report Performance Skills Leader Individual Feedback Report Jon Sample Date Printed: /6/ Introduction REPORT OVERVIEW Recently, you completed the PS Leader assessment. You may recall that you were asked to provide

More information

2016 Staff Climate Survey Results. University Libraries Report

2016 Staff Climate Survey Results. University Libraries Report Staff Climate Survey Results University Libraries Report In May, all, staff members were invited to participate in a Staff Climate Survey by the s Human Resources Department. Usable responses were gathered

More information

LEARNING BRIEF I. INTRODUCTION PREPARED BY: KMSS, CARITAS AUSTRALIA, CAFOD, CRS, TRÓCAIRE

LEARNING BRIEF I. INTRODUCTION PREPARED BY: KMSS, CARITAS AUSTRALIA, CAFOD, CRS, TRÓCAIRE LEARNING BRIEF Karuna Mission Social Solidarity: How collaboration among Caritas Internationalis member organizations led to improved institutional development and capacity strengthening PREPARED BY: KMSS,

More information

PUBLIC RELATIONS Guide for RE/MAX Offices and Agents

PUBLIC RELATIONS Guide for RE/MAX Offices and Agents PUBLIC RELATIONS Guide for RE/MAX Offices and Agents WHAT EXACTLY IS PUBLIC RELATIONS? PUB LIC RE LA TIONS The name sums it up. Simply put, Public Relations (PR) is the management of the communication

More information

Managing Change. By Ann McDonald. to the success of a business. Companies most likely to be successful in making change work to their

Managing Change. By Ann McDonald. to the success of a business. Companies most likely to be successful in making change work to their Managing Change By Ann McDonald The one consistency in the travel industry is change and being able to manage change is crucial to the success of a business. Companies most likely to be successful in making

More information

Retaining Employers for Work Placement Students

Retaining Employers for Work Placement Students Retaining Employers for Work Placement Students Action Research Project by Jennifer Boyce Introduction The purpose of this research is to establish why employers very rarely continue to participate in

More information

Comprehensive Organizational Health Assessment

Comprehensive Organizational Health Assessment Comprehensive Organizational Health Assessment Presented by: Robin Leake, Ph.D Director of Research and Evaluation Butler Institute for Families Paul Fritzler District Manager Department of Family Services,

More information

2016 Staff Climate Survey Results. College of Veterinary Medicine and Biomedical Sciences Report

2016 Staff Climate Survey Results. College of Veterinary Medicine and Biomedical Sciences Report Staff Climate Survey Results College of Veterinary Medicine and Biomedical Sciences Report In May, all, staff members were invited to participate in a Staff Climate Survey by the s Human Resources Department.

More information

Subject: Request for Quotations for: Nationwide household survey in Libya

Subject: Request for Quotations for: Nationwide household survey in Libya Date: January 30, 2019 Ref.: RFQ/19/032 Subject: Request for Quotations for: Nationwide household survey in Libya The International Foundation for Electoral Systems (IFES) invites your firm to participate

More information

Getting it Right: Promising Practices for Financial Capability Programs

Getting it Right: Promising Practices for Financial Capability Programs Getting it Right: Promising Practices for Financial Capability Programs $ A learning series from the Financial Capability Demonstration Project THE FINANCIAL CAPABILITY APPROACH In the post-recession economy,

More information

Impact Evaluation. Some Highlights from The Toolkit For The Evaluation of Financial Capability Programs in LMIC

Impact Evaluation. Some Highlights from The Toolkit For The Evaluation of Financial Capability Programs in LMIC Impact Evaluation Some Highlights from The Toolkit For The Evaluation of Financial Capability Programs in LMIC World Bank Dissemination Workshop, New Delhi March 2013 What is the Toolkit? Russia Trust

More information

UNIVERSITY OF TEXAS AT AUSTIN EMPLOYEE ENGAGEMENT GUIDE

UNIVERSITY OF TEXAS AT AUSTIN EMPLOYEE ENGAGEMENT GUIDE UNIVERSITY OF TEXAS AT AUSTIN EMPLOYEE ENGAGEMENT GUIDE WHAT IS THE SURVEY OF EMPLOYEE ENGAGEMENT? The Survey of Employee Engagement facilitated by the UT Austin Institute for Organizational Excellence,

More information

UVM Extension Workforce Diversity Recruiting and Retention Plan: 2012

UVM Extension Workforce Diversity Recruiting and Retention Plan: 2012 UVM Extension Workforce Diversity Recruiting and Retention Plan: 2012 Summary All Search Committee Chairs and members are provided a copy of this document and the University Diversity Workforce Recruitment

More information

Support Material 4.8b. MODULE 4.8 Developing Policy for Early Childhood. BASIC TEXT Background Reading for the Facilitator

Support Material 4.8b. MODULE 4.8 Developing Policy for Early Childhood. BASIC TEXT Background Reading for the Facilitator Support Material 4.8b MODULE 4.8 Developing Policy for Early Childhood BASIC TEXT Background Reading for the Facilitator Steps in Policy Development No matter what the impetus for change, policy-making

More information

Who Are My Best Customers?

Who Are My Best Customers? Technical report Who Are My Best Customers? Using SPSS to get greater value from your customer database Table of contents Introduction..............................................................2 Exploring

More information