

This product is part of the RAND Corporation technical report series. Reports may include research findings on a specific topic that is limited in scope; present discussions of the methodology employed in research; provide literature reviews, survey instruments, modeling exercises, guidelines for practitioners and research professionals, and supporting documentation; or deliver preliminary findings. All RAND reports undergo rigorous peer review to ensure that they meet high standards for research quality and objectivity.

How Successful Are U.S. Efforts to Build Capacity in Developing Countries? A Framework to Assess the Global Train and Equip "1206" Program

Jennifer D. P. Moroney, Beth Grill, Joe Hogler, Lianne Kennedy-Boudali, Christopher Paul

Prepared for the Office of the Secretary of Defense
Approved for public release; distribution unlimited

NATIONAL DEFENSE RESEARCH INSTITUTE

The research described in this report was prepared for the Office of the Secretary of Defense (OSD). The research was conducted within the RAND National Defense Research Institute, a federally funded research and development center sponsored by OSD, the Joint Staff, the Unified Combatant Commands, the Navy, the Marine Corps, the defense agencies, and the defense Intelligence Community under Contract W74V8H-06-C.

Library of Congress Cataloging-in-Publication Data
How successful are U.S. efforts to build capacity in developing countries? : a framework to assess the Global Train and Equip "1206" Program / Jennifer D. P. Moroney ... [et al.].
p. cm. Includes bibliographical references.
ISBN (pbk. : alk. paper)
1. Security Assistance Program--Evaluation. 2. Military assistance, American--Foreign countries--Evaluation. 3. United States--Armed Forces--Stability operations--Evaluation. 4. Terrorism--Prevention--International cooperation--Evaluation. 5. Combined operations (Military science)--Evaluation. 6. United States--Military relations--Foreign countries. 7. United States--Military policy--Evaluation. I. Moroney, Jennifer D. P. II. International Security and Defense Policy Center. III. Rand Corporation. IV. United States. Office of the Under Secretary of Defense for Policy. V. Title: Framework to assess the Global Train and Equip "1206" Program.

The RAND Corporation is a nonprofit institution that helps improve policy and decisionmaking through research and analysis. RAND's publications do not necessarily reflect the opinions of its research clients and sponsors. R is a registered trademark.

Copyright 2011 RAND Corporation

Permission is given to duplicate this document for personal use only, as long as it is unaltered and complete. Copies may not be duplicated for commercial purposes. Unauthorized posting of RAND documents to a non-RAND website is prohibited. RAND documents are protected under copyright law. For information on reprint and linking permissions, please visit the RAND permissions page.

Published 2011 by the RAND Corporation
1776 Main Street, P.O. Box 2138, Santa Monica, CA; South Hayes Street, Arlington, VA; Fifth Avenue, Suite 600, Pittsburgh, PA

To order RAND documents or to obtain additional information, contact Distribution Services: Telephone: (310); Fax: (310); order@rand.org

Preface

The U.S. government has long worked with allies and partners in a security cooperation context, providing various means through which to build their capacities to counter threats. Yet, it is difficult to comprehensively assess how these activities contribute to U.S. objectives. Assessing the effect of security cooperation efforts is extremely important. Security cooperation assessments support informed decisionmaking at the policy, program manager, and execution levels, and they provide stakeholders with the necessary tools to determine which aspects of these investments are most productive and which areas require refinement.

This report provides a framework for thinking about, planning for, and implementing security cooperation assessments for the Global Train and Equip "1206" Program, managed by the Office of the Under Secretary of Defense for Policy. The program has its origins in Section 1206 of the National Defense Authorization Act for Fiscal Year 2006, which authorizes the Secretary of Defense, with the concurrence of the Secretary of State, to spend up to $350 million each year to train and equip foreign military and nonmilitary maritime forces to conduct counterterrorism operations and to build the capacity of a foreign nation's military forces to enable it to participate in or support military and stability operations in which U.S. armed forces are involved.

This report documents research performed between July 2010 and January 2011 for a study titled "Identifying Resources to Implement an Assessment Framework for the Global Train and Equip 1206 Program." The findings should be of interest to agencies, individuals, and offices in the U.S. Department of Defense and U.S. Department of State with roles in guiding and overseeing international security cooperation programs and activities, as well as to those who are directly involved with the planning and implementation of these efforts.

This research was sponsored by the Office of the Under Secretary of Defense for Policy and conducted within the International Security and Defense Policy Center of the RAND National Defense Research Institute, a federally funded research and development center sponsored by the Office of the Secretary of Defense, the Joint Staff, the Unified Combatant Commands, the Navy, the Marine Corps, the defense agencies, and the defense Intelligence Community. For more information on the International Security and Defense Policy Center, see the center's web page or contact the director (contact information is provided on the web page).


Contents

Preface ... iii
Figures ... ix
Tables ... xi
Summary ... xiii
Acknowledgments ... xix
Abbreviations ... xxi

CHAPTER ONE
Introduction ... 1
  The Global Train and Equip 1206 Program: An Overview ... 2
  The Importance of Assessments ... 3
  Study Objectives and Analytical Approach ... 4
  Organization of This Report ... 6

CHAPTER TWO
RAND's Assessment Framework ... 7
  What Is Assessment? ... 7
  Why Assess? ... 7
  Challenges to 1206 Program Assessment ... 8
  Determining Causality ... 8
  Well-Articulated Intermediate Goals to Inform Decisionmaking ... 8
  Assessment Capabilities of 1206 Program Stakeholders ... 9
  Multiplicity of and Differing Priorities of Stakeholders ... 9
  Security Cooperation Data Tracking Systems Are Not Currently Organized for 1206 Program Assessment ... 9
  Delegating Assessment Responsibilities ... 10
  Expectations and Preconceived Notions of Assessment ... 10
  Levels of Assessment: The Hierarchy of Evaluation ... 10
  Level 1: Assessment of Need for the Program ... 11
  Level 2: Assessment of Design and Theory ... 12
  Level 3: Assessment of Process and Implementation ... 13
  Level 4: Assessment of Outcome and Impact ... 13
  Level 5: Assessment of Cost-Effectiveness ... 14
  Hierarchy and Nesting ... 14

  Key Insights and the Assessment Framework ... 15
  Assessment and Objectives ... 15
  Assessment Roles and Responsibilities ... 15
  Data Collection and Reporting ... 17
  Conclusion ... 17

CHAPTER THREE
Exploring the Options for 1206 Program Assessment: Discussions with Key Stakeholders ... 19
  Stakeholder Interviews ... 19
  Stakeholder Perceptions of the 1206 Program
  Short Timeline Allows Faster Impact but Creates Administrative Difficulties
  Lack of Funding for Maintenance and Support Creates Concerns About Sustainability ... 21
  The Project Design, Selection, Prioritization, and Assessment Processes Are Somewhat Unclear
  Interagency Cooperation Has Improved, but Data Sharing Is Limited
  Stakeholder Observations Relative to Insights of Assessment
  Guidance on Assessment Is Needed
  Measurable Objectives Are Required on Multiple Levels
  Program Assessment Roles and Responsibilities Will Require Clarification ... 25
  New Data Collection and Reporting Requirements Are Needed
  Improving Coordination with Key Agencies Is Critical
  Conclusion

CHAPTER FOUR
Applying the Framework to the 1206 Program: Results of the Survey
  Survey of 1206 Program Experts
  Constructing the Survey
  Program Guidance
  Survey Respondents
  Findings and Observations ... 31
  Formal Guidance on the Assessment Process Is Needed ... 31
  Measurable Objectives That Explicitly Connect to Broader U.S. Government, Theater, Regional, and 1206 Project Goals Are Lacking ... 32
  Gaps Exist in Data Collection and Reporting Requirements
  Program Assessment Roles and Responsibilities Are Currently Unclear
  Limited Insight into Other DoD Programs May Hinder Cost-Effectiveness Assessments
  Respondents Were Not Always Aware of Guidance ... 37
  Stakeholders Create Their Own Guidance in the Absence of Formal Guidance ... 37
  Assessment Skills Exist but Could Be Improved ... 37
  Conclusions

CHAPTER FIVE
Conclusions and Recommendations ... 39
  Findings
  Recommendations ... 41
  Recommendation 1: Develop an Implementation Plan for 1206 Program Assessments ... 41

  Recommendation 2: Develop and Disseminate Life-Cycle Guidance for Project Development, Implementation, and Assessment ... 41
  Recommendation 3: Ensure That 1206 Program Stakeholders Understand Their Assessment Roles ... 41
  Recommendation 4: Establish a Process for Setting Objectives at the Project and Activity Levels ... 41
  Recommendation 5: Systematically Collect Data Regarding Achievement of Objectives and Consider Using Focus Groups to Develop Metrics
  Recommendation 6: Identify Data Collectors Based on Their Proximity to the Action
  Recommendation 7: Host an Annual Conference of 1206 Program Stakeholders at the Policymaking Level
  Recommendation 8: Consider Developing an Automated Tool to Facilitate the Collection and Reporting of Assessment Data
  Recommendation 9: Consider Hiring Outside Support or Appointing Career Staff to Be Program Representatives at Each COCOM
  The Implementation Plan
  Track 1: Near-Term Actions
  Track 2: Long-Term Actions ... 45
  Rollout of the Assessment Implementation Plan
  Conclusion

APPENDIXES
A. The Assessment Survey ... 47
B. Focused Discussions ... 65

Bibliography ... 67


Figures

S.1. Five Levels: The Assessment Hierarchy ... xv
S.2. Track 1 of the Assessment Implementation Plan ... xvii
S.3. Track 2 of the Assessment Implementation Plan ... xviii
S.4. Proposed Rollout for the Assessment Implementation Plan ... xviii
1.1. Five Levels: The Assessment Hierarchy
Data Sources Identified for Process and Implementation Assessments
Data Sources Identified for Outcome Assessments ... 35


Tables

4.1. Types of 1206 Program Data That Can Be Collected, by Topic
Disconnects Between Setting Objectives and Program Design
Disconnects Between Setting Objectives and Collecting Data on Their Achievement
Types of Assessment Data Collected by Survey Respondents ... 33


Summary

The U.S. government has long worked with allies and partners to build their capacities to counter threats through various means, including training, equipping, and exercising, as well as through relationship-building activities such as workshops and conferences, staff talks, and education. Yet, it is challenging to comprehensively assess exactly how these activities have contributed to U.S. objectives. Security cooperation activities are long-term and geographically dispersed, and there is currently no comprehensive framework for assessing these programs. Assessing the impact of security cooperation efforts is inherently difficult but extremely important. In short, security cooperation assessments support informed decisionmaking at the policy, program manager, and execution levels, and they provide stakeholders at all levels of government with effective tools to determine which aspects of these investments are most productive and which areas require refinement.

Those who plan and execute security cooperation efforts intuitively know whether their individual programs have successfully gained ground with their respective partner nations. At the most basic level, officials assert that the U.S.-partner nation relationship is simply better than it was prior to executing the activity. These assertions are difficult to validate empirically, however. At present, assessments of security cooperation programs, if they are done at all, are largely conducted by the same organizations that executed the activities. Thus, these assessments, no matter how carefully carried out, are subject to concerns about bias on the part of the assessors. Objective assessments, when available, provide valuable data on which meaningful discussions about program funding can be grounded.

This report provides a framework for thinking about, planning for, and implementing security cooperation assessments for the 1206 Program, managed by the Office of the Under Secretary of Defense for Policy (OUSD[P]). It argues that such assessments provide the necessary justification for continuing, expanding, altering, or cutting back on existing programs. Without insight into the successes and weaknesses of security cooperation programs, it is impossible to make informed decisions about how best to allocate the resources available for their use.

The Global Train and Equip 1206 Program

The National Defense Authorization Act for Fiscal Year 2006 established the authority for the 1206 Program. Section 1206 of that legislation authorizes the Secretary of Defense, with the concurrence of the Secretary of State, to spend up to $350 million each year to train and equip foreign military and nonmilitary maritime forces to conduct counterterrorism operations and to enable the foreign partner to participate in or support military and stability operations in which U.S. armed forces are involved.

This authority is set to expire at the end of fiscal year 2011, but Congress may renew Section 1206, as it has in the past.

The 1206 Program enables the U.S. Department of Defense (DoD) to conduct capacity-building activities focused on counterterrorism and stability operations with foreign military partners. By law, Section 1206 authority has two discrete uses:

- to build the capacity of a foreign country's national military forces to enable that country to conduct counterterrorism operations or participate in or support military and stability operations in which U.S. armed forces are involved
- to build the capacity of a foreign country's maritime security forces to conduct counterterrorism operations.

Study Approach

RAND was asked by the Office of the Assistant Secretary of Defense for Special Operations/Low-Intensity Conflict and Interdependent Capabilities (OASD[SO/LIC&IC]) in OUSD(P) for assistance in identifying key stakeholders, their roles, and sources of data in support of a comprehensive assessment of the 1206 Program. RAND was also asked to develop the key elements of an implementation plan that would allow the repeated assessment of the program's outcomes and cost-effectiveness.

The RAND study team adopted a multistep approach to achieving these study objectives. We first focused on identifying current roles, data sources, and ongoing assessment processes through a series of discussions with key policymakers, legislators, and project implementers in the field. We then developed and deployed a survey designed to gather information on the roles, processes, and responsibilities of stakeholder organizations in the 1206 Program, as well as to elicit information regarding assessment guidance and skills. Next, we analyzed the survey results to determine which stakeholders were best suited for the collection of data in support of 1206 Program assessments and which stakeholders could potentially conduct such assessments. We then combined our findings from the survey analysis and the interviews and presented them to the sponsor. Based on the survey findings, we developed recommendations and key elements of an assessment implementation plan.

This report lays the groundwork for a comprehensive assessment of the 1206 Program rather than for various snapshot-in-time assessments of specific 1206 projects around the world. It takes a longer-term view, with the understanding that accurate assessments, especially those focused on outcomes and cost-effectiveness, require time and effort.

Assessment Framework

We based our analysis of the 1206 Program on five assessment levels:

- Level 1: need for the program
- Level 2: design and theory
- Level 3: process and implementation
- Level 4: outcome and impact
- Level 5: cost-effectiveness (relative to other, similar programs)

(The study did not include a cost-effectiveness assessment, which is a measure of relative benefit based on cost and requires comparison with similar programs. We did not have access to the required budgetary information to carry out such an analysis.)

We think of these five levels, integral to the assessment framework, as a hierarchy, depicted graphically in Figure S.1. In this hierarchy, a positive assessment at a higher level relies on positive assessments at the lower levels. By positive, we mean that the assessment reveals that associated objectives are being met. Accordingly, problems at a higher level of the hierarchy link to problems at lower levels of the hierarchy. For example, if a cost-effectiveness assessment reveals a problem, one can examine information from the lower levels of the hierarchy to fully understand the root cause of that problem.

Interviews and Survey Approach and Findings

Chapters Three and Four discuss the approach and findings from our research effort, which included 14 interviews with 1206 Program stakeholders in DoD and the U.S. Department of State (DoS) and staff from key congressional committees. Interview questions helped the team identify current roles in the 1206 Program, as well as data sources and existing assessment processes for the program.

We also conducted an online survey of stakeholder representatives to gain specific insights into current and potential assessment capabilities. Fifty-six of the 136 individuals asked to take the survey responded, a response rate of approximately 40 percent. Survey questions were grouped into four broad areas derived from the assessment hierarchy presented in Figure S.1: process implementation, process design and development, program recommendations, and program decisions.

Figure S.1. Five Levels: The Assessment Hierarchy. The hierarchy runs, from bottom to top: Level 1, assessment of need for the program; Level 2, assessment of design and theory; Level 3, assessment of process and implementation; Level 4, assessment of outcome/impact; and Level 5, assessment of cost-effectiveness.
SOURCE: Adapted from Peter H. Rossi, Mark W. Lipsey, and Howard E. Freeman, Evaluation: A Systematic Approach, 7th ed., Thousand Oaks, Calif.: Sage Publications, 2004, Exhibit 3-C. Used with permission.
NOTE: For a detailed discussion of the assessment hierarchy, see Christopher Paul, Harry J. Thie, Elaine Reardon, Deanna Weber Prine, and Laurence Smallman, Implementing and Evaluating an Innovative Approach to Simulation Training Acquisition, Santa Monica, Calif.: RAND Corporation, MG-442-OSD.

The team was able to connect each respondent's answers to possible assessment roles.

The interviews and survey results clearly indicate that the 1206 Program stakeholder community is in favor of instituting an assessment framework. Such a framework should include specific guidance on assessments, measurable objectives at the project level (in particular), and data collection and data reporting processes. It is clear from the results that much data are currently being collected, but they are not reaching policymakers in a systematic way. Interview participants also mentioned the need to improve coordination between DoD and DoS at all levels.

Key Findings, Recommendations, and the Assessment Implementation Plan

This report offers five key findings and nine recommendations, as well as some thoughts on an assessment implementation plan.

Findings

Our analysis of the survey data revealed the following key findings:

- There is a lack of formal guidance on the assessment process for the 1206 Program. Such guidance would help ensure that all 1206 stakeholders understand the importance of assessing the program in a comprehensive way.
- Measurable objectives that explicitly connect to broader U.S. government, theater, regional, and 1206 Program goals are currently lacking for the 1206 Program.
- Gaps seem to exist in the data collection and reporting requirements process. Data are not reaching stakeholders charged with conducting assessments.
- Assessment roles and responsibilities for 1206 Program stakeholders are currently unclear and unassigned.
- Coordination among key stakeholders, in both DoD and other agencies (namely, DoS and key congressional committees), could be improved.

Recommendations

Our analysis of the survey data resulted in the following recommendations:

- Develop an implementation plan for 1206 Program assessments.
- Develop and disseminate life-cycle guidance for project development, implementation, and assessment.
- Ensure that 1206 Program stakeholders understand their assessment roles.
- Establish a process for setting objectives at the project and activity levels.
- Systematically collect data on the achievement of objectives. Consider using focus groups to develop metrics.
- Identify data collectors based on their proximity to the action.
- Host an annual conference of 1206 Program stakeholders at the policymaking level.
- Consider developing an automated tool to facilitate the collection and reporting of assessment data.
- Consider hiring outside support or appointing career staff members to be program representatives for the 1206 Program at each of the respective combatant commands (COCOMs).

Assessment Implementation Plan

Our assessment implementation plan is divided into two tracks (near-term actions and longer-term actions), but several of the activities can be undertaken simultaneously as resources and timing allow. Track 1, near-term actions, detailed in Figure S.2, offers relatively low-cost steps that can be implemented in the short term. In contrast, track 2 comprises some longer-term, potentially more costly and time-consuming steps, as shown in Figure S.3.

A proposed rollout plan is presented in Figure S.4. The rollout plan provides notional timelines for instituting track 1 and track 2 of the assessment implementation plan. Phase 1 of the rollout focuses on internal changes within OUSD(P) that can be accomplished relatively quickly. Phase 2 involves other stakeholders in DoD and DoS and outreach to key congressional committees; it centers on a key meeting with senior representatives from stakeholder organizations. Phase 3 of the rollout involves the execution of the implementation plan.

Figure S.2. Track 1 of the Assessment Implementation Plan

1. Provide formal guidance on assessment
   - Identify sources of objectives and who sets those objectives
   - Describe the assessment process, step-by-step
   - Identify timelines for conducting assessments
2. Establish 1206 assessment roles and responsibilities
   - Identify specific roles within the program
   - Consider specialized training
   - Obtain partner-nation input and feedback
3. Set data collection and reporting requirements
   - Identify specific offices responsible for collecting certain types of data, based on functions currently performed
   - Ensure that stakeholders know which data to collect
   - Establish routine reporting requirements with standardized formats and timelines
   - Avoid redundancy, unnecessary data collection, etc., recognizing that this is an additional duty
4. Improve coordination with key agencies
   - Seek input from all stakeholders early on
   - Internal: OUSD(P), DSCA, services, COCOMs
   - External: DoS, congressional staffs, partner nations

NOTE: DSCA = Defense Security Cooperation Agency.

Figure S.3. Track 2 of the Assessment Implementation Plan

1. Set measurable objectives that explicitly connect to broader U.S. government, theater, regional, and 1206 Program goals
   - Consider both the overall program level and the project level
   - Explicitly connect the projects to counterterrorism or building partnership capacity goals, or both
   - Ensure connections to higher-level 1206 Program guidance
   - Define the process for setting objectives
   - Consider drawing from the U.S. Pacific Command model
   - Ensure that objectives are clearly measurable
   - Focus on the longer-term outcomes that cannot be measured in one year
2. Refine roles for data collection and analysis based on track 1 lessons
   - Institute a consistent but flexible process
3. Implement an automated system for data collection and assessment, following on from the current RAND study (the survey instrument)
   - Identify potential off-the-shelf automated tools
   - The automated tool should be modular to include results of multiple projects
   - Consider asking the Office of the Deputy Assistant Secretary of Defense for Partnership Strategy and Stability Operations to develop such an automated tool
   - Conduct a pilot test of the automated data collection tool

Figure S.4. Proposed Rollout for the Assessment Implementation Plan

Phase 1: Office of the Secretary of Defense internal (1-6 months)
   - Consolidate existing guidance, prepare assessment implementation plan
   - Issue assessment implementation plan
   - Define the process for setting measurable objectives and data collection; assign initial roles and responsibilities at HQ and in the field (COCOMs)
   - Begin development of the automated data collection tool, perhaps with the support of a new assessment office in the Office of the Deputy Assistant Secretary of Defense for Partnership Strategy and Stability Operations

Phase 2: Involve other stakeholders (6-12 months)
   - Hold a strategic-level meeting on 1206, to include OUSD(P), DSCA, DoS, COCOM J4 (logistics)/J5 (strategic plans and policy)/J8 (force structure, resources, and assessment), and the services at the GS-15/colonel level, to complement the execution-level meetings run by DSCA
   - Socialize the assessment concept, coordinate guidance, solicit feedback
   - Identify a strategic-level assessment process for 1206 (cost-effectiveness level)
   - Share the assessment implementation plan with key congressional committees
   - Confirm and finalize assigned stakeholder roles
   - Refine the automated tool

Phase 3: Implementation (12+ months)
   - Begin execution of the assessment implementation plan
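Track 2, step 3 calls for a modular automated tool to collect and roll up assessment data across projects. Purely as an illustration of what such a record might look like, the sketch below shows one way a per-project assessment entry could be structured and summarized; the field names, the ProjectAssessment class, and the use of Python are assumptions made for this sketch, not features of any existing 1206 Program or DoD system.

```python
# Illustrative sketch only: a minimal per-project assessment record for the kind
# of modular data collection tool recommended in track 2. All names and fields
# are hypothetical.
from dataclasses import dataclass, field
from typing import Dict, List

ASSESSMENT_LEVELS = [
    "need",               # Level 1: need for the program
    "design_theory",      # Level 2: design and theory
    "process",            # Level 3: process and implementation
    "outcome_impact",     # Level 4: outcome and impact
    "cost_effectiveness", # Level 5: cost-effectiveness
]

@dataclass
class ProjectAssessment:
    project_name: str
    country: str
    fiscal_year: int
    objectives: List[str] = field(default_factory=list)
    # One entry per assessed level: True = objectives being met,
    # False = problem identified. Levels not yet assessed are simply absent.
    level_ratings: Dict[str, bool] = field(default_factory=dict)
    data_sources: List[str] = field(default_factory=list)

def summarize(projects: List[ProjectAssessment]) -> Dict[str, int]:
    """Count, for each assessment level, how many projects report a problem."""
    problems = {level: 0 for level in ASSESSMENT_LEVELS}
    for project in projects:
        for level, objectives_met in project.level_ratings.items():
            if objectives_met is False:
                problems[level] += 1
    return problems

# Example usage with a hypothetical project:
example = ProjectAssessment(
    project_name="Maritime counterterrorism capability (illustrative)",
    country="Partner nation X",
    fiscal_year=2011,
    objectives=["Field a maritime interdiction capability"],
    level_ratings={"need": True, "design_theory": True, "process": False},
    data_sources=["Security assistance officer reporting", "COCOM IPR notes"],
)
print(summarize([example]))
```

A record along these lines would be modular in the sense the figure intends: each project contributes the same small set of fields, so results from many projects can be aggregated without redesigning the collection format.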

Acknowledgments

The authors are grateful for the support of many individuals over the course of this study effort. First, we would like to express our sincere thanks for the terrific support from our study sponsors in OASD(SO/LIC&IC), specifically Christel Fonzo-Eberhard, Shannon Culbertson, LTC John Gossart, and Ashley Pixton. We also thank the officials at DSCA, the COCOMs, service headquarters, and DoS and the senior congressional staffers who provided valuable insights that greatly improved the content of this report. The study team also benefited from the thoughtful and careful reviews provided by Alan Stolberg at the Army War College and Michael Shurkin at RAND. Finally, we thank Michael Neumann for his structural feedback and Melissa McNulty and Cassandra Tate for their administrative support. Any errors are solely the responsibility of the authors.


Abbreviations

AOR: area of responsibility
COCOM: combatant command
DoD: U.S. Department of Defense
DoDI: U.S. Department of Defense instruction
DoS: U.S. Department of State
DSCA: Defense Security Cooperation Agency
FAA: Foreign Assistance Act
FMS: Foreign Military Sales
GAO: U.S. Government Accountability Office
GEF: Guidance for the Employment of the Force
IPR: interim progress review
J5: U.S. Joint Chiefs of Staff Office of Strategic Plans and Policy
NDAA: National Defense Authorization Act
OASD(SO/LIC&IC): Office of the Assistant Secretary of Defense for Special Operations/Low-Intensity Conflict and Interdependent Capabilities
OSD: Office of the Secretary of Defense
OUSD(P): Office of the Under Secretary of Defense for Policy
TCP: theater campaign plan
USAFRICOM: U.S. Africa Command
USCENTCOM: U.S. Central Command
USEUCOM: U.S. European Command
USPACOM: U.S. Pacific Command
USSOUTHCOM: U.S. Southern Command


CHAPTER ONE
Introduction

The U.S. government has long worked with allies and partners in a security cooperation context to build their capacities to counter a variety of internal and external threats. The security cooperation activities conducted by the U.S. Department of Defense (DoD) range from the very visible (i.e., training, equipping, and exercising with others) to the less obvious, such as workshops and conferences, staff talks, and providing education. Yet, it is challenging to identify how and to what extent these various activities have contributed to U.S. objectives, whether at the national security, department, combatant command (COCOM), or service level. Security cooperation efforts are both long-term and geographically dispersed, and there is currently no comprehensive framework for assessing these efforts.

Security cooperation assessments are important at all levels (policy, program manager, and execution) because assessments support decisions. At the highest levels, assessments provide an organization, such as the Office of the Under Secretary of Defense for Policy (OUSD[P]), with a basis for comparing the effectiveness and efficiency of the many programs it manages. Using these comparisons, assessments can offer informed suggestions for how to improve existing programs or initiate new programs in DoD where gaps are identified. At the program level, assessments assist program and resource managers in making decisions about future security cooperation activities, specifically, whether to cut, expand, alter, or continue current program activities based on their impact. At the execution level, assessments provide important insights into how to most effectively implement activities.

The OUSD(P)-managed Global Train and Equip 1206 Program (hereafter referred to as the "1206 Program") is one such security cooperation program that could benefit greatly from a comprehensive assessment approach. In its April 2010 report on the 1206 Program, the U.S. Government Accountability Office (GAO) recommended that DoD

1. establish a monitoring and evaluation system
2. base sustainment funding decisions on the assessment of results
3. estimate sustainment costs and seek funding commitments from partner nations
4. seek guidance from Congress on how to sustain projects.

DoD concurred with these recommendations.[1]

This report focuses primarily on the first recommendation, the establishment of a system to evaluate the results (or effectiveness/impact) of the program using objective performance measures, monitoring the program's progress in achieving goals, and reporting this progress in annual performance reports.

[1] U.S. Government Accountability Office, International Security: DoD and State Need to Improve Sustainment Planning and Monitoring and Evaluation for Section 1206 and 1207 Assistance Programs, Washington, D.C., GAO, April 15, 2010, p. 1.

Such a system can assist program managers as they develop initiatives that provide appropriate capabilities and capacities that partner nations are likely to employ, and it can help answer key questions regarding the 1206 Program's effectiveness in developing them. The analysis conducted by the RAND team can also inform the second GAO recommendation, as the framework we have developed includes assessing the results, or rather the impacts made on partner nations through 1206 Program activities (what we call outcomes). The latter two GAO recommendations were beyond the scope of this study.

The Global Train and Equip 1206 Program: An Overview

The National Defense Authorization Act (NDAA) for Fiscal Year 2006 established the authority for the 1206 Program. Section 1206 of the NDAA authorizes the Secretary of Defense, with the concurrence of the Secretary of State, to spend up to $350 million each year to train and equip foreign military and nonmilitary maritime forces, including coast guards, to conduct counterterrorist operations or to support military and stability operations in which U.S. armed forces are involved. This authority is set to expire at the end of fiscal year 2011, but Congress may renew Section 1206, as it has in the past.

By law, Section 1206 authority has two discrete uses:

- to build the capacity of a foreign country's national military forces to enable that country to conduct counterterrorism operations or participate in or support military and stability operations in which U.S. armed forces are involved
- to build the capacity of a foreign country's maritime security forces to conduct counterterrorism operations.

A unique characteristic of the 1206 Program is the required coordination between DoD and the U.S. Department of State (DoS). The NDAA regulations established an interagency implementation process. Within DoD, the Office of the Assistant Secretary of Defense for Special Operations/Low-Intensity Conflict and Interdependent Capabilities (OASD[SO/LIC&IC]) is charged with overall responsibility for the 1206 Program. This office coordinates primarily with DoS's Bureau of Political-Military Affairs. All 1206 Program project proposals are considered within this interagency process.

The 1206 Program process includes a number of essential steps. DoD and DoS annually solicit project proposals, which are revised each year in accordance with guidelines and instructions to reflect lessons learned, congressional concerns, and other considerations. The proposal template requires stakeholders, such as country team representatives and COCOM staffs, to include basic assessment information, including how the effectiveness of the project will be measured, the expected time frame for realizing operational impacts, key train and equip milestones, and quantitative and qualitative metrics that will be used to measure the effectiveness of the program. Interagency boards review the proposals, which must be approved by both the relevant U.S. combatant commander and ambassador, and select projects to recommend to the Secretaries of Defense and State for final funding approval. Once projects are fully approved and funded, DoD and DoS, after notifying designated congressional committees, may implement them.[2]

For approved projects, the Defense Security Cooperation Agency (DSCA), operating under guidance from OUSD(P), assumes overall responsibility for procuring training and equipment. Security assistance officers posted at U.S. embassies (who report to both the ambassador and the relevant U.S. geographic combatant commands) are then responsible for coordinating in-country project implementation.[3]

As with any other security cooperation program, the 1206 Program has administrative rules and guidance that must be followed by all 1206 Program stakeholders. First, COCOMs and appropriate embassy officials jointly formulate all 1206 Program proposals. Second, 1206 Program funding is subject to restrictions for sanctioned countries and those with known human rights abuses. Third, 1206 Program funding must be obligated by the end of each fiscal year. Fourth, 1206 Program funding does not support Iraqi or Afghan security forces.[4] Fifth, all 1206 Program proposals undergo a legal, political-military feasibility review.[5] For the purposes of this report, it is worth noting that the current guidance on the program does not, however, mention the need to conduct assessments at the project level.

In general, 1206 Program funding should not be used to

1. backfill DoD or DoS shortfalls
2. fund equipment with extraordinarily long production lead times that will not meet near-term military needs
3. fund activities that are, in reality, counternarcotics missions
4. fund programs that must be continued over long periods (more than three years) to achieve, as opposed to sustain, a partner's capability.[6]

Specifically, equipment, supplies, or training may be provided to build the capacity of a foreign country's national military forces to conduct counterterrorist operations or military/stability operations in which the U.S. military is involved.

The Importance of Assessments

Assessing the impact of security cooperation efforts is inherently difficult, but it is important for providing stakeholders at all levels of government with effective tools to determine which aspects of these investments are most productive and which areas require refinement. Those who plan and execute security cooperation efforts intuitively know whether their individual programs have successfully gained ground with their respective partner nations. At the most basic level, officials assert that the U.S.-partner nation relationship is simply better than it was prior to executing the activity.

[2] Congressional notification is required not less than 15 days before initiating assistance in any country.
[3] U.S. Government Accountability Office, 2010.
[4] Although not expressly addressed in the legislation, it was not the intent of Congress to provide another funding source for Iraqi and Afghan security forces through the 1206 Program authority. However, it is allowable to use 1206 Program funds to train and equip partner nations preparing to deploy to Iraq and Afghanistan in support of U.S. forces.
[5] Discussion with OASD(SO/LIC&IC) officials.
[6] Discussion with OASD(SO/LIC&IC) officials.

Although these assertions appear to ring true, it is difficult to empirically validate such a general sense of accomplishment, especially for senior DoD policymakers and Congress. At present, assessments of security cooperation activities, to the extent that they are done, are largely conducted by the organization that executed the activities. Thus, these assessments, no matter how carefully carried out, are subject to concerns about bias on the part of the assessors. Moreover, self-assessment is even less convincing when conducted by program managers, planners, and executers, who often rotate rapidly, sometimes within the span of a year. Due to the transience of their positions, these individuals do not typically have the experience necessary to understand and evaluate a program's long-term effectiveness in a country. In addition to subject or functional expertise, objectivity and longevity are therefore two important characteristics for participants in the assessment process.

DoD decisionmakers rely on assessments to establish programmatic priorities and determine the allocation of resources. Objective assessments provide valuable data on which to ground meaningful discussions about program funding. As mentioned previously, assessments provide the necessary justification for continuing, expanding, altering, or cutting back on existing programs. Without insight into the current successes and weaknesses of security cooperation programs, it is impossible for DoD officials to make informed decisions about how best to allocate those resources available for their use.

In addition to serving the needs of high-level decisionmakers, assessments provide critical information to those directly involved in the planning and implementation of security cooperation programs. Ultimately, quality assessment of these programs contributes to improved decisionmaking at all stages, including oversight, planning, management, resourcing, and execution, and it has the potential to increase both the effectiveness and the efficiency of security cooperation efforts. This report provides a framework for thinking about, planning for, and implementing security cooperation assessments for the 1206 Program managed by OUSD(P).[7]

Study Objectives and Analytical Approach

RAND was asked by OASD(SO/LIC&IC), within OUSD(P), for assistance in identifying key stakeholders, their roles, and sources of data in support of a comprehensive assessment of the 1206 Program. RAND was also asked to develop the key elements of an implementation plan that would allow the repeated assessment of the program's outcomes and cost-effectiveness.

The study included five main tasks. The first involved identifying current roles, data sources, and ongoing assessment processes through a series of discussions with key policymakers, legislators, and project implementers in the field. As part of the second task, the study team developed and deployed a survey designed to gather information on the roles, processes, and responsibilities of stakeholder organizations in the 1206 Program, as well as to elicit information regarding assessment guidance and skills.

[7] See Jennifer D. P. Moroney, Jefferson P. Marquis, Cathryn Quantic Thurston, and Gregory F. Treverton, A Framework to Assess Programs for Building Partnerships, Santa Monica, Calif.: RAND Corporation, MG-863-OSD, 2009. See also Jennifer D. P. Moroney, Joe Hogler, Jefferson P. Marquis, Christopher Paul, John E. Peters, and Beth Grill, Developing an Assessment Framework for U.S. Air Force Building Partnerships Programs, Santa Monica, Calif.: RAND Corporation, MG-868-AF, 2010, and Jefferson P. Marquis, Richard E. Darilek, Jasen J. Castillo, Cathryn Quantic Thurston, Anny Wong, Cynthia Huger, Andrea Mejia, Jennifer D. P. Moroney, Brian Nichiporuk, and Brett Steele, Assessing the Value of U.S. Army International Activities, Santa Monica, Calif.: RAND Corporation, MG-329-A, 2006.

The third task involved analyzing the survey results to determine which stakeholders were best suited for the collection of data in support of 1206 Program assessments and which stakeholders could potentially conduct such assessments. The analysis was based on the five levels of the assessment hierarchy:

- Level 1, need for the program (or activity), focuses on the problem to be solved or goal to be met and identifies the population to be served and the kinds of services that might contribute to a solution. Once a needs assessment establishes that there is a problem to resolve or a policy goal worth pursuing, different solutions can be considered. This is where theory connects various types of projects or activities to strategic goals.
- Level 2, design and theory, focuses specifically on the design of a policy, project, or activity.
- Level 3, process and implementation, concerns the execution of the elements prescribed by the design and theory in level 2.
- Level 4, outcome and impact, measures the changes (often long-term) that ultimately result from the program's efforts.
- Level 5, cost-effectiveness, assesses the relative value gained from the program as compared with other, similar efforts.

We think of these five aspects, integral to the assessment framework, as a hierarchy, depicted graphically in Figure 1.1. In this hierarchy, a positive assessment at a higher level relies on positive assessments at the lower levels. By positive, we mean that the assessment reveals that associated objectives are being met. Accordingly, problems at a higher level of the hierarchy link to problems at lower levels of the hierarchy. For example, if a cost-effectiveness assessment reveals a problem, examining information from the lower levels of the hierarchy will lead to the root cause of the problem.

In the fourth task of the study, the RAND team collated the findings from the survey analysis and provided those findings and other observations to the sponsor. Based on the survey findings, we developed specific recommendations and identified key elements of an assessment implementation plan for the sponsor's consideration, the fifth task of the study. Particular attention was paid to any identified gaps that might limit the ability of OUSD(P) to assess the 1206 Program.

Figure 1.1. Five Levels: The Assessment Hierarchy. The hierarchy runs, from bottom to top: Level 1, assessment of need for the program; Level 2, assessment of design and theory; Level 3, assessment of process and implementation; Level 4, assessment of outcome/impact; and Level 5, assessment of cost-effectiveness.
SOURCE: Adapted from Rossi, Lipsey, and Freeman, 2004, Exhibit 3-C. Used with permission.
NOTE: For a detailed discussion of the assessment hierarchy, see Paul et al.
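The nesting relationship in Figure 1.1 can be stated operationally: when an assessment at a higher level reveals a problem, the review works downward to the lowest level whose objectives are not being met, and root-cause analysis begins there. The sketch below is purely illustrative of that walk-down logic; the level names follow Figure 1.1, but the ratings dictionary, the function, and the use of Python are assumptions of this sketch rather than part of any actual 1206 Program assessment tool.

```python
# Illustrative sketch of the hierarchy's nesting rule: a positive assessment at
# a higher level relies on positive assessments at all lower levels, so a
# problem found at a higher level is traced to the lowest failing level.
# All names here are hypothetical.
from typing import Dict, Optional

LEVELS = [
    (1, "need for the program"),
    (2, "design and theory"),
    (3, "process and implementation"),
    (4, "outcome and impact"),
    (5, "cost-effectiveness"),
]

def lowest_failing_level(ratings: Dict[int, bool]) -> Optional[int]:
    """Return the lowest level whose objectives are not being met, or None."""
    for number, _name in LEVELS:
        if ratings.get(number) is False:
            return number
    return None

# Hypothetical example: outcomes (level 4) and cost-effectiveness (level 5) look
# poor, but the walk-down shows the root cause sits at level 3.
ratings = {1: True, 2: True, 3: False, 4: False, 5: False}
level = lowest_failing_level(ratings)
if level is not None:
    print(f"Begin root-cause analysis at level {level}: {dict(LEVELS)[level]}")
```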

This report lays the groundwork for a comprehensive assessment of the 1206 Program rather than for various snapshot-in-time assessments of specific 1206 projects around the world. It takes a longer-term view, with the understanding that accurate assessments, especially those focused on outcomes and cost-effectiveness, require time and effort. However, without well-informed knowledge of what does and does not work in terms of security cooperation, program managers and policymakers will be unable to make educated decisions regarding 1206 Program expenditures.

Organization of This Report

Chapter Two provides an overview of the RAND team's security cooperation and assessment construct and explains why assessment is important, how to think about assessments, how security cooperation assessments should be conducted, and the utility of assessment results in informing decisionmaking. Chapter Three discusses the results of focused interviews that the team conducted with key 1206 Program stakeholders in OUSD(P), DSCA, DoS, the COCOMs, the military services, and Congress, identifying five key insights that helped shape the recommendations presented later in this report. Chapter Four examines the results of the 1206 Program survey analysis conducted by the RAND team. Chapter Five presents five key findings and nine key recommendations resulting from the analysis, as well as details on the key components of an assessment implementation plan along two parallel tracks.

It is important to note that this study draws on previous RAND research sponsored by the Office of the Deputy Assistant Secretary of Defense for Partnership Strategy and Stability Operations, the U.S. Air Force, and the U.S. Army and documented in the following publications:

- Developing an Assessment Framework for U.S. Air Force Building Partnerships Programs, by Jennifer D. P. Moroney, Joe Hogler, Jefferson P. Marquis, Christopher Paul, John E. Peters, and Beth Grill, Santa Monica, Calif.: RAND Corporation, MG-868-AF.
- Adding Value to Air Force Management Through Building Partnerships Assessment, by Jefferson P. Marquis, Joe Hogler, Jennifer D. P. Moroney, Michael J. Neumann, Christopher Paul, John E. Peters, Gregory F. Treverton, and Anny Wong, Santa Monica, Calif.: RAND Corporation, TR-907-AF.
- Developing an Army Strategy for Building Partner Capacity for Stability Operations, by Jefferson P. Marquis, Jennifer D. P. Moroney, Justin Beck, Derek Eaton, Scott Hiromoto, David R. Howell, Janet Lewis, Charlotte Lynch, Michael J. Neumann, and Cathryn Quantic Thurston, Santa Monica, Calif.: RAND Corporation, MG-942-A.
- A Framework to Assess Programs for Building Partnerships, by Jennifer D. P. Moroney, Jefferson P. Marquis, Cathryn Quantic Thurston, and Gregory F. Treverton, Santa Monica, Calif.: RAND Corporation, MG-863-OSD.
- Building Partner Capacity to Combat Weapons of Mass Destruction, by Jennifer D. P. Moroney and Joe Hogler, with Benjamin Bahney, Kim Cragin, David R. Howell, Charlotte Lynch, and S. Rebecca Zimmerman, Santa Monica, Calif.: RAND Corporation, MG-783-DTRA, 2009.

CHAPTER TWO
RAND's Assessment Framework

This chapter provides an overview of the RAND team's security cooperation and building partnerships assessment construct and argues for the relevance and importance of assessment for the 1206 Program. This chapter draws heavily on recently published RAND research for the U.S. Air Force and OUSD(P) on security cooperation assessment frameworks, as mentioned in Chapter One, which explain why assessment is important, how to think about assessments, how security cooperation assessments should be conducted, and the utility of assessment results in informing decisionmaking. The chapter begins by explaining the basic rationale for conducting security cooperation assessments and provides some examples of ongoing challenges to assessment in this area. It then explains how the framework connects to the key insights presented later in this report.

What Is Assessment?

Assessment is research or analysis to inform decisionmaking. When most people think of evaluation or assessment, they tend to think of outcomes assessment: Does the object of the assessment work? Is it worthwhile? While this is certainly within the purview of assessment, assessments cover a much broader range. Most assessments require using research methods common in the social sciences. However, assessment is distinguished from other forms of research in terms of purpose. Assessment is fundamentally action-oriented. Assessments are conducted to determine the value, worth, or impact of a policy, program, proposal, practice, design, or service with a view toward making change decisions about that program or program element in the future. In short, assessments must explicitly connect to informing decisionmaking.

Within the action-oriented or decision-support role, assessments can vary widely. They can support decisions to adjust, expand, contract, or terminate a program, project, or activity. They can support decisions regarding which services a project should deliver and to whom. Assessments can also support decisions about how to manage and execute a program or program elements.

Why Assess?

Although some decisions are based on ad hoc or intuitive assessments, others demand assessments that incorporate more extensive or rigorous research methods, particularly when the program involves large resource commitments.

Where there are important decisions to be made and ambiguities exist about the factual bases for those decisions, assessment is the antidote. Across most aspects of government and military activity, there are regular calls for assessments; the 1206 Program is no exception. For example, DoD and DoS are required to submit an annual report to Congress that assesses the status of projects under way or completed during the prior year. COCOMs and component commands also conduct assessments of COCOM theater campaign plans (TCPs) throughout the year. Although these assessments are typically carried out at the TCP objective or country level, 1206 Program resources are an important contributor to many of these assessments. Additionally, in 2010, a new office was established in the Office of the Deputy Assistant Secretary of Defense for Partnership Strategy and Stability Operations within OUSD(P) to develop an approach to assessing the security cooperation programs that OUSD(P) directly manages, including the 1206 Program and several others, such as the DoD Regional Centers, the Counterterrorism Fellowship Program, and the Warsaw Initiative Fund.[1] Answering questions as to the real or anticipated outcome of DoD security cooperation programs has become commonplace throughout the DoD policymaking community.

Challenges to 1206 Program Assessment

We would be remiss not to discuss the challenges to conducting assessments with regard to security cooperation more broadly and the 1206 Program in particular. These challenges are not insurmountable; some are endemic and some are more focused on process. However, it is important to keep them in mind to find solutions or develop workarounds that will enable DoD to implement desired assessment processes.

Determining Causality

Arguably, the biggest challenge confronting assessment in this area lies in trying to identify causality: linking specific 1206 projects and activities to specific progress toward program goals, broader COCOM or U.S. objectives, and end states (outcomes).[2] The abundance of initiatives in the broader realm of U.S. security cooperation in DoS, other DoD programs, the U.S. Agency for International Development, and the Departments of Justice, Homeland Security, Energy, Treasury, and Commerce confounds our ability to assign causality, as do various exogenous factors, such as international politics, global public diplomacy, and partner nations themselves. The best we can hope for at the outcomes level, in many instances, is to find some relationship between success in the 1206 Program and progress in security cooperation focus areas.

Well-Articulated Intermediate Goals to Inform Decisionmaking

As stated previously, assessments are tied to decisionmaking. However, a critical assessment challenge is identifying the information on which decisions should be based. For example, it is fairly intuitive to decide whether or not to continue a project or activity based on whether or not it is achieving its objectives.

[1] See key recommendations made in Moroney, Marquis, et al.
[2] See Marquis, Moroney, et al., 2010, Appendix D.

However, it is analytically very difficult to tell whether something is working when causal connections are conflated with other activities or end states and when goals are very high-level, opaque, or difficult to measure, or when they require that a program or activity contribute only indirectly. Well-articulated intermediate goals to which programs can directly contribute are important for effective program assessment. However, where such goals are lacking, decisions are difficult to support with assessment. This connects directly to one of the five key organizational insights presented in this report: the need to establish measurable objectives that explicitly connect to broader U.S. government, theater, regional, and 1206 Program goals.

Assessment Capabilities of 1206 Program Stakeholders

Effort is required for both the collection of raw data and the analysis needed to produce completed assessments. Resource constraints can adversely affect the quality of data collection. Different organizations have differing levels of preparation and capability for assessment. Some 1206 Program stakeholders have access to personnel who can help with assessment or have sufficient staffing (and foresight) to provide dedicated assessment personnel. Other stakeholder offices are very tightly staffed, with just a few personnel already wearing multiple hats and working long hours before assessment even enters the picture. The 1206 Program is a mixed bag in this regard.

Good assessment planning and assessment matching can ease the resource burden. Relevant personnel will be better able to plan for and complete assessment data collection if they know about it before the period or event for which they will collect data. A single set of coherent assessment data requests requires less time to complete than a host of different and partially duplicative or partially useless assessment data calls. While Chapter Four provides greater detail about current capabilities to collect data and conduct assessment among 1206 Program stakeholders, three needed improvements in this area connect directly to several of the key insights and will help address this challenge: the need to provide formal guidance on assessment processes, the need to establish 1206 Program assessment roles and responsibilities, and the need to set data collection and reporting requirements.

Multiplicity of and Differing Priorities of Stakeholders

The 1206 Program has a host of different stakeholders. Decisions for and about the program and its projects and activities are made by many different organizations and at many different levels. The constellation of stakeholders varies by region and organizational level. Although the inclusion of many stakeholders is not inherently challenging, it can complicate assessments in a number of ways. For example, personnel at the project execution level can have multiple masters with different goals. 1206 Program projects are typically designed by COCOM planners to help support the achievement of TCP objectives, but they are often executed by military service organizations that are focused on compliance with acquisition regulations and technology transfer legislation. These competing goals can complicate assessment when different stakeholders request different but similar assessments using different processes.

Security Cooperation Data Tracking Systems Are Not Currently Organized for 1206 Program Assessment

DoD and U.S. government security cooperation programs and funding are widely dispersed in terms of who is responsible for them. Some security cooperation data are maintained in the COCOMs' respective Theater Security Cooperation Management Information Systems, but not all 1206 Program stakeholders provide input, nor do they all have access to these systems.

COCOMs' respective Theater Security Cooperation Management Information Systems, but not all 1206 Program stakeholders provide input, nor do they all have access to these systems. As a result, it is not clear that a complete, accurate, or current repository of 1206 Program projects and activities and their details (e.g., resources involved, place, duration, frequency) exists. COCOMs have created new databases to track 1206 Program inputs, but generally, these databases are used only for internal purposes and are not tied to assessing the overall effect of the program in specific countries.

Delegating Assessment Responsibilities

There is also the practice, widespread in DoD, of delegating the task of assessment to subordinate organizations. Although this practice may be effective at the upper echelons of OUSD(P), it can cause trouble for multiple reasons. The first problem is that many of the officers and staffers charged with this responsibility lack the skills to design and perform assessments. Often, without an assessment template and a dataset at hand, they must conceive and execute the assessment without any high-level guidance. Even in organizations with appropriately trained staff, the necessary data are rarely fully available, and potential sources are not obvious. Unless 1206 Program guidance documents specify the types of assessments expected from particular commands, agencies, offices, or organizations and outline the steps to collect and organize the supporting information, individual offices will have little choice but to continue the common practice of polling subject-matter experts for their opinions of how various projects and activities have performed. The instinct behind approaching subject-matter or functional experts is correct, but without proper assessment guidance, the effort will be ad hoc at best.

Expectations and Preconceived Notions of Assessment

A final challenge inherent in 1206 Program assessment stems from the expectations and preconceived notions of many stakeholders. There are many different views about what assessment is or should be. Virtually all military officers and senior civilians have some experience with assessment, but usually just a limited slice of what is possible under the broad tent offered by evaluation research. A narrow preconception that assessment is only ever one type of analysis or data collection can be limiting. Further, expectations that assessment adds limited value or that it is acceptable to require assessments to satisfy curiosity (rather than inform essential decisions) can lead to unnecessary evaluations or create resistance to assessment proposals. In fact, assessment is many different things from many different perspectives. Virtually all of these perspectives, provided they pertain to decisionmaking, are captured in the hierarchy of evaluation, described next.

Levels of Assessment: The Hierarchy of Evaluation

Given the explicit focus on assessment for decisionmaking that comes from evaluation research and the necessity of connecting stakeholders and their decisionmaking needs with specific types of assessment, OUSD(P) needs a unifying framework to facilitate that matching process. To fill this need, we present the hierarchy of evaluation as developed by evaluation researchers

Peter Rossi, Mark Lipsey, and Howard Freeman (see Figure 1.1 in Chapter One). 3 The RAND team found this to be the most useful model of those available in the literature. The hierarchy divides all potential evaluations and assessments into five nested levels. Each higher level is predicated on success at a lower level. For example, positive assessments of cost-effectiveness (the highest level) are only possible if supported by positive assessments at all other levels. We elaborate on this concept in the section Hierarchy and Nesting, later in this chapter.

Level 1: Assessment of Need for the Program

Level 1, at the bottom of the hierarchy and serving as its foundation, is the assessment of the need for the program or activity. This is where evaluation connects most explicitly with target ends or goals. Evaluation at this level focuses on solving the problem and determining the goal(s) to be met, the population to be served, and the kinds of services that might contribute to a solution. 4 In the context of the 1206 Program, this level of assessment is perhaps best applied at the project proposal stage. During the proposal review process, Office of the Secretary of Defense (OSD), Joint Staff, and DoS program managers use the following assessment questions:

What is the terrorist threat?
What is the capability gap?
What is the consequence of not conducting the assessment?
Do the training and equipment requested fill the capability gap?

For DoD, the need for the overall 1206 Program is not what is being explored; instead, program managers and other stakeholders might consider assessing the need for individual projects or collections of interrelated projects. For example, consider a project that proposes to provide a partner with a maritime counterterrorism capability. Additional potential assessment questions include the following:

What is the nature and magnitude of the terrorist threat?
What audience, population, or targets does the need apply to (e.g., maritime border patrol, Navy)?
What kinds of services or activities are needed to address the problem?
What existing programs or activities contribute to meeting this goal or mitigating this problem?
What are the goals and objectives to be met through the project?
What are the risks of not taking action?

Evaluation of public policy often skips the needs-assessment level, as stakeholders assume the need to be wholly obvious. This is true broadly in public policy, but also in DoD. Where such a need is genuinely obvious or the policy assumptions are strong, this is not problematic. Where need is not obvious or goals are not well articulated, troubles starting at level 1 in the evaluation hierarchy can complicate assessment at each higher level.

3 Rossi, Lipsey, and Freeman, 2004.
4 Rossi, Lipsey, and Freeman, 2004, p. 76.

Level 2: Assessment of Design and Theory

The assessment of concept, design, and theory is the second level in the hierarchy. Once a needs assessment establishes that there is a problem or policy goal to pursue, as well as the intended objectives of such a policy, different solutions can be considered. This is where theory connects ways to ends. In developing a project proposal to address the maritime security example, one might consider, from a design and theory perspective, the various types of activities that might be undertaken to achieve the objective of providing a counterterrorism capability. Equipment such as small craft, uniforms, small arms, or radios might be desired, as would the training necessary for their operation. Exercises conducted with U.S. forces to demonstrate the capability might be a culminating activity, and the establishment of an ongoing bilateral working group to discuss further progress might also be appropriate. While all of these activities may sound logical, a deeper assessment of the design and the theory behind them is essential. Assessment at this level focuses on the design of the project and its activities and should begin immediately as a way to ensure that the project is sound from the beginning. Analyses of alternatives are generally evaluations at this level and can be useful in determining the best approach. Research questions might include the following:

What types of activities are appropriate for providing and sustaining a counterterrorism capability?
What specific services are needed, in what quantity, and for how long?
How can these services best be delivered?
What outputs (e.g., training, equipment deliveries) need to be produced?
How should the project and its activities be organized and managed?
What resources will be required for the project and its activities?
Is the theory specifying certain services as solutions to the target problem sound?
What is the partner's ability to absorb the assistance?
Will the partner be willing to employ the capability?

The hierarchy bases most of the evaluation questions at this level on theory or on previous experience with similar efforts. This is a critical level in the hierarchy. If project design is based on poor theory, then perfect execution (the ways) may still not bring about the desired results (the ends). Similarly, if the theory does not actually connect the ways with the ends, the program may accomplish objectives other than the stated ones. Unfortunately, assessors often omit this step or restrict their efforts at this level of evaluation by making unfounded assumptions, an issue addressed in more detail later. Once an effort is under way, design and theory can be assessed firsthand. For example, once the maritime equipment has been delivered and training has commenced, assessment questions at this level could include the following:

Are the services provided adequate in duration and quantity?
Is the frequency with which the services are provided adequate?
Are resources sufficient for the desired execution?

Note that assessments at this level are not about execution (i.e., "Are the services being provided as designed?"). Such questions are asked at the next level, level 3. Design and theory assessments (level 2) seek to confirm that what was planned is adequate to achieve the desired

objectives. In other words, the question is not whether the boats have been delivered but rather, "Were boats the right solution to the problem?"

Level 3: Assessment of Process and Implementation

Level 3 in the hierarchy of evaluation focuses on program operations and the execution of the elements prescribed by the theory and design in level 2. A program can be perfectly executed but still not achieve its goals if the design was inadequate. Conversely, poor execution can foil the most brilliant design. In the case of the maritime security example, a well-designed activity to provide small maritime patrol craft could fail to achieve the desired results if not supported by adequate training or plans for sustainment and maintenance. Assessment at this level needs to be periodic and ongoing. In addition to measuring the effectiveness of processes, level 3 evaluations also include an assessment of outputs, i.e., the quantifiable deliverables of an effort. Such evaluations might include an assessment of the timeliness of the Foreign Military Sales (FMS) program's process, as well as the accuracy of the equipment delivered. Possible research questions at this level include the following:

Were the necessary resources made available?
Are the intended services, such as training and equipment, being delivered as designed?
Are process and administrative objectives, such as those of the FMS process, being met?
Is the activity being managed well?
Is the partner nation satisfied with the equipment and training?
Were applicable security assistance manuals and regulations followed?
Are resources being used or consumed as intended?

Level 4: Assessment of Outcome and Impact

Level 4 is near the top of the evaluation hierarchy and concerns outcomes and impact. At this level, the hierarchy translates outputs into outcomes, a level of performance, or achievement. This translation must contain enough detail to explain the path from specific activities to specific capabilities. Put another way, outputs are the products of program activities, and outcomes are the changes resulting from the projects. In practical terms, one might be interested in assessing whether or not the partner nation has actually achieved a counterterrorism capability. Determining how well the combination of equipment, training, exercises, and sustainment provisions has translated to a real capability is the essence of this type of assessment. This is the first level of assessment that examines how well the solutions have addressed the problem that originally motivated the effort. Research questions at level 4 in this example could include the following:

Do the patrol craft and trained crews provide benefits to the recipients in terms of countering terrorism?
Did the training result in proficient crews that are able to engage in counterterrorism operations?
Are program objectives and goals being achieved?
Is the terrorism problem at which the project or activity is targeted being addressed sufficiently?

Level 5: Assessment of Cost-Effectiveness

The assessment of cost-effectiveness sits at the top of the evaluation hierarchy, at level 5. Efforts to assess cost-effectiveness are possible when desired outcomes are at least partially observable. Evaluations at this level are often the most attractive in bottom-line terms but depend heavily on data collected over time at the lower levels of evaluation. It can be complicated to measure cost-effectiveness in situations in which unclear resource flows or exogenous factors significantly affect outcomes. As the highest level of evaluation, this assessment depends on the lower levels and can provide feedback for policy decisions based primarily on the lower levels. For example, if target levels of cost-efficiency are not met, cost data (level 5) in conjunction with process data (level 3) can be used to streamline the process or otherwise selectively reduce costs. Put another way, if the training on small craft procedures is too costly, then perhaps the training process is not efficient (a process issue at level 3). Alternatively, the design (level 2) could be the problem: perhaps the partner simply cannot absorb the training, an issue regarding the design and theory of the program. To repeat an earlier but important point, the levels of assessment are nested, and resolving issues such as cost-effectiveness relies heavily on understanding how the program is functioning at the other levels. Possible level 5 research questions include the following:

How efficient is resource expenditure versus outcome realized?
Is the cost reasonable relative to the magnitude of benefits?
Could alternative approaches yield comparable benefits at a lower cost?

Hierarchy and Nesting

This framework is a hierarchy because the levels nest with each other; solutions to problems observed at higher levels of assessment often lie at levels below. If the desired outcomes (level 4) are achieved at the desired levels of cost-effectiveness (level 5), then lower levels of evaluation are irrelevant. But what about when they are not? When desired high-level outcomes are not achieved, information from the lower levels of assessment needs to be available to be examined. For example, if an effort is not realizing target outcomes, is that because the process is not being executed as designed (level 3) or because the effort was not designed well (level 2)? Evaluators have problems when an assessment scheme does not include evaluations at a sufficiently low level to inform effective policy decisions and diagnose problems when the program does not perform as intended. Assuming away the lowest levels of evaluation is acceptable only if the assumptions prove correct. However, when assumptions are questionable, the best risk-avoidance strategy is to conduct assessments at levels 1 and 2 rather than launching a program that will fail at levels 4 and 5 because the critical lower levels simply will not support overall targets. According to Rossi, Lipsey, and Freeman, programs that fail generally do so because of problems at level 2 (theory) or level 3 (implementation). 5 Good program implementation works only if the underlying program design works.

5 Rossi, Lipsey, and Freeman, 2004, p. 78.
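To make the nesting logic concrete, the short sketch below (in Python) encodes the five levels and the diagnostic rule just described; the example project and its ratings are hypothetical and are not drawn from any actual 1206 Program data.

    # Illustrative sketch of the five-level hierarchy and its diagnostic use.
    # Level names follow the text; the example ratings are hypothetical.
    LEVELS = [
        (1, "Need for the program"),
        (2, "Design and theory"),
        (3, "Process and implementation"),
        (4, "Outcome and impact"),
        (5, "Cost-effectiveness"),
    ]

    def diagnose(ratings):
        """Given {level: True/False/None} results, return the lowest level that
        is not affirmatively positive -- the first place to look when targets at
        the higher levels are not being met."""
        for level, name in LEVELS:
            if ratings.get(level) is not True:
                return level, name
        return None  # positive at every level, including cost-effectiveness

    # Hypothetical maritime counterterrorism project: boats delivered and crews
    # trained (level 3 positive), but the desired capability has not emerged.
    example = {1: True, 2: None, 3: True, 4: False}
    print(diagnose(example))  # -> (2, 'Design and theory'): revisit the design

Consistent with the observation by Rossi, Lipsey, and Freeman that programs generally fail at level 2 or level 3, the sketch directs attention to the lowest level that is not affirmatively positive rather than stopping at the failed outcome.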

Key Insights and the Assessment Framework

As previously noted, this report refers to five key insights that the study team drew from the results of this research effort. These insights were used to guide the development of the assessment implementation plan in Chapter Five. They are as follows:

1. Provide formal guidance on the assessment process.
2. Set measurable objectives that explicitly connect to broader U.S. government, theater, regional, and 1206 Program goals.
3. Establish 1206 Program assessment roles and responsibilities.
4. Set data collection and reporting requirements.
5. Improve coordination with key agencies.

In the following section, we discuss only three of them (numbers 2, 3, and 4) as they relate explicitly and directly to the RAND team's assessment. Providing guidance and improving coordination are important aspects of implementing an assessment framework more broadly but do not relate directly to the assessment hierarchy.

Assessment and Objectives

As noted earlier, clear objectives are critical for successful assessment. Decisionmaking above the program management level relies on assessments at level 4 (outcome and impact) and level 5 (cost-effectiveness). It is virtually impossible to assess the success of an undertaking (outcome) unless the intended objective of that effort is clear. Objectives should be clear at each level of activity and should nest and clearly connect. In the 1206 Program context, this means that broad program goals should be clearly articulated and communicated to stakeholders and, ideally, should connect explicitly to broader security cooperation goals provided by the National Security Strategy, the National Defense Strategy, the National Military Strategy, the Guidance for the Employment of the Force (GEF), and COCOM TCPs and functional campaign plans. Specific projects and activities should, in turn, have clearly stated objectives that connect explicitly to program-level objectives. Clear objectives make higher-level (levels 4 and 5) assessments possible, but they also facilitate lower-level assessments. For example, if objectives are clear and nest from lower to higher levels, design and theory are more likely to be explicit in the connections, enabling more transparency at that level (level 2). In this way, effective assessment guidance connects to and hinges on broader program guidance regarding program objectives and the establishment of objectives for program projects and activities.

Assessment Roles and Responsibilities

In general, there are four functional assessment roles that need to be performed with respect to 1206 Program assessment. The following are definitions for the four stakeholder assessment functions identified in prior RAND research: 6

6 See Moroney, Hogler, et al., 2010.

Data collector. Responsible for collecting and aggregating data for a particular kind of assessment from internal and external sources according to standards set by the assessor organization.

Assessor. Responsible for setting data collection standards for a particular kind of assessment and for conducting evaluations using methods suitable for the types of assessment being performed.

Reviewer. Responsible for helping assessors to develop data collection standards and evaluation methods that are appropriate for the type of assessment being performed, as well as for conducting periodic inspections or audits to ensure that assessments are being properly executed.

Integrator. Responsible for organizing and synthesizing programmatic assessments to meet assessment requirements and feed into decisionmaking processes.

We intend for these roles to help guide assessment behavior, not to restrict the range of assignments that a particular organization or office is allowed to undertake. In many cases, assigning specific responsibilities to particular organizations will require looking beyond traditional practices. In particular, it is important to pay close attention to an organization's capabilities, especially its resources, expertise (in terms of both subject matter and function, e.g., acquisition, accounting, legal), proximity, and opportunity. It is also critically important to note its objectivity, i.e., the extent of its interest in specific assessment results. This consideration will facilitate a move away from the current, largely ad hoc self-assessment approach. There are some key principles in assigning stakeholder assessment roles (a brief illustrative sketch follows this list):

Delineate assessment responsibilities across several stakeholders to account for different levels of organizational authority and expertise and to inject as much objectivity into the process as possible.

Identify a single organization with a close connection to the activity at hand to be ultimately responsible for gathering and collating assessment data. Note, however, that data collection will often involve a number of individuals and organizations from different parts of DoD (and even from outside).

Recognize that, in some cases, the data collector and the assessor will be the same individual; more likely, these positions will be held by persons in the same organization.

Ensure that the assessor and the reviewer are not the same person; they may be within the same organization, but this is not ideal.

Ensure that reviewers (especially) and integrators pay careful attention to which data are collected and which attributes are selected as outputs and outcomes, lest attributes be designed to fit what the program has done and not necessarily its goals.

Maintain strong linkages between integrators and program stakeholders to develop as much standardization as possible and to foster clarity on best practices in security cooperation assessment. Integrators should develop mechanisms for storing assessment information (so that it is available to as wide a group of program stakeholders as possible) and synthesizing this information for various decisionmaking purposes.
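As a purely illustrative aid, and not a prescribed tool, the brief sketch below encodes two of the principles above as automated checks on a hypothetical role assignment; the offices and names are invented for the example.

    # Illustrative check of stakeholder role assignments against two of the
    # principles above: a single office ultimately responsible for collating
    # data, and an assessor and reviewer who are not the same person.
    from dataclasses import dataclass

    @dataclass
    class Assignment:
        role: str           # "data collector", "assessor", "reviewer", "integrator"
        person: str
        organization: str

    def check(assignments):
        issues = []
        collectors = [a for a in assignments if a.role == "data collector"]
        if len(collectors) != 1:
            issues.append("Exactly one office should be ultimately responsible "
                          "for gathering and collating assessment data.")
        assessor = next((a for a in assignments if a.role == "assessor"), None)
        reviewer = next((a for a in assignments if a.role == "reviewer"), None)
        if assessor and reviewer:
            if assessor.person == reviewer.person:
                issues.append("The assessor and the reviewer must not be the same person.")
            elif assessor.organization == reviewer.organization:
                issues.append("Assessor and reviewer share an organization; "
                              "acceptable, but not ideal.")
        return issues

    # Hypothetical assignment for a single notional project.
    print(check([
        Assignment("data collector", "Officer A", "COCOM country desk"),
        Assignment("assessor", "Analyst B", "COCOM assessment cell"),
        Assignment("reviewer", "Analyst B", "COCOM assessment cell"),
        Assignment("integrator", "Staff C", "OSD program office"),
    ]))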

Data Collection and Reporting

It is not enough to specify who has a data collection role. It is also important to clarify what data are needed, how to collect and format the data, where they should be sent, and when they are due. To function optimally, such collection must be structured, and it must be mandatory. When data are late or absent, or are collected or reported in an inconsistent manner, the assessor's job becomes more difficult (if not impossible).

Conclusion

Assessment is a challenging but critical tool to effectively support decisionmaking within and about the 1206 Program. The framework presented here is intended to aid the assessment process by breaking down a complex task into its various components. The discussion of the framework illustrates how to take a theoretical structure and tailor it to a specific program by linking it with concrete program objectives. Specifically, 1206 Program assessment processes need to include measurable objectives, the clear assignment of assessment roles and responsibilities, and data collection and reporting requirements in support of all five levels of the hierarchy of evaluation.
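Returning to the data collection and reporting point above: a structured, mandatory data call specifies what, how, where, and when. The fragment below is purely notional (the project, items, and due dates are invented) and simply shows how such a specification could be written down and checked for completeness.

    # Notional data-call specification for a single hypothetical 1206 project;
    # every field value here is invented for illustration.
    DATA_CALL = {
        "project": "Notional maritime counterterrorism project",
        "required_items": [
            "equipment delivery receipts",
            "training completion reports",
            "after-action reports",
        ],
        "format": "structured template, one record per activity",
        "send_to": "designated data collector for the project",
        "due": "30 days after each activity; consolidated submission quarterly",
        "mandatory": True,
    }

    def missing_items(submission, spec=DATA_CALL):
        """Return the required items absent from a submission, so late or
        missing data can be chased before assessment begins."""
        return [item for item in spec["required_items"] if item not in submission]

    print(missing_items(["equipment delivery receipts", "after-action reports"]))
    # -> ['training completion reports']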


CHAPTER THREE

Exploring the Options for 1206 Program Assessment: Discussions with Key Stakeholders

Most policymakers, military and defense planners, and project implementers perceive the 1206 Global Train and Equip Program as a unique security cooperation program. Its focus on urgent and emergent threats and building partner-nation capabilities for counterterrorism and stability operations has created new opportunities for partner engagement and led to closer interagency cooperation. In the four years since the inception of the 1206 Program, primary stakeholders have adopted new roles and responsibilities for implementing the program. Moreover, although they have also begun to develop methods of internal data collection and assessment, they have not participated in a comprehensive, program-wide assessment. To understand better how the 1206 Program is currently implemented and to identify stakeholders and sources of data that could be used to support future assessments, the RAND study team interviewed stakeholders drawn from a broad range of U.S. government participants at various levels. The information gained from these interviews was then used to develop the structured assessment survey discussed in Chapter Four. This chapter describes how the team conducted the interviews and provides a summary of stakeholder perceptions of the strengths and weaknesses of the 1206 Program and its current implementation process. It then presents stakeholder observations as they relate to the insights derived from the assessment framework.

Stakeholder Interviews

Members of the RAND study team conducted a series of 14 semistructured interviews with key stakeholders in the U.S. government to ascertain current roles in the 1206 Program, as well as to identify data sources and existing assessment processes for the program. The research team conducted in-person and telephone discussions with key individuals engaged in the planning, administration, and oversight of the 1206 Program. Interviewees included representatives from five geographic COCOMs, 1 U.S. Special Operations Command, OUSD(P), DSCA, the U.S. Joint Chiefs of Staff Office of Strategic Plans and Policy (J5), U.S. Army Security Assistance Command, the Navy International Programs Office, the Office of the Deputy Under Secretary of the Air Force for International Affairs, the Bureau of Political-Military Affairs in DoS, and four congressional committees. A total of 30 individuals participated in 14 separate

1 U.S. Africa Command (USAFRICOM), U.S. Central Command (USCENTCOM), U.S. European Command (USEUCOM), U.S. Pacific Command (USPACOM), and U.S. Southern Command (USSOUTHCOM).

interviews (see Appendix B). We do not attribute interview responses to specific offices or individuals in this chapter. During the interviews, we asked stakeholders to describe their roles, responsibilities, and experience with the 1206 Program. We asked both planners and implementers about the objectives of the 1206 Program projects with which they were involved and how they determined whether they were achieving these objectives. We also inquired about the aspects of 1206 Program projects that they found to be most successful based on their personal experience, as well as those aspects that posed the greatest challenges. In addition, we requested that stakeholders offer their suggestions for developing and implementing a new assessment process. The results of these interviews were used to inform both the structured online survey that is described in Chapter Four (and presented in Appendix A) and the overall conclusions and recommendations presented in Chapter Five.

Stakeholder Perceptions of the 1206 Program

During our interviews, 1206 Program stakeholders identified a number of important issue areas, discussed in this section. Stakeholders from OUSD(P), DSCA, DoS, the COCOMs, and the services expressed a common understanding of the goals and objectives of the 1206 Program. They emphasized the program's focus on urgent and emergent threats as well as building partner-nation capacity for counterterrorism and stability operations. Most noted that the program is supposed to reflect COCOM needs while maintaining a narrow focus on activities that directly support current U.S. security objectives. Moreover, nearly all stakeholders commented that the program required increased cooperation between agencies (particularly between DoS and DoD). They noted, however, that there are both strengths and weaknesses in the current implementation of the 1206 Program, particularly regarding the one-year timeline, program sustainability, the process of project design and selection, and interagency cooperation. Many stakeholders also commented that although they believed that the implementation process had improved over time, there was no formal assessment procedure in place to evaluate either the implementation process or program outcomes. Although some offices or commands have conducted internal assessments, they are typically informal, not comprehensive, and not shared with other stakeholders. Stakeholders provided a number of suggestions, described at the end of this chapter, for developing an effective, comprehensive assessment framework that could build on existing mechanisms for data collection.

Short Timeline Allows Faster Impact but Creates Administrative Difficulties

Stakeholders stated that the short timeline dictated by the 1206 Program's legal authority creates both positive and negative effects. The requirement that 1206 Program funding be obligated by the end of the fiscal year forces offices to prioritize tasks related to 1206 Program execution, enabling 1206 Program projects to be implemented relatively quickly. They noted that this accelerated implementation has, in turn, allowed the 1206 Program to have a more immediate effect on partner-nation capabilities than other initiatives funded by FMS or other security assistance programs.
COCOM stakeholders noted that while other security assistance projects often take three to four years to process, under the 1206 Program, partner nations have received equipment such as night-vision goggles and communication hardware in less

than a year. COCOM stakeholders also noted that the focus on urgent and emergent threats as a guideline for 1206 Program funding facilitates partnership capacity-building projects that would not otherwise be possible under other authorities, in terms of both the type of equipment and training provided and the selection of recipient countries. For example, countries that otherwise would not have received priority for security cooperation funds, such as those facing emerging threats in Africa, have received much-needed training and equipment through 1206 funds. Additionally, the 1206 Program facilitated the provision of equipment for counterterrorism missions and training for stability operations in Afghanistan that might not otherwise have been available to partner nations. At the same time, most stakeholders observed that the abbreviated timelines of the 1206 Program could create significant administrative difficulties. Short deadlines often leave less time for planning and can lead to sloppiness in the preparation of project proposals, according to those who review and implement the proposals. The process for executing 1206 Program projects follows that of the pseudo-case system used in the FMS program, which is not designed to function within the single fiscal-year time frame required by the 1206 Program authority. The complexities of the approval and acquisition process require DSCA and the services to request special legal exemptions to the standard contracting process for many of the programs selected for funding. Additionally, the services may face difficulties in facilitating equipment orders within the fiscal-year time frame, particularly when aircraft or military vehicles are involved. These items require significant groundwork in developing and validating cases and completing the contracts required to transfer them to partner nations. The short time frame also makes staffing to support the 1206 Program a challenge. Stakeholders at all levels noted the difficulty in accommodating the increased workload necessary to administer the program without additional staff, and 1206 Program roles are therefore treated as an additional temporary duty by most offices. Long-term civilian employees able to observe 1206 Program projects over time may alleviate this problem and provide continuity. Some service representatives questioned whether the workload was out of proportion to the dollar value of the program appropriation.

Lack of Funding for Maintenance and Support Creates Concerns About Sustainability

The sustainability of 1206 Program funding was a concern raised by many stakeholders during our interviews. One COCOM representative indicated that, because the focus on counterterrorism is particularly relevant to both the United States and many partner nations, participating partner nations have a greater interest in maintaining new capabilities acquired through the 1206 Program, as compared with other types of capabilities or materiel. 2 However, stakeholders also noted that the absence of a requirement for partner nations to contribute their own resources for the training and equipment is a potential impediment to the sustainability of the 1206 Program, as some partner nations lack both the commitment and the resources required to support new capabilities over time.
Several stakeholders also expressed concern that once partner nations receive 1206 Program funding, they may develop an appetite for training and equipment that is not sustain-

2 When the 1206 Program commenced in 2006, there was no provision for additional funding for maintenance and training beyond the initial award. Although the use of follow-on funding for maintenance was approved beginning in 2007, many stakeholders nonetheless expressed concern about a potential lack of dedicated funding for maintenance and support by the 1206 authority, which, in turn, would limit the long-term sustainability of efforts to build partner-nation capacity.

able through other funding mechanisms, leaving partner nations frustrated when faced with uncertainty about funding from year to year. For some partner nations, 1206 Program funds represent a small percentage of overall U.S. security assistance, but for others, particularly African nations, 1206 Program funds are a primary source of security assistance. Other stakeholders noted that the 1206 Program's expedited delivery schedules have given partner nations unrealistic expectations for rapid implementation of other U.S. security assistance programs, including FMS.

The Project Design, Selection, Prioritization, and Assessment Processes Are Somewhat Unclear

When asked about the process for designing and selecting individual projects, a number of stakeholders indicated that U.S. security objectives are a primary consideration. Noting this as a strength, they commented that the United States has greater influence over the type of equipment and training provided through the 1206 Program as compared with other programs that accommodate partner-nation interests, such as FMS. This, in turn, allows the United States to build capabilities through the 1206 Program that support U.S. national security objectives. 3 Despite general agreement among stakeholders that 1206 Program spending directly supports U.S. national security objectives, some of those interviewed expressed confusion over the specific definition of those objectives and how, in turn, individual 1206 Program projects are selected. Some stakeholders indicated that they did not know which policy and planning office defines priorities for 1206 Program funding, nor had they seen a written articulation of those priorities. Some OUSD(P), DSCA, and COCOM stakeholders also noted that the process of project selection has changed over time, with fewer project proposals originating from the COCOMs; however, those that are submitted are more in line with OUSD(P) guidance. 4 A small number of the stakeholders we interviewed remarked that policy planners in the Pentagon steered 1206 Program project funding toward countries that are a high priority for policymakers. 5

Interagency Cooperation Has Improved, but Data Sharing Is Limited

Nearly all stakeholders believed that communication and collaboration between DoD and DoS had improved as a result of the 1206 Program. Many noted that the dual-key nature of the 1206 Program authority requires greater interagency cooperation at various levels, and several remarked that this was, in fact, the most successful aspect of the program. Some stakeholders commented that since the initiation of the 1206 Program, they are "always on the phone" with their counterparts in other agencies and now meet regularly through interagency working groups. One model of improved coordination noted by a COCOM representative was the creation of a contract integrator position in the field. This contracted position is dedicated to supporting security cooperation officers in overseeing the training and equipping of programs

3 One stakeholder explained that high-end capabilities, such as the F-16, are often at the top of partner-nation shopping lists but are not as useful as helicopters for building counterterrorism capabilities.
4 After OUSD(P) issues its annual guidance for project proposals, COCOMs submit proposals for 1206 Program funds.
These proposals are reviewed and prioritized in separate but parallel processes by a working group of program administrators in OASD(SO/LIC&IC) and the Bureau of Political-Military Affairs in DoS.
5 Lebanon and Yemen were two examples cited in multiple interviews.

in various partner nations. The contract integrator helps ensure communication between the facilitators of different 1206 Program projects and between the COCOM, the services, and DoS. Still, many stakeholders expressed a need for better coordination in the contracting process and greater transparency in program reporting. For example, COCOMs are rarely notified when the equipment they have requested has been shipped, which has occasionally resulted in the mistaken return of unclaimed items. For its part, OUSD(P) is rarely notified whether equipment or training has been delivered, let alone informed as to whether a given project has met its stated objectives. Several stakeholders noted the need for better coordination as an issue that negatively affects current efforts to conduct assessments. When asked whether their individual offices conducted any form of internal assessment, several OUSD(P), DSCA, COCOM, and service representatives indicated that they had created informal systems to track outputs or process milestones within their areas of responsibility (AORs). These assessments (or data collection initiatives, as we would classify them) tended to focus on monitoring internal departmental procedures rather than evaluating project outcomes. Some stakeholders noted that recent efforts have been made to improve data sharing through conferences, workshops, and data portals; however, there was no formal mechanism for coordinating these ad hoc processes. Current data collection requirements, to the extent that they exist, do not require 1206 Program stakeholders to share any of the data that they collect with other stakeholders, contributing to a lack of transparency among participating offices.

Stakeholder Observations Relative to Insights of Assessment

Although most stakeholders shared a general perception that execution of the 1206 Program has improved over time, there was also broad recognition of the need for a more formal assessment process, both to substantiate the continued need for the program and to identify ways to improve its execution. The study team asked stakeholders to share their thoughts on what information a future program-wide assessment might capture and which stakeholders would be best suited to provide the necessary data. Although interviewers did not introduce the framework described in Chapter Two, we have structured the subjects' responses to illustrate the insights derived from the assessment framework.

Guidance on Assessment Is Needed

Stakeholders agreed on the urgent need for a comprehensive, multilayered assessment of the 1206 Program. Most of those interviewed were aware of (and many had contributed to) reviews of the 1206 Program conducted by congressional delegations or external research agencies, such as GAO, the Congressional Research Service, and the Center for Naval Analyses, as well as the interagency evaluation conducted by the DoD and DoS Inspectors General. However, OUSD(P), DSCA, congressional, and COCOM representatives noted that most of the published reports focus on a small number of high-profile projects in Lebanon, Pakistan, Sierra Leone, and Yemen and thus do not constitute a program-wide assessment. Similarly, congressional reports provide snap assessments and anecdotal information, but they are based on qualitative observations rather than persistent data collection.

Stakeholders recognized that establishing a comprehensive assessment structure would be a complex process. They noted that assessments need to achieve several objectives:

to measure the success of the 1206 Program authority in achieving national security priorities
to assess the quality and timeliness of the implementation and execution of 1206 Program processes and activities
to measure the impact of the individual projects that make up the larger authority.

Such an assessment would require input from all stakeholders, as well as formal guidance on objectives, assessment roles, data collection, and coordination between agencies.

Measurable Objectives Are Required on Multiple Levels

A number of stakeholders commented on the complexity involved in creating metrics for success for the 1206 Program. COCOM, congressional, OUSD(P), and DSCA representatives pointed out the inherent difficulty of measuring the success of counterterrorism measures, which are largely preventative actions. COCOM stakeholders also mentioned that there may be barriers to tracking partner-nation activity (including partner-nation secrecy), as well as to evaluating the sustainability and long-term strategic impact of specific 1206 Program projects. Many therefore observed that measures of effectiveness would need to be tailored to each stakeholder engaged in the different stages of the 1206 Program, and they offered a number of suggestions for the various types of objectives that might be included in future assessments. Such suggestions included counting the number of engagements in which a particular 1206 Program-funded capability is employed, evaluating whether a country is willing to accept additional training, and determining the extent to which an urgent or emergent threat to the United States has been addressed. Stakeholders from OUSD(P), DSCA, the COCOMs, and the services provided examples of baseline data that would be required to determine whether the program is meeting its immediate goals. Sources included checklists as to the timeliness and quality of equipment delivered or training executed, as well as reviews to determine whether partners are using the equipment and training for their intended purpose and how often (for example, the frequency of use of specific equipment against a target or whether the partner nation has used its new competency in counterterrorism missions). In some cases, it might be possible to measure the number of "bad guys" killed or captured, the number of terrorist operations disrupted, or the number of weapons caches recovered. In the case of training for stability operations, it would be possible to determine whether a partner nation was able to deploy on time as planned and whether it used its new competency during that deployment. For specific missions or skills, such as improvised explosive device detection or forensic exploitation, it could be possible to track the number of missions from which the team returned without any casualties. Some COCOM stakeholders noted that obtaining data on partner-nation military activities can be difficult in regions in which U.S. forces are not engaged (and therefore do not have visibility in the field) or where local sovereignty is contested. It is often difficult, for example, to determine how a nation is using radar equipment provided by the United States or whether that radar contributes to counterterrorism missions.
Stakeholders also provided suggestions for measuring the indirect effects of the train and equip program, such as whether a partner nation is more willing to engage with the United

States as a result of participation in the 1206 Program or whether its operational relationship with the United States has improved. This could be determined by evaluating whether a country has become more willing to accept training or conduct U.S.-supported counterterrorism or stability operations because of participation in the 1206 Program. Stakeholders commented that such information is not easy to track or quantify. One can measure longer-term impacts by the extent to which a partner nation is willing to sustain its new capabilities (as demonstrated by committing its own funds for maintenance and spare parts, for example) and the extent to which 1206 Program-funded equipment or training is integrated into a partner nation's military force. Some stakeholders commented on the importance of considering whether a partner nation has adjusted its doctrine to include the related mission. One COCOM representative suggested evaluating 1206 Program projects on how well they integrate with other security assistance and development efforts undertaken by DoD, DoS, and nongovernmental organizations. Many stakeholders suggested that strategic-level measures be included, such as the extent to which 1206 Program projects address urgent or emergent threats identified in the program submissions. For example, has a fire been put out? Has there been a reduction in the operating space of a threat? Has there been a strategic shift as a result of the 1206 Program project? We assert that these examples would also be notably difficult to quantify and to apply consistently across geographic regions.

1206 Program Assessment Roles and Responsibilities Will Require Clarification

Some of the informal assessment mechanisms that have been developed by the various offices engaged in 1206 Program projects could form the basis for a robust assessment process. Stakeholders indicated the potential utility of using existing data on proposal tracking, project milestones, equipment deliveries, and partner-nation capabilities in support of a more formal, program-wide assessment. However, the roles and responsibilities for these assessments would need to be expanded and defined more clearly. Stakeholders across all offices recognized the need for comprehensive, program-wide assessments, yet nearly all of those interviewed expressed concern about the lack of time available to assume the new responsibilities that such an assessment might require. The implementation of the 1206 Program itself has created additional work for security assistance officers, and assessments, in particular, take a great deal of time. OUSD(P), DSCA, service, and COCOM representatives commented that any new assessment responsibilities given to current stakeholders would need to be sufficiently flexible to avoid generating additional stress for the staff involved. Although some stakeholders indicated that the COCOMs and country teams would have a primary role in conducting individual project assessments, most agreed that the responsibility for strategic assessment should be widely shared among the various offices engaged in the program. Several stakeholders also expressed concern about the lack of training in either current procedures or a future assessment process.
OUSD(P), DSCA, and service stakeholders, in particular, commented that few of those engaged in train and equip programs had specific training in security assistance or assessment processes, while others indicated that any future assessment responsibilities would require additional personnel training.

New Data Collection and Reporting Requirements Are Needed

Stakeholders acknowledged that new guidelines on data collection and reporting requirements would be necessary to begin conducting a more comprehensive program assessment. They noted that, although current data collection occurs on an ad hoc basis, OUSD(P), DSCA, the COCOMs, and country team officials do collect data regularly. A number of stakeholders also mentioned the importance of developing a means of capturing the anecdotal information obtained in the field. Although anecdotes are not easily quantifiable, such information is critical to assessing partner-nation capabilities and willingness to engage in counterterrorism and stability operations. COCOM, DoS, congressional, OUSD(P), and DSCA stakeholders also emphasized the need to maintain consistent data collection structures over the course of several years to determine the long-term impact of 1206 Program projects. They stated that new reporting requirements would be necessary to ensure the sharing of data between agencies in a timely manner.

Improving Coordination with Key Agencies Is Critical

The creation of a consistent feedback loop would be helpful in implementing an effective assessment process, and stakeholders noted the need to improve information flows between agencies. One stakeholder commented that inefficient feedback loops perpetuate information asymmetry. Providing a means for sharing data on a regular and timely basis would not only improve the quality of program assessments but would also ultimately enhance program planning and execution.

Conclusion

In conducting the stakeholder interviews, the research team's intent was to gain a baseline understanding of the 1206 Program to facilitate the design of a formal survey to identify potential sources of data and potential assessors. The interviews with stakeholders were highly beneficial in clarifying the 1206 Program's current implementation processes and in identifying areas in which stakeholder views on the program differ. Indeed, the results of the formal survey, described in Chapter Four, elaborate on many of the issues raised here, particularly concerns about how a future assessment process would be incorporated into current staff responsibilities. Although the stakeholders interviewed held varying opinions about the design and implementation of the 1206 Program, all agreed that there is a need for a comprehensive assessment process. The next chapter explores the survey responses with regard to data collection and potential roles and responsibilities for assessors.

CHAPTER FOUR

Applying the Framework to the 1206 Program: Results of the Survey

Every security cooperation program has multiple stakeholders, often scattered across multiple organizations at different levels and with different priorities and perspectives. In the case of the 1206 Program, we characterize the primary stakeholders as belonging to one of three levels: OSD (including DSCA), COCOM, or the services. Within OSD, one finds overall policy development and program management. While the focal point at this level is the OASD(SO/LIC&IC) 1206 Program team, DSCA also plays a key role in implementing the program. At the COCOM level, COCOM planners and country-specific security cooperation officers design and implement 1206 Program projects and activities. Finally, at the service level, security assistance agencies, such as the Air Force Security Assistance Center, and headquarters planners oversee acquisition and training associated with the projects. Non-DoD stakeholders may also have information about 1206 Program projects and sometimes even participate in activities or provide other resources that contribute to the overall program. To understand the roles that these various stakeholders play in the 1206 Program and, more importantly, what roles they could play in project assessments, the RAND study team collected information from stakeholder representatives through an online survey conducted in August. The survey was designed to gain specific insights into current and potential assessment capabilities. This chapter begins by describing the analytical approach used by the RAND team to gather and examine information regarding 1206 Program assessments. Then, it presents a set of overall findings and concludes with a discussion of the implications for 1206 Program assessments.

Survey of 1206 Program Experts

The team used two separate approaches to ascertain the extent to which the security cooperation assessment approach detailed in previous work for the Air Force and for OUSD(P) was feasible for employment by the 1206 Program community. In addition to the interviews described in Chapter Three, the team conducted quantitative supporting analyses based on aggregate responses to a structured survey, titled Section 1206 Global Train and Equip Request for Expert Feedback. (The survey is reproduced in full in Appendix A.)

Constructing the Survey

Developing a survey that could elicit data from real stakeholders required the study team to move from the abstract nature of the evaluation hierarchy to a more concrete framework that described the program's essential parts and their relationships. To do this, the study team drew on a previously developed RAND approach, which addressed the assessment of Air Force security cooperation programs. 1 This approach resulted in a survey that asked questions about current program tasks, as opposed to theoretical assessment tasks. In addition, our survey drew on 1206 Program guidance and insights gained from interviews with various stakeholders as a way to ensure that the terminology was familiar to the respondents. The survey questions were grouped into four broad areas, derived from and connected to the hierarchy of evaluation, in which a stakeholder might be involved:

Process Implementation. Some stakeholders carry out specific tasks as assigned by program managers. These tasks might include organizing an event or providing subject-matter expertise, establishing contracts, accounting for funds, or processing documentation required by Air Force instructions or other directives.

Process Design and Development. Other stakeholders participate in the design or development of processes, carrying out such activities as developing lesson plans, contracts, or event agendas.

Program Recommendations. Some stakeholders make recommendations to program managers about the size of, scope of, or need for the program or a specific activity.

Program Decisions. Still other stakeholders make decisions regarding specific activities, the need for the program, or the scope of the program.

Questions in each category allowed us to categorize respondents and connect survey responses back to the broader assessment framework. The survey employed terms and references that were familiar to the respondents and asked concrete questions about the activities in which the respondents were currently engaged. By mapping the answers back to assessment roles, we identified where there was potential for assessment capability and where there were possible gaps. We then developed insights to inform OUSD(P) efforts to implement a comprehensive assessment framework, as described in this report. We also reviewed guidance documents to identify the types of documentation and data collection required for specific 1206 Program projects. Table 4.1 shows the types of data that could be collected and assessed across an entire program. The documents listed in the table were either specifically cited in OUSD(P) guidance or mentioned by stakeholders during our interviews. We approached the construction of these questions with the understanding that program guidance forms the basis for assessment. In other words, program guidance documents the need for a program and its objectives and often tells program managers how to design and implement their programs. Although OUSD(P) publishes formal program guidelines and lessons learned, which are distributed formally through the Joint Staff to the COCOMs, not everyone in the 1206 Program stakeholder community is aware of this guidance. Stakeholders are familiar with e-mailed instructions from OUSD(P), however.

1 See Moroney, Hogler, et al., 2010, and Marquis, Hogler, et al., 2010.

Table 4.1
Types of 1206 Program Data That Can Be Collected, by Topic

Demand: Acquisition of equipment, supplies, or spares; Section 1206 Program proposal; After-action reports; Background material; Bills of lading or other receipts for the delivery of goods; Briefings; Budget allocation memo; Budget projection; Budget request; Contract award; Contract performance; Contract preparation; Country desk officers; Discussion with partner-nation representatives; Discussions with country team members; Exchange agreement; Intelligence sources; International agreements; Lesson plans or other training material; Letter of request/letter of acceptance; Master data agreement; Meeting or conference agenda; Memorandum of agreement; Memorandum of understanding; Own observations/trip report; Participant entry or exit testing; Project agreement/arrangement memos; Project proposal; Proposal development; Quality assurance reports; Reports to Congress; Request for disclosure authorization; Request for fund cite; Requirements or proposal prioritization document; Security plan; Site survey reports; Status, update, IPR, or similar briefings; Training quotas; Training reports; Travel orders; U.S. military trainers; Visit request

Resources: Acquisition of equipment, supplies, or spares; Section 1206 Program proposal; Background material; Briefings; Budget allocation memo; Budget projection; Budget request; Contract preparation; Lesson plans or other training material; Letter of request/letter of acceptance; Master data agreement; Meeting or conference agenda; Project proposal; Proposal development; Request for disclosure authorization; Request for fund cite; Requirements or proposal prioritization document; Security plan; Site survey; Visit request

Costs: 319 funds transfer; After-action reports; Annual report; Budget allocation memo; Budget projection; Budget request; Embassy concurrent cable; FAA Section 505 agreement; Financial reports; Human rights vetting request; Loan agreement; Monthly report; Own observations/trip report; Periodic financial report; Program proposal; Programmatic review; Progress report; Project final report; Quality assurance report; Quarterly obligation report; Releasability request; Reports to Congress; Request for fund cite; Requirements or proposal prioritization document; Status, update, or IPR report; Test and disposition report; Training report; Travel orders; Travel vouchers

Table 4.1 Continued

Objectives: After-action reports; Annual report; Certification to Congress; Conference, subject-matter expert exchange, or roundtable discussion; Contract award; Contract performance; Data exchange annex; Delegation of disclosure authority letter; Delivery of equipment; Discussion with country desk officers; Discussion with partner-nation representatives; Embassy concurrent cable; End-of-tour report; Equipment installation; Extended visit authorization; FAA Section 505 agreement; Financial reports; Guest lists; Human rights vetting request; Informal discussions; Information exchange agreement; Interim tour report; International visit request; Letter of acceptance; Letter of request; Meeting minutes/summary; Monthly report; Own observations/trip report; Nomination package; Program proposal; Programmatic review; Progress report; Project final report; Project quarterly report; Quality assurance report; Quarterly obligation report; Quid pro quo analysis; Releasability request; Reports to Congress; Request for foreign disclosure authorization; Request for human rights vetting; Requirements or proposal prioritization document; Security plan; Status, update, or IPR briefing; Test and disposition report; Training quotas; Training report; Travel orders or invitational travel orders; Trip report; Visit request

NOTE: FAA = Foreign Assistance Act. IPR = interim progress review.

Without formal guidance regarding data collection and assessment, we relied on the results of discussions with 1206 Program managers and other stakeholders to develop the set of potential data sources.

Program Guidance

While OUSD(P) and DSCA do provide guidance, it is often supplemented by stakeholders' organizational handbooks, checklists, operating instructions, and other documents that capture local procedures and ensure continuity as programs change hands. COCOM project managers and service-level stakeholders may, because of their close proximity to project activities and events, have much greater insight into a program's true workings than is outlined in top-level guidance. This collective body of documentation, both formal and informal, forms a more complete source for both data collection and assessment of 1206 Program project-level activities, and, as such, it is important to understand what it comprises and how extensive it is.

We asked survey respondents to comment on the availability of guidance documents, as well as to describe the documents that they develop themselves. We asked respondents specifically about formal guidance, as well as internal operating procedures and other program-specific documents that would guide the conduct of the program.

Survey Respondents

In close collaboration with the study sponsor, the RAND team identified key 1206 Program stakeholders who would be invited to take the survey. Approximately 40 percent of the invited participants submitted surveys pertaining to 1206 Program projects. Specifically, 56 of the

identified stakeholders responded, a very strong response rate for a survey of this kind. 2 In addition, each of the three organizational levels (OSD/DSCA, the COCOMs, and the services) is represented. Responses reported here are not attributed to specific offices or individuals. Instead, as in Chapter Three, we grouped the survey respondents according to three levels: (1) OSD/DSCA, (2) COCOM, and (3) service. By grouping the respondents in this way, we were able to maintain the respondents' privacy while gaining insight into potential 1206 Program assessment roles and an understanding of where gaps might exist.

Findings and Observations

Many stakeholders plan, implement, and generally support 1206 Program projects, each contributing to the program's effectiveness. By analyzing the results of the survey, we identified key areas to help strengthen the ability of OUSD(P) to assess its security cooperation programs. In this section, we describe these key areas and provide detailed examples that give insight into current gaps.

Formal Guidance on the Assessment Process Is Needed

Among other things, assessment at all levels requires clearly articulated objectives; without formal guidance to lay out objectives, roles and responsibilities, and reporting requirements, it will be very difficult to effectively assess the 1206 Program. The 1206 Program suffers from a perception that it is a transient initiative, which may stem from its year-to-year continuation in legislation as opposed to its status as an established program of record. The short-term, gap-filling focus of the program also contributes to this perception.

The purpose of the 1206 Program is both building partnership capacity and facilitating the conduct of counterterrorism and stability operations. About half of our survey respondents saw building partner capacity as the program's primary end state, while the other half emphasized the counterterrorism mission as the program's overall goal. It would be helpful to reconcile these differing ideas about priorities, sustainability, and success prior to assessment, and the most effective mechanism for doing so will likely be formal guidance.

Another issue that stems from the lack of formal guidance is the disconnect between setting objectives and designing projects. Table 4.2 illustrates this problem. Despite a large number of respondents claiming that they design projects and activities, a remarkably small number reported setting objectives. This finding raises questions about the basis on which project designers are developing project proposals. The data suggest that OUSD(P) and DSCA are not predefining the objectives for the COCOMs, yet the COCOMs do not appear to be defining the objectives either. In the best case, as addressed in the next section, the process is simply somewhat opaque. In the worst case, there are no clear objectives being set. In either case, clear, formal guidance can help alleviate this problem.

2 See Matthias Schonlau, Ronald D. Fricker, and Marc N. Elliott, Conducting Research Surveys via E-mail and the Web, Santa Monica, Calif.: RAND Corporation, MR-1480-RC, 2002.

Table 4.2
Disconnects Between Setting Objectives and Program Design (percentage of survey respondents: OSD/DSCA, COCOMs, Services)
16. Do you design specific events or activities for this Section 1206 project?
Do you set the objectives for the specific projects?
Do you set the objectives for specific events or activities within specific projects?

Measurable Objectives That Explicitly Connect to Broader U.S. Government, Theater, Regional, and 1206 Project Goals Are Lacking

Individual 1206 Program projects are connected to regional end states under the GEF, COCOM TCPs, and each embassy's mission strategic resource plan. Country teams are required to justify these connections on the program proposal form. However, based on survey results, there appeared to be some disconnect among the survey respondents about the connection between individual programs and higher-level guidance. There also appeared to be confusion among the survey respondents about who was responsible for setting objectives for individual projects. The proposal template requires the country teams to include the project's objective (the capability shortfall to be addressed). As described earlier, the least frequently cited activity conducted by survey respondents was setting objectives for projects or specific activities. Between 0 and 12 percent of respondents claimed that they set objectives for individual projects, and only between 0 and 6 percent set such objectives for activities that were conducted in support of the projects.

Without identifying a clear source of project and activity objectives, it is difficult to assign responsibility for project and activity assessment, let alone overall program assessment. We see a manifestation of this in the number of respondents who claimed to collect data regarding how well the projects were meeting objectives. Table 4.3 illustrates this point, indicating that one-fourth to nearly two-fifths of respondents stated that they gathered such data. Not knowing where objectives are being set invites speculation. Since substantial numbers of respondents indicated that they did in fact design programs, one conclusion might be that project objectives are being drawn from other guidance documents, such as the GEF, COCOM plans and strategies, or other, perhaps service-level, strategies. This would be the best case. On the other hand, it is not possible to confirm this assumption based on the survey results.

Table 4.3
Disconnects Between Setting Objectives and Collecting Data on Their Achievement (percentage of survey respondents: OSD/DSCA, COCOMs, Services)
25. Do you gather data that reflect how well a specific event or activity met its objectives?
Do you set the objectives for the specific projects?
Do you set the objectives for specific events or activities within specific projects? 0, 6, 4
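The level-by-level percentages reported in Tables 4.2 and 4.3 amount to simple cross-tabulations of yes/no survey answers against the three stakeholder groupings. The following Python sketch illustrates, under stated assumptions, how such a tabulation could be produced; the respondent records, question keys, and answer values are illustrative placeholders, not the study's data.

```python
# Minimal sketch of tabulating yes/no survey answers by stakeholder level,
# in the spirit of Tables 4.2 and 4.3. All records and keys below are
# hypothetical illustrations, not the study's actual survey data.

LEVELS = ("OSD/DSCA", "COCOM", "Service")

# Each respondent: stakeholder level plus yes/no answers keyed by a short label.
respondents = [
    {"level": "COCOM",    "answers": {"q16_design": True,  "set_project_obj": False, "set_event_obj": False}},
    {"level": "COCOM",    "answers": {"q16_design": True,  "set_project_obj": True,  "set_event_obj": False}},
    {"level": "Service",  "answers": {"q16_design": True,  "set_project_obj": False, "set_event_obj": False}},
    {"level": "OSD/DSCA", "answers": {"q16_design": False, "set_project_obj": False, "set_event_obj": False}},
]

def percent_yes(records, level, question):
    """Share of respondents at one stakeholder level answering 'yes' to one question."""
    group = [r for r in records if r["level"] == level]
    if not group:
        return 0.0
    yes = sum(1 for r in group if r["answers"].get(question, False))
    return 100.0 * yes / len(group)

questions = ("q16_design", "set_project_obj", "set_event_obj")
for question in questions:
    row = "  ".join(f"{level}: {percent_yes(respondents, level, question):5.1f}%" for level in LEVELS)
    print(f"{question:18s} {row}")
```

Comparing the design row against the two objective-setting rows in a cross-tabulation of this kind is what surfaces the disconnect discussed above.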

Gaps Exist in Data Collection and Reporting Requirements

One of the more revealing results from the survey was that the most common data source cited by respondents, regardless of the type of data collected, was "my own observations." Stakeholders who directly observe activities and events tend to be in the best position to collect the data that will be used for assessments. It can be problematic, however, when simple observations replace formal mechanisms for collecting and routing data. In the case of the 1206 Program, it appears that data collection is mostly informal, although formal mechanisms do exist for many of the routine procedural aspects, such as budgeting, contracting, and acquisition activities.

Table 4.4 presents seven questions that helped clarify the types of data collected, as well as the stakeholder levels that collect them.

Table 4.4
Types of Assessment Data Collected by Survey Respondents (percentage of survey respondents: OSD/DSCA, COCOMs, Services)
7. Do you collect data regarding the timeliness, accuracy, or appropriateness of the assistance provided to partner nation(s)?
9. Do you collect data regarding participant views or observations, such as exit surveys?
10. Do you collect data regarding capabilities assessments?
11. Do you collect data regarding the effect of the 1206 project on relevant partner capabilities?
Do you collect data regarding the funds expended?
13. Do you collect data regarding compliance with legal or regulatory requirements related to the project?
21. Do you collect data regarding the cost of the overall project or the cost of a unit of output (i.e., one graduate, one event, etc.)?

One observation from Table 4.4 is that data collection appears to be spread out across a large number of organizations; the relatively low percentages of respondents collecting data, along with their consistency across stakeholder levels, suggest that no single office is collecting all of the data needed to conduct assessments. Two particularly noteworthy findings shown in the table reveal gaps resulting from the short-term focus of the program, as discussed earlier.

First, visibility at all levels tends to focus on day-to-day processing and implementation rather than long-term objectives or outcomes. The services, for example, appear focused on routine implementation (i.e., the FMS process), while higher levels focus on meeting administrative deadlines. Question 7, for example, had a relatively high positive response rate compared with those of all the other data collection questions, ranging from 50 to 61 percent. This question, however, has to do with data regarding timeliness and accuracy, as well as appropriateness. Timeliness and accuracy are clearly short-term output objectives, and while important from a management standpoint, they reveal little about the overall effect of the program. Similarly, data regarding legal compliance are essential, but they do little to help the program manager understand the extent to which program objectives are being met.

Second, the least understood aspects of the program included partner views, effects on partner capabilities, and the appropriateness of partner representatives participating in 1206

Program project-level activities. 3 In particular, according to the survey results, partner-nation views were not well understood. However, such an understanding is critical to determining assessment roles and responsibilities. Stakeholders at the COCOM level routinely interact with partners and will likely be in the best position to collect such data.

3 We are referring to partner-nation representatives participating in project-level activities that occur after the project has been approved and implemented. In other words, is the partner nation sending the right people for training?

To illustrate the divide between data related to short-term administrative issues and the longer-term achievement of objectives, Figures 4.1 and 4.2 depict the sources used in collecting process and implementation assessment data and outcome assessment data, respectively. There is, of course, considerable overlap between the questions asked for each assessment level; as one in a series of nested assessments, each level draws from and builds on the preceding levels. To isolate the unique data sources used in each type of assessment, however, the study team identified the survey questions pertaining to each. Three questions were directly relevant to process and implementation assessments:

Question 7: Do you collect data regarding the timeliness, accuracy, or appropriateness of the assistance provided to partner nation(s)?
Question 8: Do you collect data regarding partner-nation representatives such as attendees, participants, recipients, or numbers of individuals trained?
Question 13: Do you collect data regarding compliance with legal or regulatory requirements related to the project?

Each of these questions is also indirectly relevant to outcome assessments when aggregated over time. But the team also found that three questions were of immediate relevance to outcome assessments:

Question 9: Do you collect data regarding participant views or observations, such as exit surveys?
Question 10: Do you collect data regarding capabilities assessments?
Question 11: Do you collect data regarding the effect of the 1206 project on relevant partner capabilities?

Process and implementation assessments had the widest range of data sources of any assessment level, with 20 separate data sources identified by respondents (as shown in Figure 4.1). Of the 56 survey respondents, 34 (more than 60 percent) indicated that they already collected this type of information. Less well sourced were data regarding long-range outcomes, with only ten sources identified by respondents (as shown in Figure 4.2). Moreover, the actual collection of this type of data was much more limited than that for process and implementation. For example, of the 56 respondents, only 19 (just over one-third) indicated that they collected this type of data.

Of the 34 respondents who were collecting process and implementation data, just over half (55 percent) were also collecting outcome data. This suggests an issue that may hinder the effectiveness of assessments at both the outcome and cost-effectiveness levels. Without adequate collection of data regarding the achievement of outcome objectives, cost-effectiveness assessments, which rely on the results of each of

the other levels (because of the nested nature of the assessment hierarchy), will be difficult to conduct and will yield questionable results.

Figure 4.1
Data Sources Identified for Process and Implementation Assessments
Own observations/trip report, 15%; Letter of acceptance, 10%; Status, update, IPR, or similar briefings, 9%; After-action reports, 7%; Contract award, 7%; Bills of lading or other receipts for the delivery of goods, 6%; Project proposal, 6%; Releasability requests, 5%; Contract performance, 4%; Discussion with country desk officers, 4%; Discussion with country team members, 4%; Discussion with partner-nation representatives, 3%; Request for human rights vetting, 3%; FAA Section 505 agreement, 2%; Memoranda of agreement, 2%; Reports to Congress, 2%; Request for foreign disclosure authorization, 2%; Training reports, 2%; Travel orders, 2%; Travel orders or invitational travel orders, 2%
NOTE: Percentages do not sum to 100 due to rounding.

Figure 4.2
Data Sources Identified for Outcome Assessments
Own observations/trip report, 22%; Country desk officers, 13%; Discussion with country team members, 13%; After-action reports, 12%; Discussion with partner-nation representatives, 9%; Informal discussions, 7%; Meeting minutes/summary, 7%; U.S. military trainers, 7%; Intelligence sources, 6%; Certification to Congress, 4%
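The figures above, and the overlap statistic quoted in the preceding paragraphs (34 process-and-implementation collectors, of whom 55 percent also collect outcome data), rest on two straightforward computations: grouping respondents by which question sets they answered affirmatively, and tallying the share of citations per data source. The sketch below shows one way to reproduce both; the question-number mappings follow the text, while the respondent records and cited sources are notional examples rather than the survey's data.

```python
from collections import Counter

# Question sets as described in the text: 7, 8, and 13 map to process and
# implementation data; 9, 10, and 11 map to outcome data.
PROCESS_QS = {7, 8, 13}
OUTCOME_QS = {9, 10, 11}

# Notional respondent records (not the study's data): the questions each
# respondent answered "yes" to, and the data sources each cited.
respondents = [
    {"yes": {7, 8, 9},   "sources": ["Own observations/trip report", "After-action reports"]},
    {"yes": {13},        "sources": ["Contract award"]},
    {"yes": {9, 10, 11}, "sources": ["Own observations/trip report", "Intelligence sources"]},
    {"yes": {7},         "sources": ["Own observations/trip report"]},
]

process_collectors = [r for r in respondents if r["yes"] & PROCESS_QS]
outcome_collectors = [r for r in respondents if r["yes"] & OUTCOME_QS]
both = [r for r in process_collectors if r["yes"] & OUTCOME_QS]

print(f"Process/implementation collectors: {len(process_collectors)} of {len(respondents)}")
print(f"Outcome collectors:                {len(outcome_collectors)} of {len(respondents)}")
if process_collectors:
    share = 100.0 * len(both) / len(process_collectors)
    print(f"Process collectors also collecting outcome data: {share:.0f}%")

# Share of citations per data source, analogous to the breakdowns in Figures 4.1 and 4.2.
citations = Counter(source for r in respondents for source in r["sources"])
total = sum(citations.values())
for source, count in citations.most_common():
    print(f"{source:35s} {100.0 * count / total:4.0f}%")
```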
