Defense Logistics Agency Instruction. Information Technology (IT) Benchmarking


Defense Logistics Agency Instruction
DLAI 6605
Effective December 2, 2011
J65

Information Technology (IT) Benchmarking

References: Refer to Enclosure 1.

1. PURPOSE:

a. The purpose of this instruction is to establish a systematic, robust, and standard process to benchmark, assess, and improve DLA IT Enterprise operations.

b. The IT benchmarking capability provides DLA Information Operations (J6) with a means to better understand the diversity and span of IT operations; increase exposure to the DLA IT framework; identify best practices and processes to be shared across the IT Enterprise; increase adherence to existing policies while gaining insights to facilitate potential strategic and operational changes; and identify and evaluate risks and provide recommendations for mitigation.

c. IT benchmarking performance is measured against a number of objectives, including: management of a structured IT process knowledge library for use by the entire J6 organization; development and management of IT benchmarks, which include target practices, associated work products, and performance metrics; assessment of the J6 organizations against the IT benchmarks using a performance-based service maturity model; and management of IT service improvement opportunities and best practices throughout J6.

2. APPLICABILITY: This Instruction applies to all J6 organizations. The J6 organizations are: Information Assurance (J61); Program Executive Office (J62); Enterprise Solutions (J64); and IT Strategy, Policy and Licensing (J65). The J6 sites are: DLA Logistics Information Service; DLA Information Operations Europe & Africa; DLA Information Operations Pacific; DLA Information Operations at Columbus; DLA Information Operations at Fort Belvoir; DLA Information Operations at New Cumberland; DLA Information Operations at Philadelphia; DLA Information Operations at Richmond; DLA Information Operations at Ogden; DLA Transaction Services; DLA Document Services; and the Network Operations and Security Center (NOSC).

3. POLICY:

a. It is DLA policy that an IT benchmarking capability shall be institutionalized in order to comply with 40 USC 11315(c)(3) (Clinger-Cohen Act), which stipulates the effective and efficient design and operation of all information resources management processes, including improvements to work processes, and the monitoring of IT by evaluating performance on the basis of applicable performance measurements. Additionally, IT benchmarking complies with Department of Defense (DOD) Directive 8000.01, which requires the benchmarking of processes against models of excellence in order to develop, reengineer, simplify, or otherwise improve processes.

b. It is DLA policy that J6 organizations will be periodically assessed for compliance with Federal laws, DOD regulations, and DLA policies. Additionally, these J6 organizations will be assessed against internal IT benchmarks, which include standard processes, work products, and associated performance metrics, in order to improve DLA IT services. The execution and management of the IT benchmarking capability demonstrates compliance with the Government Performance and Results Act, which advocates recurring assessment, through objective measurement and systematic analysis, of the manner and extent to which programs achieve intended objectives.

c. In accordance with DOD Directive 5000.01, which calls for the adoption of innovative practices, including best commercial practices, J6 leadership shall leverage the results of IT benchmarking efforts to continually improve IT services. These efforts include implementation of identified IT service improvement opportunities and the adoption of identified best practices throughout J6.

d. In support of IT benchmarking events, J6 leadership shall ensure all requested information and documentation are complete, accurate, and made available within specified timeframes.

e. J6 shall establish and maintain a structured and searchable IT process knowledge library that will serve as the authoritative source for J6 IT benchmarks, processes and associated performance metrics, evidentiary artifacts, IT benchmark reports, best practice candidates, IT maturity baselines, and the IT service inventory. The library is available on eworkplace at https://eworkplace.dla.mil/sites/org/j6/j65/j652/pages/default.aspx.

f. J6 organizations shall ensure that organizational documentation and artifacts available in the IT process knowledge library are maintained and reflect current status.

g. J6 organizations shall ensure that the IT services being performed are accurately identified, categorized, and reflect the current performance metrics and target measures as documented in the IT service inventory.

h. To support service improvement efforts, J6 organizations shall review, plan, and implement the service improvement opportunities that result from IT benchmarking events.

i. J6 organizations shall support the development and implementation of identified best practices by providing information and documentation in accordance with the IT best practice candidate process described in the DLA Information Operations IT Benchmarking Standard Operating Procedure (SOP), available on eworkplace at https://eworkplace.dla.mil/sites/org/j6/j65/j652/pages/default.aspx.
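
The service inventory described in paragraphs 3.e through 3.g pairs each IT service with its current performance metrics and its documented target measures. As a minimal, non-authoritative sketch of how such an entry might be modeled, the Python below compares current metrics against targets; the class, field names, metric names, and example values are all hypothetical and are not prescribed by this Instruction or the SOP.

```python
from dataclasses import dataclass, field

# Illustrative only: nothing here is prescribed by DLAI 6605 or the SOP.
@dataclass
class ServiceInventoryEntry:
    service_name: str                 # IT service as identified by the organization
    category: str                     # service category assigned in the inventory
    owner_org: str                    # responsible J6 organization, e.g. "J64"
    metrics: dict = field(default_factory=dict)   # current performance metrics
    targets: dict = field(default_factory=dict)   # documented target measures

    def gaps(self) -> dict:
        """Return target-minus-current for every metric that has a target."""
        return {name: self.targets[name] - self.metrics.get(name, 0.0)
                for name in self.targets}

# Usage: flag any service whose current performance falls short of its target.
entry = ServiceInventoryEntry(
    service_name="Service Desk",
    category="End User Services",
    owner_org="J64",
    metrics={"first_call_resolution_pct": 68.0},
    targets={"first_call_resolution_pct": 75.0},
)
shortfalls = {name: gap for name, gap in entry.gaps().items() if gap > 0}
print(shortfalls)  # {'first_call_resolution_pct': 7.0}
```

Keeping metrics and targets side by side in one record is what lets a benchmarking event compute gaps mechanically rather than by manual comparison.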

4. RESPONSIBILITIES:

a. Performance Optimization (J652). Provides policy, guidance, and oversight functions in support of the performance and process improvement objectives established for J6. Maintains overall responsibility for the IT benchmarking process and policy within J6 and is responsible for facilitating IT benchmarking efforts throughout J6. Identifies the J6 organization(s) that will be the subject of an IT benchmarking effort. Plans and executes all IT benchmarking events. Establishes an IT maturity baseline for the subject J6 organization, from which improvement will be measured using subsequent IT benchmarking results. Actively manages the IT service improvement opportunities and best practice candidates resulting from IT benchmarking efforts. Maintains the IT process knowledge library functionality, the IT Benchmarking SOP, and this IT Benchmarking Instruction (DLAI 6605).

b. Organization Director. Provides a primary point of contact (POC) with whom the IT Benchmarking Team can interface. Ensures full support of the IT benchmarking event by J6 organization staff members and participates in scheduled IT benchmarking briefings. Ensures assigned IT service improvement opportunities are addressed and resolved appropriately. Ensures thorough and timely submission of information pertaining to best practice candidates originating with the subject J6 organization. Ensures organizational documentation and artifacts available via the IT process knowledge library are maintained as current.

c. Organization POC. Ensures the pre-assessment questionnaire is completed and returned to the IT Benchmarking Team by the requested completion date. Provides the IT Benchmarking Team with all requested information, documentation, and logistical support as required during each phase of the IT benchmarking event.

d. Questionnaire Respondent(s). Responsible for providing information, documentation, and evidentiary artifacts for their assigned questions within the pre-assessment questionnaire. Participates in scheduled interviews during which J652 solicits answers to the follow-up questions included in the IT benchmarking visit interview guide, and arrives at each interview fully prepared with appropriate responses.

5. PROCEDURES: The IT benchmarking process involves three phases: pre-assessment, assessment, and post-assessment. Each phase consists of a number of supporting activities or sub-processes. For detailed information concerning IT benchmarking procedures, refer to the DLA Information Operations IT Benchmarking SOP, available on eworkplace at https://eworkplace.dla.mil/sites/org/j6/j65/j652/pages/default.aspx.

a. Pre-Assessment Phase: J652 develops a plan of action with associated target completion dates for the tasks involved in all three phases of the IT benchmarking event. J652 communicates the dates associated with the IT benchmarking effort to the Organization Director, requests that a pre-assessment questionnaire be completed, and asks that an Organization POC be identified. After the Organization POC submits the completed pre-assessment questionnaire, J652 evaluates the responses and associated documentation to identify those responses that will require follow-up during the IT benchmarking visit. J652 then develops an interview guide, based upon the outcome of the pre-assessment evaluation, and provides this guide to the Organization POC.

b. Assessment Phase: This phase consists of the IT benchmarking visit to, or video teleconference (VTC) with, the subject J6 organization. The visit normally lasts 3 1/2 days. It begins with an in-brief delivered by J652. The majority of the time is devoted to completing interviews with the Questionnaire Respondents identified in advance. Salient discussion points from these interview sessions are recorded and may become the basis for official observations included in the outbrief and final report. The visit concludes with an outbrief presentation, delivered by J652, outlining the observations made during the IT benchmarking visit.

c. Post-Assessment Phase: J652 prepares a draft IT benchmark report containing all observations captured throughout all phases of the IT benchmarking effort. The draft report is provided to the Organization Director for review and comment. J652 evaluates all comments received and modifies the report as appropriate before issuing the final report. After the final report is issued, J652 conducts an in-process review with J6 senior leadership, presenting significant outcomes associated with the IT benchmarking event. These outcomes include all IT service improvement opportunities and best practice candidates identified during the IT benchmarking effort, as well as a comprehensive IT service maturity baseline for the organization. The J652 staff actively manages all IT service improvement opportunities through closure. Similarly, J652 subsequently manages the approval and coordination of best practice candidates throughout J6.

6. RELEASABILITY: UNCLASSIFIED. For Public Release.

7. EFFECTIVE DATE: This Instruction is effective immediately.

Director, DLA Strategic Plans and Policy
December 2, 2011

2 Enclosures
Enclosure 1 - References
Enclosure 2 - IT Benchmarking Process Flow

Enclosure 1 - References

1. This DLA Instruction supersedes DLA Instruction 6605, IT Performance Assessment, dated June 20, 2005.
2. 40 USC Section 11315(c)(3) (Clinger-Cohen Act of 1996)
3. Government Performance and Results Act of 1993 (GPRA)
4. DOD Directive 8000.01, Management of the Department of Defense Information Enterprise, February 10, 2009
5. DOD Directive 5000.01, The Defense Acquisition System, certified current November 20, 2007
6. DLA Information Operations IT Benchmarking SOP, December 13, 2010

Enclosure 2 - IT Benchmarking Process Flow
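
The process flow diagram that constitutes this enclosure is not reproduced in this transcription. As a hedged illustration of the flow it depicts per section 5 — pre-assessment, assessment, post-assessment, with J652 managing improvement opportunities through closure — the Python sketch below models the phase transitions. The phase names come from the Instruction; the class, method names, and transition rules are assumptions for illustration only.

```python
from enum import Enum, auto

# Illustrative only: phase names are from section 5 of DLAI 6605;
# everything else here is an assumption, not part of the Instruction.
class Phase(Enum):
    PRE_ASSESSMENT = auto()
    ASSESSMENT = auto()
    POST_ASSESSMENT = auto()
    CLOSED = auto()

_NEXT_PHASE = {
    Phase.PRE_ASSESSMENT: Phase.ASSESSMENT,
    Phase.ASSESSMENT: Phase.POST_ASSESSMENT,
    Phase.POST_ASSESSMENT: Phase.CLOSED,
}

class BenchmarkingEvent:
    """One IT benchmarking event tracked through its three phases, with the
    improvement opportunities that J652 manages through closure."""

    def __init__(self, organization: str):
        self.organization = organization
        self.phase = Phase.PRE_ASSESSMENT
        self.open_opportunities = []      # identified, not yet resolved
        self.closed_opportunities = []    # resolved and verified

    def add_opportunity(self, description: str) -> None:
        self.open_opportunities.append(description)

    def close_opportunity(self, description: str) -> None:
        # Raises ValueError if the opportunity was never recorded.
        self.open_opportunities.remove(description)
        self.closed_opportunities.append(description)

    def advance(self) -> None:
        """Move to the next phase; the event cannot leave post-assessment
        until every improvement opportunity has been closed."""
        if self.phase is Phase.CLOSED:
            raise ValueError("event is already closed")
        if self.phase is Phase.POST_ASSESSMENT and self.open_opportunities:
            raise ValueError("open improvement opportunities remain")
        self.phase = _NEXT_PHASE[self.phase]

# Usage sketch
event = BenchmarkingEvent("DLA Information Operations at Ogden")
event.advance()                                   # pre-assessment -> assessment
event.add_opportunity("Document change-control process")
event.advance()                                   # assessment -> post-assessment
event.close_opportunity("Document change-control process")
event.advance()                                   # post-assessment -> closed
```

The guard in advance() encodes the "managed through closure" requirement of paragraph 5.c: an event cannot be considered finished while any improvement opportunity remains open.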