Evaluating Agile Effectiveness (AE) of Organizations and Programs


An Integrated Computer Solutions, Inc. whitepaper on considerations for evaluating Agile Effectiveness
Thomas Brazil, Chief Digital Officer, Integrated Computer Solutions
Dr. Benjamin Willett, Lead Software Engineer, Integrated Computer Solutions
30 June 2017

Introduction

Agile software development is a methodology that emphasizes cross-functional teams interacting in close collaboration to produce working software. On occasion it may seem disorganized and haphazard, making an effective evaluation of an offeror's Agile effectiveness (AE) difficult. In this paper, we provide clear prescriptions for the kinds of information evaluators should seek from proposals, and for how that information can be used to determine an offeror's AE.

Evaluating Organizational Culture

When discussing how to evaluate AE, the technical aspects of the organization would seem to take precedence over human factors such as culture. Quite the opposite is true, as we can see from the Agile Manifesto [1]:

Individuals and interactions over processes and tools
Working software over comprehensive documentation
Customer collaboration over contract negotiation
Responding to change over following a plan

Only one of the preferred items ("working software") involves the technical aspects. The remainder are human concerns, and as such, we must begin with organizational culture.

Organizational Certifications

The two organizational certifications most beneficial to Agile programs are CMMI for Development and ISO 9001.

CMMI/Dev

The Capability Maturity Model Integration for Development (CMMI/Dev) is "a collection of best practices that help organizations improve their processes" [2]. It provides "a comprehensive integrated set of guidelines for developing products and services" [2]. CMMI/Dev focuses an organization on a set of process areas that are developed and tuned to increase the value delivered to the customer. The five maturity levels of CMMI/Dev are: 1 (Initial), 2 (Managed), 3 (Defined), 4 (Quantitatively Managed), and 5 (Optimizing). The differences in CMMI/Dev process areas across maturity levels are shown in Table 1.

Maturity Level 2: Project Planning; Project Monitoring and Control; Configuration Management; Measurement and Analysis; Process and Product Quality Assurance

Maturity Level 3 adds: Organizational Process Definition; Organizational Process Focus; Organizational Training; Integrated Project Management; Risk Management; Decision Analysis and Resolution; Requirements Development; Technical Solution; Product Integration; Validation; Verification

Maturity Level 4 adds: Organizational Process Performance; Quantitative Project Management

Maturity Level 5 adds: Organizational Performance Management; Causal Analysis and Resolution

Table 1. CMMI/Dev process areas by maturity level (levels are cumulative; each level includes the process areas of the levels below it).

As the table shows, the additional investment in moving from maturity level 3 to maturity level 4 is the addition of two process areas, and likewise in moving from level 4 to level 5. As can be seen in Figure 1 from the CMMI Institute, from 2007 through 2016, 68% of all CMMI-appraised organizations were appraised at maturity level 3 [3]. Due to the various DoD software development processes that explicitly require Verification and Validation (V&V) (e.g., DoDI 5000.61, DISA SERM), as well as the preponderance of appraised organizations at that level, CMMI maturity level 3 should be the DoD's preferred appraisal level. Additionally, evaluators should look for how programs integrated Agile with their CMMI/Dev practices by explicitly requiring them to describe which process areas would be involved in solving a hypothetical Agile problem.

Figure 1. Process maturity profile of CMMI-appraised organizations, 2007-2016 (CMMI Institute). [3]
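Table 1's cumulative structure lends itself to a simple checklist that an evaluator can script when reviewing a proposal. The sketch below is a minimal illustration in Python; the proposal excerpt and the coverage check are hypothetical and are not part of the CMMI model itself. It encodes the maturity-level-to-process-area mapping from Table 1 and lists the process areas a proposal never mentions for its claimed level.

# Minimal sketch: flag CMMI/Dev process areas (Table 1) that a proposal
# claiming a given maturity level never mentions. Illustrative only.

PROCESS_AREAS = {
    2: ["Project Planning", "Project Monitoring and Control",
        "Configuration Management", "Measurement and Analysis",
        "Process and Product Quality Assurance"],
    3: ["Organizational Process Definition", "Organizational Process Focus",
        "Organizational Training", "Integrated Project Management",
        "Risk Management", "Decision Analysis and Resolution",
        "Requirements Development", "Technical Solution",
        "Product Integration", "Validation", "Verification"],
    4: ["Organizational Process Performance", "Quantitative Project Management"],
    5: ["Organizational Performance Management", "Causal Analysis and Resolution"],
}

def required_areas(maturity_level: int) -> list[str]:
    """Maturity levels are cumulative: level N requires all areas for levels 2..N."""
    return [pa for level in range(2, maturity_level + 1)
            for pa in PROCESS_AREAS.get(level, [])]

def unaddressed_areas(proposal_text: str, claimed_level: int) -> list[str]:
    """Return process areas the proposal never names for its claimed level."""
    text = proposal_text.lower()
    return [pa for pa in required_areas(claimed_level) if pa.lower() not in text]

if __name__ == "__main__":
    excerpt = ("Our teams integrate Scrum ceremonies with Project Planning, "
               "Risk Management, and Measurement and Analysis activities...")
    print(unaddressed_areas(excerpt, claimed_level=3))

A listing like this is only a screen, not a judgment of maturity; its value is in prompting the evaluator to ask the offeror about the process areas that go unmentioned.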

ISO 9001:2015

The quality management standard ISO 9001 [4] is based on principles described in ISO 9000. These principles are:

customer focus
leadership
engagement of people
process approach
improvement
evidence-based decision making
relationship management

ISO 9001 uses the Deming Cycle, or Plan, Do, Check, Act (PDCA) cycle, to ensure an organizational commitment to quality and organizational improvement:

Figure 2. The Plan, Do, Check, Act (PDCA) cycle.

ISO 9001 is perfectly suited to help ensure high-quality, consistent outcomes for customers of certified services organizations, such as those that deliver Agile development services, especially when innovative practices are involved. Clause 4 of the standard focuses on risk-based thinking, which helps offerors address opportunities and threats in today's fast-changing technical environment. Clause 5 addresses setting objectives compatible with the strategic direction and context of the organization to ensure a customer focus, and Clause 6 addresses innovation through its focus on continual improvement. Accordingly, it is our belief that ISO 9001 will grow in importance for both industry and government services contracting, as it ensures that the contracting services company has a dedication to quality and consistent outcomes across the entire organization, including leadership. ISO 9001 should be a required DoD standard for ensuring that the government and the taxpayer receive high-quality services and deliverables in a consistent manner. At a minimum, proposal evaluators should consider ISO 9001 certification when evaluating Agile services proposals, as it increases the likelihood of acceptable to superior performance from awardees.

Organizational Metrics

With the CMMI/Dev process areas outlined in Table 1, it becomes apparent how crucial metrics are to the successful implementation of Agile by a program. The Measurement and Analysis process area "develop[s] and sustain[s] a measurement capability used to support management information needs" [2]. An evaluator has two fundamental concerns when assessing a proposal: what did you measure, and does the evidence you provide for improvements align with what was measured? Evaluators must take great care to understand the metrics measured by an offeror, and whether those metrics actually justify the AE that the offeror claims.

For example, the following scenario demonstrates metrics that seem to show improvement but in fact mask unrelated causes. Suppose a program reported a 150% increase in development throughput during the base year of a contract, quantified by lines of code (LOC). The offeror clearly must provide the mechanism by which LOC was measured, be it a code scanner, an automated process, or manual review. Without this mechanism, the claimed improvement should be regarded as suspect. More fundamentally, even if the offeror does provide a mechanism, the reason for the metric improvement must be clearly understood. Legitimate reasons for improvement include bona fide developer improvement (developers getting better with a language), operational improvements so that developers are not constantly pulled away from projects, and improved testing performance so that developers are not fixing old bugs. Illegitimate examples include staff augmentation (adding one more developer to a two-developer team, at greater cost) or scope reduction; these conceal the true origin of the improvement. (A minimal normalization sketch appears below, at the end of this section.)

Organizational Framework

One aspect of the technical solution that reaches the organizational level is the organization-wide enablement of technical efficiency: how the offeror provides infrastructure and processes that can be utilized by, and tailored to, customer needs. An evaluator should examine the infrastructure available to the offeror, and its reliability, integrity, and adaptability. Ask the offeror: Have they implemented robust infrastructure and processes? Would a failure of the offeror's infrastructure impair the mission? Does the offeror include security and insider-threat concerns in its infrastructure and policies? Is the offeror forward-looking; does it adopt new technologies with ease?

Agile Team Structure & Incentives

Organizational structure plays a crucial part in the effective implementation and utilization of Agile. Organizations can be broadly divided into two categories: hierarchical and cooperative. According to Koch, in hierarchical organizations communication paths "are clearly defined and restrictive" and "communication across the hierarchy, or skipping over levels of the hierarchy, is generally seen as a breach of etiquette and may even be considered insubordination" [1]. By contrast, in cooperative organizations, "work patterns and communications [...] are fluid" [1]. While it may seem that the cooperative style is more conducive to Agile development, because Agile contains both hierarchical and cooperative components, "adopting an Agile method could require your organization to stretch in either the hierarchical or the cooperative direction, or possibly in both" [1].

Evaluators should examine whether the offeror has identified its specific organizational structure, and how that structure changed with the adoption of Agile. Other team factors, such as size, colocation, criticality of projects, turnover, and multiple-team implementations, should also be examined [1].

At the heart of Agile work is the development team itself. Agile teams are intended to be self-organizing, with little direct involvement from management except when blocker removal is required. Teams organize themselves in a variety of ways and use differing methodologies, including but not limited to Scrum, Extreme Programming (XP), and pair programming. Evaluators should ask the offeror to clearly describe which methodology was used for team organization, how that methodology interacted with management and the customer, and what benefit was derived.

Finally, an evaluator should examine the incentives of Agile team members. Agile, by construction, is a dynamic organizational process that provides great fluidity and opportunity for technical professionals. This may be hard for teams and management to adapt to: "The adoption of an Agile method will significantly alter the behaviors required by your employees" [1]. An evaluator should be cognizant of how the offeror identified favorable and unfavorable behaviors, and how the incentive package and reward system was tailored to maximize team productivity. This can be examined, for example, through the dynamics of new teams or of teams with turnover: observing whether the incentive structure is correlated with improvements in performance provides evidence of the incentive package's effectiveness.

Approach to Innovation

It often occurs during the period of performance that innovative work must be done to satisfy the customer's requirements. Offerors with robust and formal approaches to innovation are more capable of delivering innovative solutions than those without. Organizations with robust innovation programs continuously look for ways to reduce cost, increase quality and functionality, and increase velocity in a manner that is aligned with the strategic objectives of the customers they serve. This requires understanding the strategic objectives of their customers; knowing how to build and balance their innovation portfolio according to those objectives and the types of innovation involved; performing research efficiently in a way that eliminates non-intelligent innovation failures; generating insight using one of several innovation ideation methodologies; developing prototypes using a rigorous stage-gate process that ensures efficiency and reduces financial risk; and measuring and assessing their efforts to ensure continual improvement.

An evaluator should examine the offeror's approach to innovation from both a cultural and an experiential frame of reference. How does the approach classify different innovative projects? Does the offeror measure its innovation effectiveness and feed the results back to improve the process? What examples can the offeror give of innovation in Agile programs that increased efficiency for the team and reduced costs for the customer? Has the offeror integrated Agile into its approach to innovation? [5]
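Before turning to past performance, the throughput example from the Organizational Metrics subsection can be made concrete. The sketch below is a minimal illustration in Python; the record fields and figures are hypothetical and are not drawn from any actual program. It normalizes a reported lines-of-code gain by team size so that staff augmentation is not mistaken for genuine improvement; analogous normalizations can be applied for scope, defect rework, and similar confounders.

# Minimal sketch: normalize a reported LOC throughput gain by team size
# before accepting it as evidence of Agile effectiveness.
# All field names and numbers are hypothetical.

from dataclasses import dataclass

@dataclass
class PeriodReport:
    loc_delivered: int   # lines of code delivered in the reporting period
    developers: int      # full-time-equivalent developers billed to the team

def percent_gain(new: float, old: float) -> float:
    return 100.0 * (new - old) / old

def throughput_gains(before: PeriodReport, after: PeriodReport) -> dict:
    """Compare the raw LOC gain with the gain after normalizing by team size."""
    return {
        "raw_loc_gain_pct": percent_gain(after.loc_delivered, before.loc_delivered),
        "per_developer_gain_pct": percent_gain(
            after.loc_delivered / after.developers,
            before.loc_delivered / before.developers),
    }

if __name__ == "__main__":
    base = PeriodReport(loc_delivered=40_000, developers=2)
    option_year = PeriodReport(loc_delivered=100_000, developers=3)
    # The headline 150% gain drops to roughly 67% per developer; the evaluator
    # should ask the offeror to account for the difference.
    print(throughput_gains(base, option_year))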

Evaluating Past Performance

With the organizational culture defined, the second aspect of evaluating Agile programs is examining the organization's past performance record. Evaluators should insist that past performance be supported by documented evidence including, but not limited to, CPARS, contract deliverables, and status reports (monthly, quarterly, etc.).

Number and Types of Agile Programs

The size and scope of an organization's Agile efforts demonstrate the maturity of its processes and procedures. An organization with a central framework for managing hundreds of simultaneous Agile development projects has more capability than an ad hoc Scrum team. Evaluators should obtain information on the size and scope of an offeror's Agile programs, the contract types under which they were executed (e.g., firm-fixed-price, cost-plus-fixed-fee), and evidence that the programs were managed effectively given those contract types. From a technical perspective, evaluators should examine the number and types of computing environments managed by the offeror (e.g., development, test, production), as well as the infrastructure types (e.g., on-premises, Government cloud, commercial cloud, hybrid).

While the Agile process and team structure are important in determining the efficiency and effectiveness of Agile execution, ultimately the most important component is delivery of capability. Evaluators should obtain evidence of major capability deliveries, how they were conducted within the Agile framework, and the lessons learned by the offeror: Was our Agile approach effective? Did we have to deliver partial capability? Were there any external hindrances? Evaluators should examine how the offeror performed in the following domains: schedule, budget, quality, cycle time, productivity, and volatility, utilizing the organizational metrics defined and highlighted above. The offeror's metrics should be sufficient and consistent enough to report on all of these categories.

From a cyber security perspective, DoD is in the midst of migrating its information assurance framework from DIACAP to RMF. Experience with RMF (e.g., obtaining ATOs, categorizing systems, complying with controls) carries a premium in today's environment, particularly with the ever-increasing risk of cyber attacks. Evaluators should obtain and validate evidence of the offeror's experience with RMF.

Evaluating Technical Methodologies

With organizational culture and past performance identified and analyzed, the final step is to determine the organization's technical capabilities. The details of the technical solution are unique to each problem; however, a robust technical framework allows an organization to deliver many different technical approaches very effectively.

Tailoring of Technical Framework

This is accomplished by borrowing the notion of a tailored approach from CMMI. Where CMMI tailors processes and procedures to customer needs, we can do the same for the technical approach. Evaluators should obtain detailed descriptions of the offeror's core technical competencies and of how those competencies were tailored to customer needs. Additionally, as requirements change, the offeror should be able to demonstrate how the tailored solution adapted. Most evaluation processes place a heavy emphasis on tools and programming languages, to ensure that the technical stack and labor expertise are aligned with the requirements. Evaluators should be familiar with alternative tools and languages that are similar enough to the need to enable easy adaptation, but should also require the offeror to explain how such adaptation will be achieved in a timely manner.

Automation and Configuration Management

Automation of technical processes increases speed and efficiency and decreases human-induced errors. Automated technical processes (code builds, installations, upgrades, provisioning, etc.) allow organizations to perform more consistent work with less capital investment. Evaluators should determine the offeror's automation capabilities, its ability to tailor those capabilities, and evidence of how those capabilities benefited previous customers. In a similar way, configuration management (CM) provides consistency between disparate components of the information system and likewise decreases human-induced errors. There are many automated CM tools available, and an entire CMMI/Dev process area is focused on CM. Evaluators should determine the offeror's preferred CM tools, obtain evidence of their effective use, and insist on documented evidence of effectiveness arising from CMMI CM audits.

Risk Determination

One of the most critical considerations, and one that has an entire CMMI/Dev process area devoted to it, is risk. Risks come in many forms; in particular, "risk management should consider both internal and external, as well as both technical and non-technical, sources of cost, schedule, performance, and other risks" [2]. Risks can be assessed both qualitatively and quantitatively, depending on the particular nature of the risk and the process areas it interacts with. According to CMMI/Dev, risk management is divided into: defining a risk management strategy, identifying and analyzing risks, and handling identified risks (risk mitigation plans) [2]. The strategy can be defined at the organizational level; however, the strategy, the identification, and the handling of risks should always be tailorable and tailored to the specific project. An evaluator should examine how the risk management activities were tailored. To provide objective analysis and insight to the customer, risks should be quantified whenever possible, since risk analysis grounded in quantitative metrics is less subject to bias. An evaluator should examine the specific risk metrics used and determine their objectivity.
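As one illustration of the kind of quantitative risk metric an evaluator might look for, the minimal sketch below (Python; the risk register entries and thresholds are hypothetical, and the expected-loss calculation is a generic formula rather than anything prescribed by CMMI/Dev) ranks risks by exposure, computed as probability times cost impact, and applies a coarse screen for whether a mitigation plan is worth its cost.

# Minimal sketch: quantify risks as exposure = probability x cost impact and
# rank them, so mitigation priorities rest on numbers rather than judgment alone.
# The register entries are hypothetical.

from dataclasses import dataclass

@dataclass
class Risk:
    name: str
    probability: float      # likelihood of occurrence, 0.0-1.0
    cost_impact: float      # estimated cost if the risk is realized ($)
    mitigation_cost: float  # estimated cost of the mitigation plan ($)

    @property
    def exposure(self) -> float:
        return self.probability * self.cost_impact

    @property
    def mitigation_worthwhile(self) -> bool:
        # Coarse screen: mitigate when expected loss exceeds mitigation cost.
        return self.exposure > self.mitigation_cost

REGISTER = [
    Risk("Key developer turnover", probability=0.30, cost_impact=250_000, mitigation_cost=40_000),
    Risk("Cloud ATO delay", probability=0.15, cost_impact=600_000, mitigation_cost=120_000),
    Risk("Third-party API deprecation", probability=0.05, cost_impact=80_000, mitigation_cost=15_000),
]

if __name__ == "__main__":
    for risk in sorted(REGISTER, key=lambda r: r.exposure, reverse=True):
        print(f"{risk.name}: exposure=${risk.exposure:,.0f} "
              f"mitigate={'yes' if risk.mitigation_worthwhile else 'defer'}")

An evaluator reviewing such a register would then probe how the probabilities and impacts were estimated, since the objectivity of the metric depends on the quality of those inputs.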

Lessons Learned

Ultimately, even with a robust CMMI infrastructure, automated tools, and thorough audits of capability, issues will arise that require innovative approaches to resolve. Evaluators should ask offerors to describe incidents in which results were not what was anticipated, the steps the offeror took to remedy the situation, and how those steps were incorporated into an organizational process to prevent or mitigate such incidents in the future. An offeror's ability to respond dynamically to change and adversity is a hallmark of the Agile process, and as such, descriptions of adverse situations can be very insightful in the evaluation process.

Sources

[1] Koch, Alan S. Agile Software Development: Evaluating Methods for Your Organization. Artech House, 2005.
[2] CMMI for Development, Version 1.3. Software Engineering Institute, 2010. https://resources.sei.cmu.edu/asset_files/technicalreport/2010_005_001_15287.pdf
[3] CMMI Institute. Maturity Profile (appraisals through December 31, 2016). http://partners.cmmiinstitute.com/wp-content/uploads/2017/04/maturity-profile-ending-in-dec-31-2016.pdf
[4] International Organization for Standardization. ISO 9001:2015, Quality Management Systems - Requirements. Fifth edition, 2015.
[5] Morris, Langdon. Agile Innovation: The Revolutionary Approach to Accelerate Success, Inspire Engagement, and Ignite Creativity. Wiley, 2014.