Driving Out Technical Risk by Blending Architecture, Process, and Project Discipline


Driving Out Technical Risk by Blending Architecture, Process, and Project Discipline Software Engineering Institute Carnegie Mellon University Pittsburgh, PA 15213 James McHale, Robert Nord In collaboration with: Luis Carballo (Bursatec)

Report Documentation Page (Form Approved OMB No. 0704-0188)
Report date: May 2012. Dates covered: 2012. Title and subtitle: Architecture, Process, and Project Discipline. Performing organization: Carnegie Mellon University, Software Engineering Institute, Pittsburgh, PA 15213. Distribution/availability statement: Approved for public release; distribution unlimited. Supplementary notes: SEI Architecture Technology User Network Conference (SATURN 2012), May 7-11, 2012, St. Petersburg, FL. Security classification: unclassified. Number of pages: 28.
Abstract: The SEI recently worked with Bursatec, the IT arm of the Mexican stock exchange, to help build an enhanced online financial trading engine. The end product had aggressive goals for performance and delivery, and it had to function flawlessly. Factors complicating a solution included scope outside the organization's recent experience, combining key technologies in new ways, and a constant stream of new requirements: challenges familiar to many in industry. To manage these challenges while instilling disciplined but nimble engineering practices, the SEI employed its Team Software Process with architecture-centric engineering to set an integrated architecture/developer team in motion. The blend of technical, process, and project-management discipline was used to systematically address technical risk. This collaborative approach helps organizations set an architecture/developer team in motion using mature, disciplined engineering practices that produce quality software quickly.

Copyright 2012 Carnegie Mellon University This material is based upon work funded and supported by the Department of Defense under Contract No. FA8721-05-C-0003 with Carnegie Mellon University for the operation of the Software Engineering Institute, a federally funded research and development center. NO WARRANTY THIS CARNEGIE MELLON UNIVERSITY AND SOFTWARE ENGINEERING INSTITUTE MATERIAL IS FURNISHED ON AN AS-IS BASIS. CARNEGIE MELLON UNIVERSITY MAKES NO WARRANTIES OF ANY KIND, EITHER EXPRESSED OR IMPLIED, AS TO ANY MATTER INCLUDING, BUT NOT LIMITED TO, WARRANTY OF FITNESS FOR PURPOSE OR MERCHANTABILITY, EXCLUSIVITY, OR RESULTS OBTAINED FROM USE OF THE MATERIAL. CARNEGIE MELLON UNIVERSITY DOES NOT MAKE ANY WARRANTY OF ANY KIND WITH RESPECT TO FREEDOM FROM PATENT, TRADEMARK, OR COPYRIGHT INFRINGEMENT. This material has been approved for public release and unlimited distribution except as restricted below. The Government of the United States has a royalty-free government-purpose license to use, duplicate, or disclose the work, in whole or in part and in any manner, and to have or permit others to do so, for government purposes pursuant to the copyright license under the clause at 252.227-7013 and 252.227-7013 Alternate I. Internal use:* Permission to reproduce this material and to prepare derivative works from this material for internal use is granted, provided the copyright and No Warranty statements are included with all reproductions and derivative works. External use:* This material may be reproduced in its entirety, without modification, and freely distributed in written or electronic form without requesting formal permission. Permission is required for any other external and/or commercial use. Requests for permission should be directed to the Software Engineering Institute at permission@sei.cmu.edu. * These restrictions do not apply to U.S. government entities. 2

A New Trading Platform
Mr. Téllez [BMV president] said that the new system's initial configuration will be able to handle 200,000 transactions per second and take less than 100 microseconds to process a single order. That catapults the BMV into the same league, in terms of latency, as large exchanges such as Nasdaq OMX, NYSE Euronext, Deutsche Börse, and the London Stock Exchange. 3

Experience and Results
Architecture coaching, coupled with the discipline of the Team Software Process, quickly built a competent architecture team and an architecture that passed evaluation in less than six months. The project objectives were met.
- Schedule: finished on time.
- Quality: early trials and quality metrics suggest the reliability and quality goals were met. No known defects carried into the final cycle!
- Performance: a day's worth of transactions can be processed in seconds. 4

The Opportunity Background: Bolsa Mexicana de Valores (BMV) operates the Mexican financial markets. Bursatec is the technology arm of BMV. BMV desired a new trading engine to replace the existing stock market engine and integrate the options and futures markets. Bursatec committed to deliver a trading engine in 8-10 quarters. Bursatec approached the SEI for support during design and development to improve its software delivery capability. 5

The Project -1
Business Goals:
- High performance (as fast as or faster than anything out there)
- Reliable and of high quality (the market cannot go down)
- Scalable (able to handle spikes and long-term growth in trading volume) 6

The Project -2
Architecture Decisions:
- Development in Java (lower total cost of ownership)
- Low-latency communication over a multicast network
- In-memory data storage during the trading session
- Hot-hot high-availability configuration
- Parallel processing in the Java Virtual Machine (JVM)
- Horizontal scalability
Functional Requirements:
- Order routing with the FIX protocol, interconnecting to current legacy systems
- Combined cash and derivatives markets with a single Control Workstation
- Separate market data and index calculation system 7
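
As a rough illustration of the "in-memory data storage during the trading session" decision, an order store kept entirely on the JVM heap avoids database round-trips on the hot path. This is a minimal sketch under stated assumptions, not code from the Bursatec system; the class and field names (`Order`, `InMemoryOrderStore`) are hypothetical.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.AtomicLong;

// Hypothetical order record; fixed-point price avoids floating-point rounding.
final class Order {
    final long id;
    final String symbol;
    final long priceMicros;  // price in millionths of a currency unit
    final int quantity;
    Order(long id, String symbol, long priceMicros, int quantity) {
        this.id = id;
        this.symbol = symbol;
        this.priceMicros = priceMicros;
        this.quantity = quantity;
    }
}

// Session-lifetime store held entirely in memory; a concurrent map permits
// parallel access from multiple JVM threads, echoing the parallel-processing
// decision above.
final class InMemoryOrderStore {
    private final Map<Long, Order> orders = new ConcurrentHashMap<>();
    private final AtomicLong nextId = new AtomicLong(1);

    long add(String symbol, long priceMicros, int quantity) {
        long id = nextId.getAndIncrement();
        orders.put(id, new Order(id, symbol, priceMicros, quantity));
        return id;
    }

    Order get(long id) { return orders.get(id); }

    int size() { return orders.size(); }
}
```

A real engine would pair such a store with replication to the hot standby; here the point is only that session state never leaves memory during trading.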

Trading Engine Quality and Other Attributes
Quality Attributes:
- Under 1 ms processing latency
- Horizontal scalability
- Redundant high-availability system (warm dual redundant)
- Automatic testing framework (one-day turnaround)
- Business-rule changes localized in specific modules
Other Attributes:
- Backward compatible with current systems
- Combined platform for both markets
- Runs on commodity hardware
- 86 order type/attribute combinations (30 in the current system)
- Real-time updates to system status via the Control Workstation 8
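
A latency budget like "under 1 ms processing latency" is usually judged against tail percentiles rather than averages, since a fast mean can hide slow outliers. The following is an illustrative sketch only, not the project's actual testing framework; the names `LatencyCheck`, `percentileNanos`, and `meetsBudget` are hypothetical.

```java
import java.util.Arrays;

// Hypothetical helper for checking per-order latency samples (in nanoseconds)
// against a budget, using the 99th percentile of the observed distribution.
final class LatencyCheck {
    // Nearest-rank percentile of the sample set.
    static long percentileNanos(long[] samplesNanos, double pct) {
        long[] sorted = samplesNanos.clone();
        Arrays.sort(sorted);
        int idx = (int) Math.ceil(pct / 100.0 * sorted.length) - 1;
        return sorted[Math.max(idx, 0)];
    }

    // A common discipline: the tail, not the mean, must fit the budget.
    static boolean meetsBudget(long[] samplesNanos, long budgetNanos) {
        return percentileNanos(samplesNanos, 99.0) <= budgetNanos;
    }
}
```

For example, samples of 0.1 ms, 0.2 ms, and 0.9 ms all fit a 1 ms budget, but the same set fails a 0.5 ms budget because of the 0.9 ms tail sample.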

System Context of the Trading Engine
[Context diagram: trading engine on a multicast network, horizontally scalable, connected to legacy systems through message translation, deployed in a high-availability configuration] 9

Complicating Factors
Given the context, one would expect risks due to:
- Large project scope beyond the organization's recent experience (person-months, KLOC/function points, number of interconnecting platforms, number of individual projects)
- Inexperience: available staff talented but young, and key implementation technologies never before used together formally
- A constant stream of new requirements and changes to business rules 10

Solution Integrates High-Value Architecture and Team Practices
Team Software Process:
- Proven technology
- Management and measurement across the project lifecycle
- Building high-performance teams
- Key managers familiar with the technology through word of mouth and the literature
Architecture-Centric Engineering:
- Proven technology
- Technical aspect of early project lifecycle activities
- Architecting to meet business objectives
- Key managers familiar with the technology via training courses
The architecture drove the work breakdown structure (WBS) and provided a robust framework for requirements management. 11

Architecture Drives the Lifecycle
[Diagram: business and mission goals feed the architecture, which drives the system]
Two iterative processes based on the architecture of the system:
- Design cycles (1, 2): the goal is to design a system that ensures business success.
- Implementation cycles (3, 5, 6): the goal is to implement the system according to the design. 12

ACE / TSP Design, Analysis, and Implementation
[Diagram mapping ACE practices (Quality Attribute Workshop, Business Thread Workshop, Attribute-Driven Design, Views and Beyond documentation, ARID, Architecture Tradeoff Analysis Method) onto TSP activities (launch, relaunch, weekly meetings and checkpoints, postmortems) across business and mission goals, architecture, and system] 13

Example Design and Implementation Strategy
[Timeline diagram over TSP cycles 0-5 (6-week iterations), from TSP launch to TSP postmortem: the architecture team moves from architecture drivers through designing the known parts, an ATAM evaluation, architecture fixes, and adjustments from feedback, reporting progress throughout; the developer teams move from prototyping problem areas through skeleton-plus-features increments and corrections to feature releases for stakeholders; architecture and TSP coaches support both teams] 14

Select Process Data
[Bar chart: planned vs. actual effort distribution by phase blocks, as % of total task hours]
- ~208 EKLOC in 24 months
- Complete functionality of the previous system plus new functionality
- Latency target 1 msec; achieved 0.1 msec
- Architectural design practices were 12% of the total cost but were key in meeting the technical requirements and are estimated to have reduced implementation costs by 10%-15% (due to avoided functionality and clean design)
- Only 15% of effort went to testing, compared to the usual roughly equal distribution between coding and testing; higher-than-usual quality was achieved
2010, 2011 Carnegie Mellon University 15

Accomplishments
- Performance: latency and throughput metrics greatly exceeded initial expectations (0.1 msec vs. 1.0 msec)
- Quality: very low defect count in validation test; error density of 0.1 errors per KLOC compared to a norm of 0.5-1.0; defects encountered have not required modifying the architecture; the testing framework allowed smooth continuous integration
- Cost & Schedule: the team achieved EVERY milestone (internal and external) on time and on budget (including unplanned new functionality), with the planned number of people. No forced march. 16

Key Takeaways
- Investment in early architecture and team practices drives the lifecycle and plays a role in managing risk.
- Earlier identification and resolution of defects reduces the cost of rework.
- An iterative and incremental approach fosters collaboration and facilitates handoffs, reducing the cost of delay.
- Architectural practices and TSP provide a disciplined framework for measuring and managing structured intellectual activity related to the product, process, and project. 17

Questions?? For more information: Combining Architecture-Centric Engineering with the Team Software Process, Technical Report, CMU/SEI-2010-TR-031, December 2010 18

Contact Information TSP Initiative James W. Over TSP Initiative Lead jwo@sei.cmu.edu Jim McHale TSP Mentor Coach jdm@sei.cmu.edu Business Development David Scherb Greg Such dscherb@sei.cmu.edu gsuch@sei.cmu.edu RTSS Program Linda Northrop RTSS Program Director lmn@sei.cmu.edu Felix Bachmann Architecture Mentor Coach fb@sei.cmu.edu SEI website at www.sei.cmu.edu (~/tsp or ~/architecture) 19

ADDITIONAL INFORMATION 20

Project History
- Cycle 1: Architecture. Completed Jan. 2010; demonstrated architecture coaching, evaluated communications packages, built the test framework.
- Cycle 2: Infrastructure implementation. Completed Apr. 2010; included a successful ATAM (documentation noticeably thorough; no significant new architectural risks discovered).
- Cycle 3: Basic functions and main performance loop. Completed July 2010; good quality, performance exceeding requirements by more than a factor of 5.
- Cycle 4: Non-TSP cycle, outside evaluation by world-class experts. Completed Aug. 2010; covered the JVM and high-speed redundant communications.
- Cycle 5: Full normal operations, complete performance loop. Completed Jan. 2011.
- Cycle 6: Full functionality including startup, shutdown, and maintenance modes. Completed July 2011 (additional scope extended the scheduled June finish). 21

Project History -2
- Cycle 7: System test / integration test with legacy systems (starting Aug. 2011)
- Cycle 8: Acceptance test / parallel test (starting Dec. 2011)
- Cycle 9: User test / deployment (starting Jan. 2012)
Testing activities overlapped, in part because the matching engine's readiness was AHEAD of the other interfacing systems. Testing includes the internal test group, internal operations, and brokerage-firm testing (functional, HA, throughput, and DRP tests). Operations testing detected (as of Jan. 2012) fewer than 50 unique defects in 200+ KLOC. Go-live scheduled May 2012 (NOW!). 22

Getting Started
TSP/ACE is introduced into an organization on a project-by-project basis.
TSP Introduction Steps:
1. Start by identifying candidate projects, architects, and internal architecture and TSP coach candidates.
2. Train senior management.
3. Train the selected teams and their managers, then launch the project.
4. Monitor the projects and make adjustments as needed.
5. Expand the scope to include additional projects and teams.
6. Create or expand the pool of available SEI-authorized architects, instructors, and coaches.
7. Repeat starting at step 3. 23

Selecting Pilot Projects
Pick 3 to 5 medium- to large-sized pilot projects:
- 8 to 15 team members
- 4- to 18-month schedule
- Software-intensive new development or enhancement
- Representative of the organization's work
- Important projects
Select teams with members and managers who are willing to participate. Consider the group relationships: contractors, organizational boundaries, internal conflicts. 24

Architecture Training
Certificate programs: Software Architecture Professional (certificate); ATAM Evaluator and ATAM Leader (certifications).
Six courses:
- Software Architecture: Principles and Practices (available through e-learning)
- Documenting Software Architectures
- Software Architecture Design and Analysis
- Software Product Lines
- ATAM Evaluator Training
- ATAM Leader Training
An ATAM Observation is also required to receive the certificate/certification. 25

PSP SM and TSP Training
Personal Software Process (PSP SM) training is essential to successful TSP implementation.
- TSP Executive Seminar (1 day, for top-level executives and middle managers)
- TSP Team Leader Training (3 days, for team leads and affected managers)
- PSP Fundamentals (5 days, for software developers)
- TSP Team Member Training (3 days, for other disciplines) 26

Intellectual Property Personal Software Process, PSP, Team Software Process, and TSP are service marks of Carnegie Mellon University. Architecture Tradeoff Analysis Method and ATAM are registered in the U.S. Patent and Trademark Office by Carnegie Mellon University. 27

NO WARRANTY THIS CARNEGIE MELLON UNIVERSITY AND SOFTWARE ENGINEERING INSTITUTE MATERIAL IS FURNISHED ON AN "AS-IS" BASIS. CARNEGIE MELLON UNIVERSITY MAKES NO WARRANTIES OF ANY KIND, EITHER EXPRESSED OR IMPLIED, AS TO ANY MATTER INCLUDING, BUT NOT LIMITED TO, WARRANTY OF FITNESS FOR PURPOSE OR MERCHANTABILITY, EXCLUSIVITY, OR RESULTS OBTAINED FROM USE OF THE MATERIAL. CARNEGIE MELLON UNIVERSITY DOES NOT MAKE ANY WARRANTY OF ANY KIND WITH RESPECT TO FREEDOM FROM PATENT, TRADEMARK, OR COPYRIGHT INFRINGEMENT. Use of any trademarks in this presentation is not intended in any way to infringe on the rights of the trademark holder. This presentation may be reproduced in its entirety, without modification, and freely distributed in written or electronic form without requesting formal permission. Permission is required for any other use. Requests for permission should be directed to the Software Engineering Institute at permission@sei.cmu.edu. This work was created in the performance of Federal Government Contract Number FA8721-05-C-0003 with Carnegie Mellon University for the operation of the Software Engineering Institute, a federally funded research and development center. The Government of the United States has a royalty-free government-purpose license to use, duplicate, or disclose the work, in whole or in part and in any manner, and to have or permit others to do so, for government purposes pursuant to the copyright license under the clause at 252.227-7013. 28