ISA 201 Information System Acquisition

1 ISA 201 Information System Acquisition

2 Lesson 16

3 Learning Objectives Today we will learn to: Overall: Given a DoD IT/SW acquisition scenario, apply software T&E best practices that result in system acceptance. Identify the Test & Evaluation (T&E) responsibilities of the Program Manager (PM). Describe the purpose of DoD software Test & Evaluation (T&E). Describe the DoD software T&E environment. Describe the DoD software T&E process. Identify what Independent Verification and Validation (IV&V) is. Recognize the different types of software testing tools. Identify common DoD software testing issues and risks. Recognize the Agile software development T&E challenges. Recognize the software T&E best practices. 3

4 Lesson Overview Lesson Plan Overview Software Development and Testing Process Software T&E Considerations, Issues and Trends Lessons Learned & Best Practices Summary 4

5 Testing Software Can we test quality into our software??? 5

6 Program Manager's Responsibilities for Software T&E The PM is responsible for defining the level of Software Quality for their program. This includes defining the level of maturity at which the software must perform in order to declare software T&E a success. Testing must focus on the performance of those items the Program Manager (PM) identifies as software quality goals for the success of their program. 6

7 Software Test and Evaluation Mission The software T&E mission has two primary objectives: 1) Demonstrate performance of the whole software system 2) Assist in fault detection and correction There are four (4) primary ways to support the success of the software T&E mission: 1) Your plans should include an incremental test strategy 2) Identify and correct software defects as early as possible in the lifecycle 3) Provide scientifically based measures of progress and quality (a minimal metrics sketch follows below) 4) Provide data evaluation to support acquisition decisions 7
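
To make point 3 concrete, here is a minimal sketch of what scientifically based progress and quality measures might look like in practice; the function names and the example values are illustrative assumptions, not part of the lesson material.

```python
# Minimal sketch (hypothetical numbers): objective measures of test progress
# and quality that a PM could track build over build.

def pass_rate(passed: int, executed: int) -> float:
    """Fraction of executed test cases that passed."""
    return passed / executed if executed else 0.0

def defect_density(open_defects: int, ksloc: float) -> float:
    """Open defects per thousand source lines of code (KSLOC)."""
    return open_defects / ksloc if ksloc else 0.0

# Example build snapshot (illustrative values only)
print(f"Pass rate: {pass_rate(412, 450):.1%}")                      # ~91.6%
print(f"Defect density: {defect_density(38, 120):.2f} per KSLOC")   # ~0.32
```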

8 Example Typical Costs of Software Fixes*

Lifecycle SW Development Activity        | Initial $ Spent | Errors Introduced | Errors Found | Relative Cost
Requirements Analysis                    | 5%              | 55%               | 18%          | 1.0
Design Activities                        | 25%             | 30%               | 10%          |
Testing Activities                       | 60%             | 10%               | 50%          |
Documentation                            | 10%             |                   |              |
Post-Deployment Software Support (PDSS)  | ---*            | 5%                | 22%          |

*Once a system is fielded, PDSS costs are typically 50-70% of total system lifecycle costs. (B. Boehm, Software Engineering Economics) 8

9 Current State of Software T&E The National Institute of Standards & Technology (NIST) states that inadequate testing methods and tools cost the U.S. economy between $22.2 and $59.5 billion annually. Testing methods in general are relatively ineffective. Testing typically identifies only one-fourth to one-half of defects (found over a typical software lifespan), while other verification methods (e.g., formal inspections) are more effective in finding defects earlier. Inadequate and ineffective testing results in approximately 2 to 7 undiscovered defects per thousand lines of code (KLOC) when the software is deployed. This means that major software-reliant systems are being delivered and placed into operation with hundreds or even thousands of residual defects. If you include software vulnerabilities as defects, the defect rates are even more troubling. What are the problems with Software T&E? 9
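
A back-of-the-envelope illustration of the residual-defect claim above, applied to a hypothetical one-million-SLOC system (the system size is an assumption for illustration only):

```python
# Residual-defect estimate: 2 to 7 undiscovered defects per KLOC at deployment,
# applied to a hypothetical 1,000,000 SLOC system.

system_size_kloc = 1_000          # hypothetical system size in KLOC
low, high = 2, 7                  # residual defects per KLOC (range cited above)

print(f"Estimated residual defects: {low * system_size_kloc:,} to "
      f"{high * system_size_kloc:,}")
# -> Estimated residual defects: 2,000 to 7,000
```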

10 Known Problems with Software T&E Requirements-related testing: Requirements ambiguous, missing, incomplete, incorrect, or unstable. Test planning and scheduling: Improper planning, improper test schedules. Stakeholder involvement and commitment: Not testing items important to stakeholders; leadership commitment lacking. Management-related testing: Test resource management lacking. Test organization and professionalism: Lack of independence, unclear testing responsibilities, and inadequate testing expertise. Test & Evaluation process: T&E and engineering processes poorly integrated. Test tools and environments: Over-reliance on manual or COTS testing tools; testing environments inadequate to fully test software. Test communication: Test documents inadequate, not maintained, not communicated. 10

11 Software Test Plan Relationships

Test & Evaluation Master Plan (TEMP): Documents the overall structure and objectives of the T&E program. It "includes critical operational effectiveness and suitability parameters for [among other items]... software and COMPUTER RESOURCES... it addresses SOFTWARE MATURITY and the degree to which SOFTWARE DESIGN has stabilized." (Defense Acquisition Guidebook). In the TEMP, Developmental Testing substantiates system technical performance and Operational Testing confirms the system meets CDD requirements.

Software Development Plan (SDP): Covers Coding and Unit Testing, Unit Integration & Test, SI Qualification Testing, SI/HI Integration & Test, System Qualification Testing, Software Quality Assurance, the Corrective Action Process, and Software Requirements.

Software Test Plan (STP): Identifies test items (SIs, Units), identifies personnel resources, identifies the test environment, provides test schedules, and traces to requirements. 11

12 Example of a Priority Classification Scheme (J-STD-016 Classification Schemes)

Priority Classification Scheme:
Priority 1: Prevents mission accomplishment or jeopardizes safety or another critical requirement.
Priority 2: Adversely affects mission accomplishment or cost, schedule, performance or software support, and no workaround exists.
Priority 3: Adversely affects mission accomplishment or cost, schedule, performance or software support, and a workaround exists.
Priority 4: User/operator or support inconvenience.
Priority 5: All other problems.

Category Classifications: Project Plans; Operational Concept; System/Software Requirements; Design; Code; Databases/data files; Test Plans, Descriptions, Reports; User, Operator or Support Manuals; Other Software Products.

Alternate category and priority schemes may be used by developers if approved by the acquirer (J-STD-016, Annex M). 12

13 Types of Software Tests (Human-based and Computer-based) HUMAN-based: Desk Checking (programmer-inspected); Walkthroughs / informal inspections (programmer team-inspected); FORMAL INSPECTIONS. FORMAL INSPECTIONS provide the BEST ROI. COMPUTER-based: White Box (uses source code knowledge and executes software unit tests to ensure correctness); Black Box (no knowledge of internal source code; the perspective of a hacker); Gray Box (some knowledge of internal code; the perspective of a smart hacker). 13
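
As an illustration of the white-box versus black-box perspectives, here is a minimal unit-test sketch; the clamp function and its test cases are hypothetical examples, not part of the course material.

```python
import unittest

def clamp(value, low, high):
    """Hypothetical unit under test: restrict value to the range [low, high]."""
    if value < low:
        return low
    if value > high:
        return high
    return value

class ClampTests(unittest.TestCase):
    # White-box: cases chosen from knowledge of the branches in the source.
    def test_below_low_branch(self):
        self.assertEqual(clamp(-5, 0, 10), 0)

    def test_above_high_branch(self):
        self.assertEqual(clamp(99, 0, 10), 10)

    # Black-box: case chosen only from the stated requirement,
    # with no reference to the internal structure.
    def test_in_range_per_requirement(self):
        self.assertEqual(clamp(7, 0, 10), 7)

if __name__ == "__main__":
    unittest.main()
```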

14 Types of Software Tests (Human-based and Computer-based) [Diagram: human-based tests (Formal Inspections) and computer-based tests (White-box, Gray-box, Black-box) mapped to the development phases.] Acceptance Testing occurs after each development phase to ensure the requirements from the previous phase are met. Final Acceptance Testing is with the warfighter and is the most important test! 14

15 Lesson Overview Lesson Plan Overview Software Development and Testing Process Software T&E Considerations, Issues and Trends Lessons Learned & Best Practices Summary 15

16 Software Development V-Model

[V-model diagram: the left leg descends from Operational Requirements (user needs) & Project Plans through System Requirements Analysis, System Design, Software Requirements Analysis, Software Design, and Software Unit Coding; the right leg ascends through Software Unit Testing, Software Unit Integration & Testing, SI Qualification Testing, HI and SI Integration & Testing, Subsystem Integration & Testing, System Qualification Testing, and Operational & Acceptance Tests (OT&E). Test planning is concurrent with each development phase; the right-leg software and system test levels constitute DT&E.] 16

17 Test Readiness Review (TRR)

Inputs/Entry Criteria:
- Identified tested requirements (SRS)
- Traceability of test requirements to the SRS & IRS has been established
- All software item test procedures have been completed
- Test objectives identified
- Applicable documentation, procedures and plans are complete and under control
- Method of documenting and dispositioning test anomalies is acceptable

Remember: Typically the Software Specification Review (SSR) results in approval of the Software Requirements Specification (SRS) and the Interface Requirements Specification (IRS).

Outputs/Exit Criteria:
- Software Test Descriptions are defined, verified & baselined
- Testing is consistent with any defined incremental approaches
- Test facilities are available and adequate
- Tested software is under CM
- Lower-level software tests are done
- Software metrics indicate readiness to start testing
- Software problem report system is defined and implemented
- Software test baseline established
- Development estimates updated
- Non-tested requirements at the SI level are identified for later testing 17

18 Software OT Maturity Criteria Software maturity must be demonstrated prior to OT&E.

Software Problems: No Priority I or II problems; impact analysis for Priority III.
Functionality: Functionality available before OT&E; DT completed; external interfaces functionally certified; PMO identifies unmet critical parameters.
Management: Acquisition Executive must certify (and the Operational Tester must agree) that: SW requirements & design are stable; depth & breadth of testing is adequate; DT has demonstrated required functions; a deficiency reporting process is in place.
CM Process: Software CM system in place; system is baselined; Test Agency has access to the CM system.
Fixes: All changes completed.

*OSD (OT&E) Memo, Subject: Software Maturity Criteria 18

19 Independent Verification and Validation (IV&V) Verification Ensuring that the system is well engineered. Answers the question: Did I build the system right? Validation Ensuring that the software meets the users' needs. Answers the question: Did I build the right system? 19

20 System DT&E Decision Exit Criteria Test activities (based on the TEMP and Software Test Plan (STP)) demonstrate that the system meets all critical technical parameters & identify technological & design risks documented in the Software Requirements Specification (SRS), Interface Requirements Specification (IRS) and Data Design Specification (DDS). Ready to perform system OT&E: no open priority 1 or 2 problems; acceptable degrees of requirements traceability / stability (includes Cybersecurity), computer resource utilization, design stability, breadth & depth of testing, fault profiles, reliability & interoperability, and software maturity. 20

21 Software Evaluations During IOT&E Does the software support system performance? Will the system be accepted by our customer? Is the software: - Mature & reliable? - Usable by typical operators? - Sustainable/maintainable? Testers' Bottom Line: In all cases of software testing, reproducible failures facilitate software fixes. When you identify a problem, document it for reproducibility purposes. 21
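
One common way to document a problem for reproducibility is to capture the exact failing inputs as an automated test that can be rerun until the fix is verified. The sketch below assumes a hypothetical function and a hypothetical problem report number (PR-1234); neither comes from the course material.

```python
import unittest

def compute_bearing(degrees):
    """Hypothetical unit under test: normalize a bearing to [0, 360)."""
    return degrees % 360

class ReproducedDefectTests(unittest.TestCase):
    def test_pr_1234_negative_bearing(self):
        # Captures the exact inputs from hypothetical problem report PR-1234
        # so the failure is reproducible and stays fixed after the correction.
        self.assertEqual(compute_bearing(-45), 315)

if __name__ == "__main__":
    unittest.main()
```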

22 Lesson Overview Lesson Plan Overview Software Development and Testing Process Software T&E Considerations, Issues and Trends Lessons Learned & Best Practices Summary 22

23 Software T&E Consideration: Software Testing Tools Automated software testing is critical for testing large and complex DoD software applications. Tool support is very useful for repetitive tasks; the computer doesn't get bored and will be able to exactly repeat repetitive tests without any mistakes. Here are some of the benefits of using automated testing tools. Test tools can: - automatically verify key functionality - do automated Regression Testing - test interfaces - test Graphical User Interfaces (GUI) - provide scenario testing models for Mission Critical Threads (MCT) - automate cloud services testing - help test teams run large numbers of tests in a shorter period of time. 23
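
As a small illustration of automated regression testing, the sketch below assumes the pytest tool is available; the unit under test and its expected values are hypothetical.

```python
# Minimal sketch, assuming pytest is installed: a parameterized regression
# test that a tool can rerun identically on every build, without boredom
# or transcription mistakes.
import pytest

def miles_to_km(miles):
    """Hypothetical unit under test."""
    return miles * 1.609344

@pytest.mark.parametrize("miles,expected_km", [
    (0, 0.0),
    (1, 1.609344),
    (100, 160.9344),
])
def test_miles_to_km_regression(miles, expected_km):
    assert miles_to_km(miles) == pytest.approx(expected_km)
```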

24 Software T&E Consideration: Software Testing Tools Successful programs use Static and Dynamic Analysis tools in combination. - The use of Static Analysis and Dynamic Analysis tools should be put on contract to ensure COTS vendors provide a picture of their code for Government purposes (Security and Product Support). Five (5) Automated Test Tool types: - Test Management Tools - Static Testing Tools - Test Specification Tools - Test Execution and Logging Tools - Performance and Monitoring Test Tools 24
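
A minimal sketch of how a static analysis tool might be wired into a build as a quality gate, assuming flake8 (one common open-source static analyzer) is installed and the project source lives under src/; both are illustrative assumptions, not a course requirement.

```python
# Run the static analyzer on every build and treat any finding as a failed
# quality gate. flake8 exits nonzero when it reports violations.
import subprocess
import sys

result = subprocess.run(["flake8", "src/"], capture_output=True, text=True)
if result.returncode != 0:
    print("Static analysis findings:")
    print(result.stdout)
    sys.exit(1)        # fail the build until findings are dispositioned
print("Static analysis: no findings")
```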

25 Software T&E Consideration: Software Testing Tools Example Results of Static Code Analysis [Figure: static code analysis results for the original software code compared with the same software code after many changes.] 25

26 Software T&E Issues Software design issues that may affect the T&E TEMP (strategy) and Software Test Plan (STP): - Size & complexity of the software. Degree of software reuse. Amount of COTS software. - Role the software will play (safety, security risks, activities to be performed by the software, critical or high risk functions, technology maturity) - Number of interfaces / complexity of integration / number of outside systems requiring interoperability - Cost & schedule available for software testing, previous test results - Software development strategy, stability of software requirements 26

27 COTS/NDI Software T&E Issues & Risks Unit level testing/component testing is generally impossible (Black or Gray Box testing only) Incomplete documentation Multiple complex, nonstandard interfaces (interoperability is a risk) Market leverage may not exist to force vendor bug fixes Formal requirements documents unavailable COTS usage may not match original design environment Real-time performance may be marginal Robustness & reliability lower when compared to custom code Higher COTS use in a system generally implies more difficult system level integration testing (4 or more is a critical risk) Frequent market-driven releases complicate regression testing (Promote Intelligent Bypass) 27

28 Software T&E Trends IT trends that may affect T&E (all of these tend to add more complexity to software testing): - Larger & more complex software packages - More emphasis on software reuse - More challenges / more emphasis on cybersecurity - More challenges / more emphasis on interoperability (part of the Net-Ready KPP) - Rapid pace of technology advances - Open Architecture 28

29 Six (6) DoD Software Domains T&E Emphasis Mission Systems software: - Focus is on White Box testing (especially on the embedded software applications) - Does the unique hardware/software system function correctly in its intended operating environment? - System safety & mitigation of key risks - Interoperability with other known systems - Reliability usually critical C4ISR, DBS, Infrastructure, Cybersecurity and Modeling & Simulation (M&S) software: - Focus is on White, Black and Gray box testing - Does the software function correctly (often in an office or command center environment on varying hardware)? - Cybersecurity, to include privacy, is a key risk - Interoperability through standards compliance - Reliability may not be as critical 29

30 Agile SW Development T&E Challenges Agile software development requires: - a more detailed test plan, as test events occur more frequently (Agile timeboxed development). - intense configuration management of test results due to the frequent nature of testing. Inadequate Test Coverage and Application Programming Interface (API) Testing: - With each Sprint, use code analysis tools to monitor code changes to ensure adequate test cases. - Use tools that allow testers who don't have strong coding skills to test the API (a minimal API-check sketch follows below). Code Broken Accidentally due to Frequent Builds: - Add tests to an automated script so you can use automated testing on a regular basis. - Automated testing tools are required to complete the regression test requirements for each test cycle. 30
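
A minimal sketch of the kind of readable API check referred to above, assuming the Python requests library and a hypothetical internal endpoint; the URL and the response field are assumptions for illustration, and the test can be run with a tool such as pytest.

```python
# Simple API check a tester without deep coding skills could copy and adapt.
import requests

def test_status_endpoint_returns_ok():
    # Hypothetical internal endpoint used for illustration only.
    response = requests.get("http://testhost/api/status", timeout=5)
    assert response.status_code == 200
    assert response.json().get("state") == "READY"   # assumed response field
```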

31 Agile SW Development T&E Challenges (Continued) Early Detection of Defects: (Catch errors as early as possible!) - Use static analysis tools to find missing error routines, coding standard deviations, data type mismatches, and other errors that can arise due to frequent build and test cycles. Performance Bottlenecks: - Identify the areas of your code that are causing performance issues and how performance is being impacted over time. - Use load testing tools to help identify slow areas and track performance over time to more objectively document performance from release to release (a minimal timing sketch follows below). 31
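
A minimal sketch of tracking performance over time: time repeated calls to a stand-in operation and record simple statistics that can be compared from release to release. The operation and the sample count are illustrative assumptions; real programs would normally use a dedicated load testing tool.

```python
import statistics
import time

def operation_under_test():
    """Stand-in for the slow code path being measured."""
    sum(range(50_000))

samples = []
for _ in range(200):
    start = time.perf_counter()
    operation_under_test()
    samples.append(time.perf_counter() - start)

# Median and 95th-percentile latency, suitable for release-to-release comparison.
print(f"median: {statistics.median(samples) * 1000:.2f} ms")
print(f"p95:    {statistics.quantiles(samples, n=20)[18] * 1000:.2f} ms")
```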

32 Lesson Overview Lesson Plan Overview Software Development and Testing Process Software T&E Considerations, Issues and Trends Lessons Learned & Best Practices Summary 32

33 Software T&E Lessons Learned Formulate a test strategy prior to contract award that accommodates cost/schedule constraints. The test strategy should be able to: - Verify all critical software requirements of the system - Test in a way to isolate faults Phase testing to focus on: - Qualification Testing - Operational Thread Testing - Performance/Stress Testing Resolve the requirements vs. design information argument early on (watch out for Gold-Plating). Plan ahead for 1) adequate schedule, 2) a test regression strategy, 3) timing and format of deliverables and 4) accommodating incremental builds. Understand the test--be prepared to prioritize test cases. Be flexible and attuned to end-of-schedule pressures. Cop an attitude--know when to fall on your sword. Understand the politics of the acquisition. Ensure you have separate development and test and evaluation organizations to provide independent development of software and T&E. From A Real Life Experience in Testing a 1,000,000+ SLOC Ada Program, STC

34 Arlie Council 16 Best Practices Arlie Council 16 Software Management Best Practices* One Example of Software Management Best Practices Product Integrity & Stability (Tester): 14) Inspect Requirements and Design: Use Formal Inspections! 15) Manage Testing as a Continuous Process: Integration & Acceptance Throughout! 16) Compile & Smoke Test Frequently: Use Daily Compile & Smoke Testing! *As developed by the Software Program Managers Network

35 Elephant Bungee Wisdom Universal Truth #9: COTS are not necessarily the best solution. They bring risks + benefits; understand both! Management Emphasis: Investigate the pricing structure. Select established products with large installed bases. Budget for the complete cost of COTS integration and maintenance. Fly before you buy + TEST a lot. Adapt your requirements to COTS. Elephant Bungee Jumping #9: Avoiding Diseases that Are Fun to Catch

36 Employing IV&V on Agile Projects: Lessons Learned Identify the Control Points in the Agile Process. The control points are where decisions need to be made. Decisions on concept documentation, use cases and test results can be independently assessed by the IV&V team to ensure accuracy, completeness, consistency and traceability to the warfighter requirements. Review the Backlog. IV&V is used to ensure accuracy, completeness, consistency and testability of user stories in the backlog. IV&V assesses the desired change for impact on existing tasks and products. Manage the Roadmap. IV&V is used to ensure each iteration within a Sprint is in concert with the overall Agile Project Roadmap. Question Business Value. IV&V will review the Agile Minimum Viable Product (MVP) to ensure that what was produced in a Sprint matches the MVP. The IV&V team will validate each new user story or business requirement to ensure it adds the proper value to the warfighter. Validate Consistent Practices. The IV&V team can act as the process cop to ensure the agreed-to repeatable processes remain in place from Sprint to Sprint. 36

37 Class Exercise (Which Team Can Solve the Quickest?) Short Range Assault Weapon (SRAW) Program

38 Class Exercise ROUTINE 6-1 Contractor Software Code

39 Class Exercise SRAW SOFTWARE DESIGN DIAGRAM

40 Class Exercise SRAW CONTRACTOR TEST REPORT

41 Class Exercise What type of testing did you do to discover the error?

42 Lesson Overview Lesson Plan Overview Software Development and Testing Process Software T&E Considerations, Issues and Trends Lessons Learned & Best Practices Summary 42

43 Summary Today we learned to: Overall: Given a DoD IT/SW acquisition scenario, apply software T&E best practices that result in system acceptance. Identify the T&E responsibilities of the IT Program Manager. The PM is responsible for defining software maturity as part of the Program's Software Quality definition so we know when T&E is considered successful (done). Describe the purpose of DoD software Test & Evaluation (T&E). Demonstrate performance of the whole software system. Assist in fault detection and correction of faults. Describe the DoD software T&E environment. Identify defects as early as possible! Always use FORMAL INSPECTIONS. Describe the DoD software T&E process. DoD software T&E ensures that we built the system right (verification, per spec) and that we built the right system (validation, what the warfighter wanted). Identify what Independent Verification and Validation (IV&V) is. IV&V means you have an independent team doing independent testing on critical risk areas of your software (e.g., safety). 43

44 Summary Today we learned to: Recognize the different types of software testing tools. Use of automated test tools is critical for testing today's large and complex DoD software applications. Successful programs use Static and Dynamic test tools in combination. Identify common DoD software testing issues and risks. Each DoD software domain has a different architecture. Ensure you plan your tests to address size, complexity, reuse, COTS and integration challenges of that domain. Recognize the Agile software development T&E challenges. T&E events happen more frequently in an Agile development. Use of automated tools and more frequent test plan adjustments are critical to success. Recognize the Software T&E Best Practices. There are a number of Software T&E Best Practices and Lessons Learned. Emphasize Formal Inspections for requirements, design, test plans and source code to ensure accuracy. 44

45 Future Lesson Material 45

46 ELO Given a safety-critical system scenario, determine the specific software T&E challenges MT 13.1 When doing T&E on a safety-critical system, it is a best practice to follow the procedures and recommendations in the Joint Software Systems Safety Engineering Handbook, Version 1.0, August 27, MT 13.2 Use of statistical testing is required to lower safety risk. Statistical testing is where you test a representative subset of the complex code and extrapolate the test results across the entire application to predict the overall quality of the code (a minimal sketch follows below). MT 13.3 Cleanroom is one example of a software development/T&E option for a safety-critical environment. Cleanroom is an example of Extreme Team Desk Checking. MT 13.4 Use of Static and Dynamic Analysis tools is critical when doing T&E on safety-critical systems because the goal is highly optimized, defect-free code. Learning Points in Notes Section 46
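
A minimal sketch of the statistical-testing idea in MT 13.2, with illustrative numbers only: test a random sample of modules and extrapolate the observed defect rate across the whole application.

```python
import random

total_modules = 400                                 # hypothetical application size
sample = random.sample(range(total_modules), 40)    # ~10% representative sample

defects_found_in_sample = 6          # illustrative result of testing the sample
defect_rate = defects_found_in_sample / len(sample)

print(f"Observed defect rate: {defect_rate:.2%} of sampled modules")
print(f"Predicted defective modules overall: {round(defect_rate * total_modules)}")
# With these numbers: 15.00% observed, ~60 predicted defective modules.
```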

47 ELO Given a Cybersecurity-critical system scenario, determine the specific software T&E challenges MT 14.1 Planning and executing cybersecurity DT&E should occur early in the acquisition lifecycle, beginning before MS A or as early as possible in the acquisition lifecycle. MT 14.2 Test activities should integrate RMF security control assessments with tests of commonly exploited and emerging vulnerabilities early in the acquisition life cycle. More information on RMF security controls is available in the RMF KS at MT 14.3 The TEMP should detail how testing will provide the information needed to assess cybersecurity and inform acquisition decisions. Historically, TEMPs and associated test plans have not adequately addressed cybersecurity measures or resources. MT 14.4 The cybersecurity T&E phases support the development and testing of mission-driven cybersecurity requirements, which may require specialized systems engineering and T&E expertise. The Chief Developmental Tester may request assistance from SMEs such as vulnerability testers and adversarial testers (Red Team-type representatives) to assist in implementation of cybersecurity testing. MT 14.5 Cybersecurity T&E requires additional time, money and possibly resources (e.g., National Security Agency (NSA) certified assessors) to perform Red Team events (e.g., Adversarial assessment). MT 14.6 The Adversarial DT&E Team (i.e., Red Team) should meet with the Chief Developmental Tester to develop a detailed test plan. The team will share its rules of engagement and will describe its threat portrayal based on its knowledge and the information provided by the program. Through its analysis, the team will identify assets of value, system processes, vulnerabilities, attack plans and methods, and scheme types and indicators. Learning Points in Notes Section 47

48 ELO Given a Privacy-critical system scenario, determine the specific software T&E challenges. MT 15.1 When doing T&E for Defense Business Systems (DBS), it is a best practice to use NIST SP 800-53, Security and Privacy Controls for Federal Information Systems and Organizations, to identify test procedures to ensure Personally Identifiable Information (PII) is secure. MT 15.2 When testing large Defense Business Systems (DBS), don't use large volumes of data for software unit, integration, regression and quality assurance testing, as the data volume is too much to test in a reasonable period of time. There are sampling techniques and automated tools to help test the large volumes of data used by a DBS (a minimal sampling sketch follows below). Learning Points in Notes Section 48
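
A minimal sketch of the data-sampling idea in MT 15.2, assuming a hypothetical DBS extract in records.csv; the file name and sample rate are assumptions for illustration only.

```python
# Draw a small random sample of a large data extract so unit, integration and
# regression tests run in a reasonable time rather than against the full volume.
import csv
import random

SAMPLE_RATE = 0.01   # test against ~1% of production-sized data (assumption)

with open("records.csv", newline="") as src, \
     open("records_sample.csv", "w", newline="") as dst:
    reader = csv.reader(src)
    writer = csv.writer(dst)
    writer.writerow(next(reader))                 # keep the header row
    for row in reader:
        if random.random() < SAMPLE_RATE:
            writer.writerow(row)
```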