Common System and Software Testing Pitfalls

1 Common System and Software Testing Pitfalls Donald Firesmith Software Solutions Conference 2015 November 16-18, 2015

2 Copyright 2015 Carnegie Mellon University This material is based upon work funded and supported by the Department of Defense under Contract No. FA C-0003 with Carnegie Mellon University for the operation of the Software Engineering Institute, a federally funded research and development center. Any opinions, findings and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the United States Department of Defense. NO WARRANTY. THIS CARNEGIE MELLON UNIVERSITY AND SOFTWARE ENGINEERING INSTITUTE MATERIAL IS FURNISHED ON AN AS-IS BASIS. CARNEGIE MELLON UNIVERSITY MAKES NO WARRANTIES OF ANY KIND, EITHER EXPRESSED OR IMPLIED, AS TO ANY MATTER INCLUDING, BUT NOT LIMITED TO, WARRANTY OF FITNESS FOR PURPOSE OR MERCHANTABILITY, EXCLUSIVITY, OR RESULTS OBTAINED FROM USE OF THE MATERIAL. CARNEGIE MELLON UNIVERSITY DOES NOT MAKE ANY WARRANTY OF ANY KIND WITH RESPECT TO FREEDOM FROM PATENT, TRADEMARK, OR COPYRIGHT INFRINGEMENT. This material has been approved for public release and unlimited distribution except as restricted below. This material may be reproduced in its entirety, without modification, and freely distributed in written or electronic form without requesting formal permission. Permission is required for any other use. Requests for permission should be directed to the Software Engineering Institute at permission@sei.cmu.edu. DM

3 Agenda Introduction Presentation Scope What is Testing? Why Test? Limitations and Challenges of Testing Testing Pitfalls: Software vs. System Testing Pitfall Taxonomy and Ontology Addressing Testing Challenges Goals and Potential Uses Example Pitfall Taxonomy of Common Testing Pitfalls (lists of pitfalls by category): General Pitfalls Test-Type-Specific Pitfalls Conclusion Remaining Limitations and Questions Future Work 3

4 A Taxonomy of System & Software Testing Types Introduction 17 November 2015 © 2015 Carnegie Mellon University Distribution Statement A: Approved for Public Release; Distribution is Unlimited

5 Presentation Scope [Diagram: taxonomy of verification and validation (V&V) methods, with a legend marking which methods are in scope for this presentation (testing). Labels include: quality control (QC), static and dynamic methods, test, evaluation, T&E, other methods, analysis, demonstration, inspection, accreditation, reuse, warranty, static analysis, dynamic analysis, execution, simulation, audit, desk checking, review, walk-through, deterministic analysis, statistical analysis, Fagan inspection, peer review, and formal review.] 5

6 What is Testing? Testing The execution of an Object Under Test (OUT) under specific preconditions with specific stimuli so that its actual behavior can be compared with its expected or required behavior Preconditions: pretest mode, states, stored data, or external conditions Stimuli: Calls, commands, and messages (control flows) Data inputs (data flows) Trigger events such as state changes and temporal events Actual Behavior: During Test: Calls, commands, and messages (control flows) Data outputs (data flows) Postconditions: post-test mode, states, stored data, or external conditions 6
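
This anatomy maps onto even the smallest automated test. The following is a minimal sketch in Python (the Account class and its withdraw method are invented purely for illustration and are not from the presentation), with the precondition, stimulus, actual-versus-expected comparison, and postcondition labeled:

    import unittest

    class Account:
        """Minimal object under test (OUT), invented for illustration."""
        def __init__(self, balance):
            self.balance = balance

        def withdraw(self, amount):
            if amount > self.balance:
                raise ValueError("insufficient funds")
            self.balance -= amount
            return self.balance

    class WithdrawTest(unittest.TestCase):
        def test_withdraw_reduces_balance(self):
            # Precondition: establish the pretest state (stored data) of the OUT.
            account = Account(balance=100)
            # Stimulus: a call (control flow) with a data input (data flow).
            actual = account.withdraw(30)
            # Actual behavior during the test, compared with expected behavior.
            self.assertEqual(actual, 70)
            # Postcondition: verify the post-test state of the OUT.
            self.assertEqual(account.balance, 70)

    if __name__ == "__main__":
        unittest.main()

The same four elements apply at the integration and system levels; only the effort needed to establish preconditions and observe postconditions grows.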

7 What is Testing? 2 Testing requires: Controllability to: Establish preconditions Provide input stimuli Observability to verify: Establishment of preconditions Correctness of stimuli Correctness of outputs Correctness of postconditions 7
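
Test doubles are one common way to obtain both properties. The sketch below is hypothetical (FakeClock, RecordingSink, and greet are invented names): the fake clock gives the test controllability over an external precondition, and the recording sink gives it observability of the output the OUT produces.

    class FakeClock:
        """Controllability: lets the test set an external condition (the time)."""
        def __init__(self, now):
            self.now = now
        def time(self):
            return self.now

    class RecordingSink:
        """Observability: records the messages the OUT sends out."""
        def __init__(self):
            self.messages = []
        def send(self, message):
            self.messages.append(message)

    def greet(clock, sink):
        """Object under test: its behavior depends on an external condition."""
        sink.send("good morning" if clock.time() < 12 else "good afternoon")

    # Establish the precondition, provide the stimulus, verify the observable output.
    clock, sink = FakeClock(now=9), RecordingSink()
    greet(clock, sink)
    assert sink.messages == ["good morning"]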

8 Why Test? In roughly decreasing order of importance, the goals of testing are to: Uncover Significant Defects in the object under test (OUT) by causing it to behave incorrectly (e.g., to fail or enter a faulty state) so that these underlying defects can be identified and fixed and the OUT can thereby be improved. Provide Evidence that can be used to determine the OUT's: Quality Fitness for purpose Readiness for shipping, deployment, or being placed into operation Support Process Improvement by helping to identify: Development processes that introduce defects Testing processes that fail to uncover defects Prevent Defects by: Testing executable requirements, architecture, and design models so that defects in the models are fixed before they can result in defects in the system/software. Using Test-Driven Development (TDD) to develop test cases and then using these test cases to drive development (design and implementation). 8
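
For the TDD point, the cycle is: write a failing test, write just enough code to make it pass, then refactor while the test stays green. A minimal, hypothetical sketch (slugify is an invented example function):

    # Step 1: write the test first; it fails because slugify() does not exist yet.
    def test_slugify_lowercases_and_replaces_spaces():
        assert slugify("Hello World") == "hello-world"

    # Step 2: write just enough implementation to make the test pass.
    def slugify(title):
        return title.strip().lower().replace(" ", "-")

    # Step 3: rerun the test (e.g., with pytest) and refactor with confidence.
    test_slugify_lowercases_and_replaces_spaces()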

9 Limitations of Testing Cannot be exhaustive (or even close) Cannot uncover all defects because different types of testing: Have different defect removal efficiencies (DREs) Uncover different types of defects May provide false positive and false negative results due to: Defects in the test case (e.g., incorrect test input or incorrect test data) Defects in the test environment(s) Poor configuration management of the OUT, the test environment, and the test case Cannot prove the OUT works properly under all inputs and conditions 9
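
False results often come from the tests rather than the OUT. The hypothetical sketch below shows both directions: an oracle with a wrong expected value reports a correct implementation as failing (a false positive), while test data too weak to distinguish correct from incorrect behavior invites false negatives.

    def apply_discount(price, percent):
        """Correct OUT: 10% off 200 is 180."""
        return price * (100 - percent) / 100

    def test_wrong_oracle():
        # Defective test case: the expected value is wrong, so this test
        # fails against a correct OUT (a false positive).
        assert apply_discount(200, 10) == 170

    def test_weak_data():
        # A 0% discount cannot expose a defect in the discount logic, so this
        # test would still pass against a broken OUT (a false negative risk).
        assert apply_discount(200, 0) == 200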

10 Testing-Pitfall-Related Challenges A great many different ways exist to screw up testing. Multiple testing pitfalls are observed on just about every project. Different programs often exhibit different testing pitfalls. In spite of many excellent how-to testing books, we see projects falling into these same testing pitfalls over and over again. 10

11 A Taxonomy of System & Software Testing Types Testing Pitfalls 17 November 2015 © 2015 Carnegie Mellon University Distribution Statement A: Approved for Public Release; Distribution is Unlimited

12 Testing Pitfalls Testing Pitfall A human mistake that unnecessarily and unexpectedly causes testing to be: Less effective at uncovering significant defects Less efficient in terms of time and effort expended Less satisfying and fun to perform A bad decision, an incorrect mindset, a wrong action, or failure to act A failure to adequately: Meet a testing challenge Address a testing problem A way to screw up testing Common Testing Pitfall Observed numerous times on different projects Has sufficient frequency (and consequences) to be a significant risk 12

13 Software vs. System Testing These testing pitfalls occur when testing: Software (applications, components, units) Systems (subsystems, hardware, software, data, personnel, facilities, equipment, documentation, etc.) Executable models (requirements, architecture, design) Most pitfalls apply to both software and system testing. The vast majority of software testers must address system testing issues: Software executes on hardware, and how well it executes depends on: That hardware Other software running on the same hardware Software communicates over: External networks (Internet, NIPRNet, SIPRNet, WAN, LAN, MAN, etc.) Data-center-internal networks connecting servers and data libraries (e.g., SAN) Busses within systems (embedded software) Software must meet quality requirements (thresholds of relevant quality characteristics and attributes) that are actually system-level, not software-level. 13

14 Pitfall Taxonomy and Ontology Testing Pitfall Taxonomy A hierarchical classification of testing pitfalls into categories and subcategories Testing Pitfall Ontology A hierarchy of concepts concerning testing pitfalls that uses a shared vocabulary to denote the types, properties, and interrelationships of these concepts 14

15 Testing Pitfall Taxonomy and Ontology 15

16 Addressing these Challenges Testing Pitfalls Taxonomy and Ontology Anti-Pattern Language of how not to do testing (Addison-Wesley, 2014) (Note: 35% conference discount) 92 pitfalls classified into 14 categories Technically reviewed by 47 international testing SMEs Current taxonomy/repository with new pitfalls and pitfall categories: 176 pitfalls classified into 24 categories SEI Testing Pitfalls Wiki (currently under development) 16

17 Goals To become the de facto industry-standard taxonomy of testing pitfalls To reduce the incidence of testing pitfalls and thereby improve testing effectiveness and efficiency To improve the quality of the objects under test (OUTs) 17

18 Potential Uses Training materials for testers and testing stakeholders Standard terminology regarding commonly occurring testing pitfalls Checklists for use when: Producing test strategies/plans and related documentation Evaluating contractor proposals Evaluating test strategies/plans and related documentation (quality control) Evaluating the as-performed test process (quality assurance) Identifying test-related risks and their mitigation approaches Knowledge base for a relatively simple test-pitfall advice tool Categorization of pitfalls for test metrics collection, analysis, and reporting 18

19 Example Testing and Engineering Processes Not Integrated (GEN-PRO-7) Description The testing process(es) are not adequately integrated into the overall system/software engineering process(es). Potential Applicability This pitfall is potentially applicable anytime that engineering and testing processes both exist. 19

20 Example Testing and Engineering Processes Not Integrated (GEN-PRO-7) Characteristic Symptoms There is little or no discussion of testing in the system engineering documentation: System Engineering Management Plan (SEMP), Software Development Plan (SDP), Work Breakdown Structure (WBS), Project Master Schedule (PMS), or System Development Cycle (SDC). All or most of the testing is done as a completely independent activity performed by testers who are not part of the project engineering team. Testing is treated as a separate specialty engineering activity with only limited interfaces with the primary engineering activities. Test scheduling is independent of the scheduling of other development activities. Testers are not included in the requirements teams, architecture teams, or any cross-functional engineering teams. Testers are not invited to engineering reviews or walk-throughs. 20

21 Example Testing and Engineering Processes Not Integrated (GEN-PRO-7) Potential Negative Consequences Few nontesters understand the scope, complexity, and importance of testing. Testers do not understand the work being performed by other engineers, including both the technology and the application domain. Testers are unaware of changes to the requirements, architecture, design, and implementation. Testing is less effective and less efficient. The system/software engineering process and testing process are incompatible. The project master schedule and the testing schedule are incompatible. Testing work products are not delivered in time for major project reviews and decision points. Engineering work products are not sufficiently testable. Testing risks are not incorporated into the project risk repository. The requirements engineering, architecture engineering, design, and implementation tools are not sufficiently compatible with the testing tools. Etc. 21

22 Example Testing and Engineering Processes Not Integrated (GEN-PRO-7) Potential Causes There was inadequate communication between testers and other system or software engineers; for example, requirements engineers, architects, designers, and implementers did not communicate changes to the requirements, architecture, design, and implementation to the testers. Testers were not involved in: Initial overall planning Determining and documenting the overall engineering process Testers did not have sufficient expertise or experience in the relevant technology and the application domain. System and software engineers did not respect the testers, feeling that they were not qualified to take part in the project. System and software engineers were not involved with the verification of the testing work products. The people determining and documenting the overall engineering process did not have significant testing expertise, training, or experience. The testing schedule was not integrated into the overall project schedule. Testing was outsourced. Test tool selection drove the testing process rather than the other way around. 22

23 Example Testing and Engineering Processes Not Integrated (GEN-PRO-7) Recommendations Prepare: Include testers in the initial staffing of the project. Enable: Provide a top-level briefing or training in testing to the chief system engineer, system architect, and process engineer. Perform: Have subject-matter experts and project testers collaborate closely with the project chief engineer or technical lead and the process engineer when they develop the engineering process descriptions and associated process documents. Provide high-level overviews of testing in the SEMP(s) and SDP(s). Document how testing is integrated into the system development or life cycle, regardless of whether it is traditional waterfall, evolutionary (iterative, incremental, and parallel), or anything in between. For example, document handover points in the development cycle when testing input and output work products are delivered from one project organization or group to another. Incorporate testing into the Project Master Schedule. Incorporate testing into the project's Work Breakdown Structure (WBS). Verify: Determine whether testers were involved in planning the project's system or software development process. Determine whether testing is incorporated into the project's system engineering process, system development cycle, System Engineering Master Plan and System Development Plan, Work Breakdown Structure, and Master Schedule. 23

24 Example Testing and Engineering Processes Not Integrated (GEN-PRO-7) Related Pitfalls Testing at the End (GEN-TPS-6) If the testing and engineering processes are not properly integrated, then testing is more likely to be delayed to the end of development. Independent Test Schedule (GEN-TPS-7) If the testing and engineering processes are not integrated, then the test schedule will be independent of and therefore probably incompatible with the overall project master schedule. Testers Responsible for All Testing (GEN-STF-4) If the testing and engineering processes are not properly integrated, then the developers are more likely to believe that the testers are responsible for all testing. Adversarial Relationship (GEN-STF-9) If the testing and engineering processes are not integrated, then the developers and the testers are more likely to develop an adversarial rather than a cooperative relationship. Testing as a Phase (GEN-PRO-18) If the testing and engineering processes are not properly integrated, then testing is more likely to be viewed as a phase that is separate from the rest of development. Testers Not Involved Early (GEN-PRO-19) If the testing and engineering processes are not properly integrated, then testers are less likely to be involved early in the development process (such as during initial planning, requirements engineering, and architecture engineering). Testing in Quality (GEN-PRO-23) If the testing and engineering processes are not properly integrated, then the developers will be more likely to believe that quality is the testers' responsibility and that quality can be tested into the system or software. Developers Ignore Testability (GEN-PRO-24) If the testing and engineering processes are not properly integrated, then the developers are more likely to ignore testability. Inadequate Communication Concerning Testing (GEN-COM-5) If the testing and engineering processes are not properly integrated, then it is more likely that there will be inadequate communication between the testers and the rest of the engineering staff. 24

25 A Taxonomy of System & Software Testing Types The Taxonomy 17 November 2015 © 2015 Carnegie Mellon University Distribution Statement A: Approved for Public Release; Distribution is Unlimited

26 A Taxonomy of System & Software Testing Types Taxonomy General Pitfalls 17 November 2015 © 2015 Carnegie Mellon University Distribution Statement A: Approved for Public Release; Distribution is Unlimited

27 Categories of Testing Pitfalls General: 1 Test Planning and Scheduling, 2 Stakeholder Involvement and Commitment, 3 Management, 4 Staffing, 5 Testing Process, 6 Test Design, 7 Pitfall-Related [new], 8 Test Tools [new - split], 9 Test Environments [new - split], 10 Automated Testing [new], 11 Test Communication, 12 Testing-as-a-Service (TaaS) [new], 13 Requirements, 14 Test Data [new] 27

28 General Pitfalls Test Planning and Scheduling No Separate Test Planning Documentation (GEN-TPS-1) Incomplete Test Planning (GEN-TPS-2) Test Plans Ignored (GEN-TPS-3) Test-Case Documents as Test Plans (GEN-TPS-4) Inadequate Test Schedule (GEN-TPS-5) Testing at the End (GEN-TPS-6) Independent Test Schedule (GEN-TPS-7) [new] 28

29 General Pitfalls Stakeholder Involvement and Commitment Wrong Testing Mindset (GEN-SIC-1) Unrealistic Testing Expectations (GEN-SIC-2) Assuming Testing Only Verification Method Needed (GEN-SIC-3) Mistaking Demonstration for Testing (GEN-SIC-4) Lack of Stakeholder Commitment to Testing (GEN-SIC-5) 29

30 General Pitfalls Management Inadequate Test Resources (GEN-MGMT-1) Inappropriate External Pressures (GEN-MGMT-2) Inadequate Test-Related Risk Management (GEN-MGMT-3) Inadequate Test Metrics (GEN-MGMT-4) Inconvenient Test Results Ignored (GEN-MGMT-5) Test Lessons Learned Ignored (GEN-MGMT-6) Inadequate Test-Related Configuration Management (GEN-MGMT-7) [new] 30

31 General Pitfalls Staffing - 1 Lack of Independence (GEN-STF-1) Unclear Testing Responsibilities (GEN-STF-2) Developers Responsible for All Testing (GEN-STF-3) Testers Responsible for All Testing (GEN-STF-4) Testers Responsible for Ensuring Quality (GEN-STF-5) [new] Testers Fix Defects (GEN-STF-6) [new] Users Responsible for Testing (GEN-STF-7) [new] Inadequate Testing Expertise (GEN-STF-8) Inadequate Domain Expertise (GEN-STF-9) [new] Adversarial Relationship (GEN-STF-10) [new] 31

32 General Pitfalls Staffing - 2 Too Few Testers (GEN-STF-11) [new] Allowing Developers to Close Discrepancy Reports (GEN-STF-12) [new] Testing Death March (GEN-STF-13) [new] All Testers Assumed Equal (GEN-STF-14) [new] 32

33 General Pitfalls Testing Process - 1 No Planned Testing Process (GEN-PRO-1) [new] Essentially No Testing (GEN-PRO-2) [new] Inadequate Testing (GEN-PRO-3) Testing Process Ignored (GEN-PRO-4) [new] One-Size-Fits-All Testing (GEN-PRO-5) Testing and Engineering Processes Not Integrated (GEN-PRO-6) Too Immature for Testing (GEN-PRO-7) Inadequate Evaluations of Test Assets (GEN-PRO-8) Inadequate Maintenance of Test Assets (GEN-PRO-9) Test Assets Not Delivered (GEN-PRO-10) Testing as a Phase (GEN-PRO-11) 33

34 General Pitfalls Testing Process - 2 Testers Not Involved Early (GEN-PRO-12) Developmental Testing During Production (GEN-PRO-13) [new] No Operational Testing (GEN-PRO-14) Testing in Quality (GEN-PRO-15) [new] Developers Ignore Testability (GEN-PRO-16) [moved from other category] Failure to Address the Testing BackBlob (GEN-PRO-17) [new] Failure to Analyze Why Defects Escaped Detection (GEN-PRO-18) [new] Official Test Standards are Ignored (GEN-PRO-20) [new] Official Test Standards are Slavishly Followed (GEN-PRO-21) [new] Developing New When Old Fails Tests (GEN-PRO-22) [new] Integrating new or Updates When Fails Tests (GEN-PRO-23) [new] 34

35 General Pitfalls Test Design [new] Sunny Day Testing Only (GEN-TD-1) [new] Inadequate Test Prioritization (GEN-TD-2) Test-Type Confusion (GEN-TD-3) Functionality Testing Overemphasized (GEN-TD-4) System Testing Overemphasized (GEN-TD-5) System Testing Underemphasized (GEN-TD-6) Test Preconditions Ignored (GEN-TD-7) [new] Test Oracles Ignore Nondeterministic Behavior (GEN-TD-8) [new] No Test Design (GEN-TD-9) [new] Test Design Ignores Test Architecture (GEN-TD-10) [new] No Test Case Selection Criteria (GEN-TD-11) [new] No Test Case Completion Criteria (GEN-TD-12) [new] Inadequate Test Oracle (GEN-TD-13) [new] 35
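
Two of these, Test Oracles Ignore Nondeterministic Behavior (GEN-TD-8) and Inadequate Test Oracle (GEN-TD-13), are easiest to see in code. In the hypothetical sketch below, the first oracle implicitly assumes one sampled value and is therefore flaky, while the second checks the envelope the requirement actually guarantees:

    import random

    def pick_retry_delay(base=1.0):
        """OUT with intentionally nondeterministic behavior: a jittered delay."""
        return base + random.uniform(0.0, 0.5)

    def test_naive_oracle():
        # Pitfall: this oracle assumes minimal jitter, so it passes or fails
        # at random even though the OUT meets its requirement (a flaky test).
        assert pick_retry_delay() <= 1.25

    def test_envelope_oracle():
        # Better oracle: check the range the requirement guarantees,
        # not a single sampled value.
        delay = pick_retry_delay()
        assert 1.0 <= delay <= 1.5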

36 General Pitfalls Pitfall-Related [new] Overly Ambitious Process Improvement (GEN-PRP-1) [new] Inadequate Pitfall Prioritization (GEN-PRP-2) [new] 36

37 General Pitfalls Test Tools [new] Over-Reliance on Test Tools (GEN-TT-1) Cost Only / Primary Test Tool Selection Criteria (GEN-TT-2) [new] Test Tools Drive Testing Process (GEN-TT-3) [new] Unusable Test Tool (GEN-TT-4) [new] Incompatible Test Tools (GEN-TT-5) [new] Inadequate Test Tool Training (GEN-TT-6) [new] 37

38 General Pitfalls Test Environments Incomplete Test Environment (GEN-TE-1) [new] Poor Preparation for Numerous Platforms (GEN-TE-2) [new] Insufficient Number of Test Environments (GEN-TE-3) Missing Type of Test Environment (GEN-TE-4) [new] Poor Fidelity of Test Environments (GEN-TE-5) Inadequate Test Environment Quality (GEN-TE-6) Test Environments Inadequately Tested (GEN-TE-7) [new] Inadequate Testing in a Staging Environment (GEN-TE-8) [new] Insecure Test Environment (GEN-TE-9) [new] Improperly Configured Test Environment (GEN-TE-10) [new] 38

39 General Pitfalls Automated Testing - 1 [new] Automated Testing not Treated as a Project (GEN-AUTO-1) [new] Insufficient Automated Testing (GEN-AUTO-2) [new] Automated Testing Replaces Manual Testing (GEN-AUTO-3) [new] Automated Testing Replaces Testers (GEN-AUTO-4) [new] Inappropriate Distribution of Automated Tests (GEN-AUTO-5) [new] Inadequate Automated Test Quality (GEN-AUTO-6) [new] Excessively Complex Automated Tests (GEN-AUTO-7) [new] Automated Tests Not Maintained (GEN-AUTO-8) [new] Insufficient Resources Invested (GEN-AUTO-9) [new] Inappropriate Automation Tools (GEN-AUTO-10) [new] 39

40 General Pitfalls Automated Testing - 2 [new] Unclear Responsibilities for Automated Testing (GEN-AUTO-11) [new] Postponing Automated Testing Until Stable (GEN-AUTO-12) [new] Automated Testing as Silver Bullet (GEN-AUTO-13) [new] Incompatible Automation Tools (GEN-AUTO-14) [new] Testers Make Good Test Automation Engineers (GEN-AUTO-15) [new] 40

41 General Pitfalls Test Communication Inadequate Architecture or Design Documentation (GEN-COM-1) Inadequate Discrepancy Reports (GEN-COM-2) Inadequate Test Documentation (GEN-COM-3) Source Documents Not Maintained (GEN-COM-4) Inadequate Communication Concerning Testing (GEN-COM-5) Inconsistent Testing Terminology (GEN-COM-6) [new] Redundant Test Documents (GEN-COM-7) [new] 41

42 General Pitfalls Testing-as-a-Service (TaaS) [new] Cost-Driven Provider Selection (GEN-TaaS-1) [new] Inadequate Oversight (GEN-TaaS-2) [new] Inadequate Collaboration (GEN-TaaS-3) [new] Lack of Outsourcing Expertise (GEN-TaaS-4) [new] Inappropriate TaaS Contract (GEN-TaaS-5) [new] TaaS Provider Improperly Chosen (GEN-TaaS-6) [new] No or Late Delivery of Testing Assets (GEN-TaaS-7) [new] TaaS Vendor Obtains Vendor Lock (GEN-TaaS-8) [new] 42

43 General Pitfalls Requirements Tests as Requirements (GEN-REQ-1) [new] Ambiguous Requirements (GEN-REQ-2) Obsolete Requirements (GEN-REQ-3) Missing Requirements (GEN-REQ-4) Incomplete Requirements (GEN-REQ-5) Incorrect Requirements (GEN-REQ-6) Requirements Churn (GEN-REQ-7) Improperly Derived Requirements (GEN-REQ-8) Verification Methods Not Adequately Specified (GEN-REQ-9) Lack of Requirements Trace (GEN-REQ-10) Titanic Effect of Deferred Requirements (GEN-REQ-11) [new] Implicit Requirements Ignored (GEN-REQ-12) [new] 43

44 General Pitfalls Test Data [new] Inadequate Test Data (GEN-DATA-1) [moved from other category] Poor Fidelity of Test Data (GEN-DATA-2) [new] Production Data in Test Data (GEN-DATA-3) [new] Test Data in Production Data (GEN-DATA-4) [new] Excessive Test Data (GEN-DATA-5) [new] 44

45 A Taxonomy of System & Software Testing Types Taxonomy Test-Type-Specific Pitfalls 17 November 2015 © 2015 Carnegie Mellon University Distribution Statement A: Approved for Public Release; Distribution is Unlimited

46 Categories of Testing Pitfalls Test-Type-Specific Pitfalls: 1 Executable Model Testing [new], 2 Unit Testing, 3 Integration Testing, 4 Specialty Engineering Testing, 5 System Testing, 6 User Testing [new], 7 A/B Testing [new], 8 Acceptance Testing [new], 9 Operational Testing [new], 10 System of Systems (SoS) Testing, 11 Regression Testing 46

47 Test Type Specific Pitfalls Executable Model Testing [new] Inadequate Executable Models (TTS-MOD-1) [new] Executable Models Not Tested (TTS-MOD-2) [new] 47

48 Test Type Specific Pitfalls Unit Testing Testing Does Not Drive Design and Implementation (TTS-UNT-1) Conflict of Interest (TTS-UNT-2) Untestable Units (TTS-UNT-3) [new] Brittle Test Cases (TTS-UNT-4) [new] No Unit Testing (TTS-UNT-5) [new] Unit Testing of Automatically Generated Units (TTS-UNT-6) [new] 48
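
Brittle Test Cases (TTS-UNT-4) is worth a concrete illustration. In the hypothetical sketch below (format_report is an invented unit), the first test is coupled to incidental formatting and breaks on any harmless wording change, while the second asserts only the behavior the unit is required to provide:

    def format_report(errors):
        """Unit under test, invented for illustration."""
        return f"{len(errors)} error(s) found: " + ", ".join(sorted(errors))

    def test_brittle():
        # Brittle: pins the exact wording and ordering, so any cosmetic change
        # breaks this test even though the unit still behaves correctly.
        assert format_report(["b", "a"]) == "2 error(s) found: a, b"

    def test_robust():
        # Robust: asserts the required behavior (count reported, every error
        # listed) without pinning incidental formatting.
        report = format_report(["b", "a"])
        assert report.startswith("2")
        assert "a" in report and "b" in report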

49 Test Type Specific Pitfalls Integration Testing Integration Decreases Testability Ignored (TTS-INT-1) Inadequate Self-Testing (TTS-INT-2) Unavailable Components (TTS-INT-3) System Testing as Integration Testing (TTS-INT-4) 49

50 Test Type Specific Pitfalls Specialty Engineering Testing Inadequate Capacity Testing (TTS-SPC-1) Inadequate Concurrency Testing (TTS-SPC-2) Inadequate Configurability Testing (TTS-SPC-3) [new] Inadequate Interface Standards Compliance Testing (TTS-SPC-4) [new] Inadequate Internationalization Testing (TTS-SPC-5) Inadequate Interoperability Testing (TTS-SPC-6) Inadequate Performance Testing (TTS-SPC-7) Inadequate Portability Testing (TTS-SPC-8) Inadequate Reliability Testing (TTS-SPC-9) Inadequate Robustness Testing (TTS-SPC-10) Inadequate Safety Testing (TTS-SPC-11) Inadequate Security Testing (TTS-SPC-12) Inadequate Usability Testing (TTS-SPC-13) 50

51 Test Type Specific Pitfalls System Testing Test Hooks Remain (TTS-SYS-1) Lack of Test Hooks (TTS-SYS-2) Inadequate End-to-End Testing (TTS-SYS-3) 51

52 Test Type Specific Pitfalls User Testing [new] Inadequate User Involvement (TTS-UT-1) [new] Unprepared User Representatives (TTS-UT-2) [new] User Testing Merely Repeats System Testing (TTS-UT-3) [new] User Testing is Mistaken for Acceptance Testing (TTS-UT-4) [new] Assuming Knowledgeable and Careful Users (TTS-UT-5) [new] User Testing Too Late to Fix Defects (TTS-UT-6) [new] 52

53 Test Type Specific Pitfalls A/B Testing [new] Poor Key Performance Indicators (TTS-ABT-1) [new] Misuse of Probability and Statistics (TTS-ABT-2) [new] Confusing Statistical Significance with Business Significance (TTS-ABT-3) [new] Error Source(s) Not Controlled (TTS-ABT-4) [new] System Variant(s) Changed During Test (TTS-ABT-5) [new] 53
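
Misuse of Probability and Statistics (TTS-ABT-2) and Confusing Statistical Significance with Business Significance (TTS-ABT-3) can be made concrete with a worked sketch. The numbers below are invented: with a million users per variant, a tiny lift is statistically significant, yet whether it matters to the business is a separate judgment for the KPI owners.

    from math import sqrt, erf

    def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
        """Two-sided z-test for the difference between two conversion rates."""
        p_a, p_b = conv_a / n_a, conv_b / n_b
        pooled = (conv_a + conv_b) / (n_a + n_b)
        se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
        z = (p_b - p_a) / se
        p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # normal approximation
        return p_b - p_a, z, p_value

    # Hypothetical A/B data: 2.00% vs. 2.06% conversion, 1,000,000 users per variant.
    lift, z, p = two_proportion_z_test(conv_a=20000, n_a=1_000_000,
                                       conv_b=20600, n_b=1_000_000)
    print(f"absolute lift = {lift:.4%}, z = {z:.2f}, p = {p:.4f}")
    # p is well below 0.05 (statistically significant), but the absolute lift is
    # only about 0.06 percentage points; business significance is a separate call.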

54 Test Type Specific Pitfalls Acceptance Testing [new] No Clear System Acceptance Criteria (TTS-AT-1) [new] Acceptance Testing Only Tests Functionality (TTS-AT-2) [new] Developers Determine Acceptance Tests (TTS-AT-3) [new] 54

55 Test Type Specific Pitfalls Operational Testing (OT) [new] No On-Site SW Developers (TTS-OT-1) [new] Inadequate Operational Testing (TTS-OT-2) [new] 55

56 Test Type Specific Pitfalls System of Systems (SoS) Testing Inadequate SoS Test Planning (TTS-SoS-1) Unclear SoS Testing Responsibilities (TTS-SoS-2) Inadequate Resources for SoS Testing (TTS-SoS-3) SoS Testing Not Properly Scheduled (TTS-SoS-4) Inadequate SoS Requirements (TTS-SoS-5) Inadequate Support from Individual System Projects (TTS-SoS-6) Inadequate Defect Tracking Across Projects (TTS-SoS-7) Finger-Pointing (TTS-SoS-8) 56

57 Test Type Specific Pitfalls Regression Testing Inadequate Automation of Regression Testing (TTS-REG-1) Regression Testing Not Performed (TTS-REG-2) Inadequate Scope of Regression Testing (TTS-REG-3) Only Low-Level Regression Testing (TTS-REG-4) Test Assets Not Delivered (TTS-REG-5) Only Functional Regression Testing (TTS-REG-6) Inadequate Retesting of Reused Software (TTS-REG-7) [new] Only Automated Regression Testing (TTS-REG-8) [new] Regression Tests Not Maintained (TTS-REG-9) [new] 57
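
The sketch below is purely illustrative of what avoiding Inadequate Automation of Regression Testing (TTS-REG-1) looks like at its simplest: a harness that reruns the same checks after every change and fails the build on any regression. In practice a framework such as pytest plus a continuous-integration job would play this role; the checks here are stand-ins.

    import traceback

    def test_login_name_normalization():
        assert "Admin ".strip().lower() == "admin"   # stand-in for a real check

    def test_report_totals():
        assert sum([1, 2, 3]) == 6                   # stand-in for a real check

    REGRESSION_SUITE = [test_login_name_normalization, test_report_totals]

    def run_regression_suite():
        """Rerun every check after each change; report and count any regressions."""
        failures = 0
        for test in REGRESSION_SUITE:
            try:
                test()
            except AssertionError:
                failures += 1
                traceback.print_exc()
        print(f"{len(REGRESSION_SUITE) - failures}/{len(REGRESSION_SUITE)} checks passed")
        return failures == 0

    if __name__ == "__main__":
        raise SystemExit(0 if run_regression_suite() else 1)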

58 A Taxonomy of System & Software Testing Types Conclusion 17 November 2015 © 2015 Carnegie Mellon University Distribution Statement A: Approved for Public Release; Distribution is Unlimited

59 Remaining Limitations and Questions Current Taxonomy is Experience Based: Based on experience testing and assessing testing programs (author, SEI ITAs, technical reviewers) Not the result of documentation study or formal academic research Remaining Questions: Which pitfalls occur most often? With what frequency? Top 10 most common? Which pitfalls cause the most harm? Top 10 with respect to harm? Which pitfalls have the highest risk (expected harm = frequency x harm)? What factors influence frequency, harm, and risk? System/software (size, complexity, criticality, application domain, software only vs. HW/SW/people/documentation/facilities/procedures, system vs. SoS vs. PL, development/test processes)? Project (type, formality, lifecycle scope, schedule, funding, commercial vs. government/military, ...) Organization (number, size, type, governance, management/engineering culture, ...) 59
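
The risk question at the end reduces to an expected-value calculation. A hypothetical worked example follows (the pitfall names come from this taxonomy, but the frequency and harm scores are invented for illustration):

    # risk (expected harm) = frequency of occurrence x harm when the pitfall occurs
    pitfalls = {
        "Inadequate Test Schedule":     {"frequency": 0.6, "harm": 8},
        "Wrong Testing Mindset":        {"frequency": 0.4, "harm": 9},
        "Test Lessons Learned Ignored": {"frequency": 0.7, "harm": 3},
    }
    ranked = sorted(pitfalls.items(),
                    key=lambda kv: kv[1]["frequency"] * kv[1]["harm"],
                    reverse=True)
    for name, p in ranked:
        print(f"{name}: risk = {p['frequency'] * p['harm']:.1f}")
    # Prints 4.8, 3.6, and 2.1 respectively; answering these questions for real
    # requires the survey data described on the next slide.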

60 Future Work SEI Wiki (currently under development) Extensive Technical Review: New testing pitfall categories New and modified testing pitfalls Proper Industry Survey: Design of Experiment (DoE) Creation of survey questions Size needed for statistical validity Analysis of test results Reporting of test results Simple Expert System: Inputs: answers to survey questions Outputs: Estimated highest risk pitfalls 60

61 Contact Information Donald G. Firesmith Principal Engineer Software Solutions Division Telephone: Web: U.S. Mail: Software Engineering Institute Customer Relations 4500 Fifth Avenue Pittsburgh, PA USA Customer Relations Telephone: SEI Phone: SEI Fax:


Use of the Architecture Tradeoff Analysis Method SM (ATAM SM ) in Source Selection of Software- Intensive Systems Use of the Architecture Tradeoff Analysis Method SM (ATAM SM ) in Source Selection of Software- Intensive Systems John K. Bergey Matthew J. Fisher Lawrence G. Jones June 2002 Architecture Tradeoff Analysis

More information

CMMI for Services (CMMI-SVC): Current State

CMMI for Services (CMMI-SVC): Current State : Current State Software Engineering Institute Carnegie Mellon University Pittsburgh, PA 15213 Eileen Forrester July 2012 What I will cover Explain what the CMMI-SVC is and why we built it Discuss service

More information

MSc Software Testing and Maintenance MSc Prófun og viðhald hugbúnaðar

MSc Software Testing and Maintenance MSc Prófun og viðhald hugbúnaðar MSc Software Testing and Maintenance MSc Prófun og viðhald hugbúnaðar Fyrirlestrar 25 & 26 The SWEBOK Chapter on Software Testing IEEE http://www.swebok.org/ 19/10/2005 Dr Andy Brooks 1 Repeat after Andy:

More information

Analyze, Design, and Develop Applications

Analyze, Design, and Develop Applications Analyze, Design, and Develop Applications On Demand Insurance Problems 1. We lose customers because we process new policy applications too slowly. 2. Our claims processing is time-consuming and inefficient.

More information

Requirements Engineering

Requirements Engineering Requirements Engineering Professor Ray Welland Department of Computing Science University of Glasgow E-mail: ray@dcs.gla.ac.uk The Importance of Requirements Identifying (some) requirements is the starting

More information

Practical Risk Management: Framework and Methods

Practical Risk Management: Framework and Methods New SEI Course! Practical Risk Management: Framework and Methods September 23-24, 2009 Arlington, VA Register at: www.sei.cmu.edu/products/courses/p78.html 1 13 th International Software Product Line Conference

More information

TEST REQUIREMENTS FOR GROUND SYSTEMS

TEST REQUIREMENTS FOR GROUND SYSTEMS SMC Standard SMC-S-024 30 September 2013 ------------------------ Supersedes: New issue Air Force Space Command SPACE AND MISSILE SYSTEMS CENTER STANDARD TEST REQUIREMENTS FOR GROUND SYSTEMS APPROVED FOR

More information

BASICS OF SOFTWARE TESTING AND QUALITY ASSURANCE. Yvonne Enselman, CTAL

BASICS OF SOFTWARE TESTING AND QUALITY ASSURANCE. Yvonne Enselman, CTAL BASICS OF SOFTWARE TESTING AND QUALITY ASSURANCE Yvonne Enselman, CTAL Information alines with ISTQB Sylabus and Glossary THE TEST PYRAMID Why Testing is necessary What is Testing Seven Testing principles

More information

National Aeronautics and Space Administration Washington, DC 20546

National Aeronautics and Space Administration Washington, DC 20546 Technical Standards Division Publication NASA-STD-2100-91 NASA Software Documentation Standard Software Engineering Program NASA-STD-2100-91 -91 Approved: July 29, 1991 National Aeronautics and Space Administration

More information

ORACLE ADVANCED FINANCIAL CONTROLS CLOUD SERVICE

ORACLE ADVANCED FINANCIAL CONTROLS CLOUD SERVICE ORACLE ADVANCED FINANCIAL CONTROLS CLOUD SERVICE Advanced Financial Controls (AFC) Cloud Service enables continuous monitoring of all expense and payables transactions in Oracle ERP Cloud, for potential

More information

Report of the Reliability Improvement Working Group (RIWG) Volume II - Appendices

Report of the Reliability Improvement Working Group (RIWG) Volume II - Appendices Report of the Reliability Improvement Working Group (RIWG) Volume II - Appendices Appendix 1 Formulate Programs with a RAM Growth Program II-1 1.1 Reliability Improvement Policy II-3 1.2 Sample Reliability

More information