Introduction to software testing and quality process


Introduction to software testing and quality process. Automated testing and verification. J.P. Galeotti, Alessandra Gorla.

Engineering processes. Engineering disciplines pair construction activities with activities that check intermediate and final products. Software engineering is no exception: the construction of high-quality software requires both construction and verification activities.

Peculiarities of software. Software has some characteristics that make V&V particularly difficult: many different quality requirements; an evolving (and deteriorating) structure; inherent non-linearity; an uneven distribution of faults. Example: if an elevator can safely carry a load of 1000 kg, it can also safely carry any smaller load; but if a procedure correctly sorts a set of 256 elements, it may fail on a set of 255 or 53 or 12 elements, as well as on 257 or 1023.
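This non-linearity can be made concrete with a deliberately faulty, hypothetical sorting routine (not from the source): the fault hides in a boundary the loop never reaches, so tests that pass on some inputs give no guarantee at all for others.

```python
# Hypothetical bubble sort with a deliberate boundary fault: the inner loop
# stops one pair too early, so the last pair of elements is never compared.
def buggy_sort(items):
    result = list(items)
    for i in range(len(result)):
        # Deliberate fault: should be range(len(result) - i - 1)
        for j in range(len(result) - i - 2):
            if result[j] > result[j + 1]:
                result[j], result[j + 1] = result[j + 1], result[j]
    return result

print(buggy_sort([2, 1, 3]))  # [1, 2, 3] -- happens to be correct
print(buggy_sort([1, 3, 2]))  # [1, 3, 2] -- the fault surfaces
```

A physical system that tolerates a 1000 kg load tolerates every smaller load; this routine's success on one input says nothing about its behavior on a neighboring one.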

Impact of new technologies. Advanced development technologies can reduce the frequency of some classes of errors, but they do not eliminate errors, and new development approaches can introduce new kinds of faults. Example: memory management improved in Java with respect to C, but new problems arise from the use of polymorphism, dynamic binding, and private state in object-oriented software.

Variety of approaches. There are no fixed recipes: quality managers must choose and schedule the right blend of techniques to reach the required level of quality within cost constraints, designing a specific solution that suits the problem, the requirements, and the development environment.

Five Basic Questions. (1) When do verification and validation start, and when are they complete? (2) What particular techniques should be applied during development? (3) How can we assess the readiness of a product? (4) How can we control the quality of successive releases? (5) How can the development process itself be improved?

1: When do V&V start? When are they complete? Testing is not a (late) phase of software development, and the execution of tests is only a small part of the verification and validation process. V&V start as soon as we decide to build a software product, or even before, and they last far beyond product delivery: as long as the software is in use, to cope with evolution and adaptation to new conditions.

Early start: from the feasibility study. The feasibility study of a new project must take into account the required qualities and their impact on the overall cost. At this stage, quality-related activities include risk analysis, the measures needed to assess and control quality at each stage of development, assessment of the impact of new features and new quality requirements, and the contribution of quality control activities to development cost and schedule.

Long lasting: beyond maintenance. Maintenance activities include analysis of changes and extensions, generation of new test suites for the added functionalities, re-execution of tests to check for non-regression of software functionalities after changes and extensions, and fault tracking and analysis.

2: What particular techniques should be applied during development? No single A&T technique can serve all purposes. The primary reasons for combining techniques are: effectiveness for different classes of faults (example: analysis instead of testing for race conditions); applicability at different points in a project (example: inspection for early requirements validation); differences in purpose (example: statistical testing to measure reliability); and tradeoffs in cost and assurance (example: an expensive technique reserved for key properties).

[Figure: A&T activities across the development process. Planning and monitoring (identify qualities; plan acceptance, system, unit, and integration tests; monitor the A&T process) run alongside requirements elicitation and specification, architectural and detailed design, unit coding, integration and delivery, and maintenance. Verification of specifications covers validating specifications and analyzing/inspecting architectural design, detailed design, and code. Test-case generation covers system, integration, unit, and regression tests, with regression suites updated over time. Test execution and validation cover designing scaffolding and oracles; executing unit, integration, system, acceptance, and regression tests; analyzing coverage; and generating structural tests. Process improvement collects data on faults and analyzes faults to improve the process.]

3: How can we assess the readiness of a product? A&T during development aim at revealing faults, but we cannot reveal and remove all faults, and A&T cannot last indefinitely: we want to know whether products meet the quality requirements. We must specify the required level of dependability and determine when that level has been attained.

Different measures of dependability. Availability measures the quality of service in terms of running versus down time. Mean time between failures (MTBF) measures the quality of service in terms of time between failures. Reliability indicates the fraction of all attempted operations that complete successfully.
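The three measures can be written down directly; this sketch uses made-up operating data (all numbers hypothetical) just to show how they differ:

```python
# Three dependability measures computed from hypothetical operating data.

def availability(uptime_hours, downtime_hours):
    """Fraction of total time the service was running."""
    return uptime_hours / (uptime_hours + downtime_hours)

def mtbf(operating_hours, failure_count):
    """Mean time between failures: operating time per failure."""
    return operating_hours / failure_count

def reliability(successful_ops, attempted_ops):
    """Fraction of attempted operations that completed successfully."""
    return successful_ops / attempted_ops

# 990 h up, 10 h down, 4 failures, 9,990 of 10,000 operations succeeded.
print(availability(990, 10))     # 0.99
print(mtbf(990, 4))              # 247.5 hours
print(reliability(9990, 10000))  # 0.999
```

Note that the same system can score well on one measure and poorly on another: frequent but very short outages hurt MTBF far more than availability.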

Assessing dependability. Randomly generated tests follow an operational profile. Alpha test: tests performed by users in a controlled environment, observed by the development organization. Beta test: tests performed by real users in their own environment, performing actual tasks without interference or close monitoring.
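Random generation from an operational profile can be sketched as weighted sampling; the profile below (operation names and frequencies) is entirely hypothetical:

```python
import random

# Hypothetical operational profile: relative frequency of each user
# operation as observed (or estimated) in the field.
profile = {"lookup": 0.70, "insert": 0.20, "delete": 0.08, "rebuild": 0.02}

def generate_operations(n, seed=0):
    """Draw a random sequence of n operations weighted by the profile."""
    rng = random.Random(seed)  # fixed seed keeps the run reproducible
    ops = list(profile)
    weights = [profile[op] for op in ops]
    return rng.choices(ops, weights=weights, k=n)

# A workload statistically representative of real usage, suitable as
# input for estimating reliability before release.
workload = generate_operations(1000)
print(workload.count("lookup"))  # roughly 700 of 1000
```

Because the test mix mirrors field usage, the observed failure rate over such a workload approximates the reliability users will actually experience.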

4: How can we control the quality of successive releases? Software test and analysis does not stop at the first release. Software products operate for many years and undergo many changes: they adapt to environment changes and evolve to serve new and changing user requirements. Quality tasks after delivery include test and analysis of new and modified code, re-execution of system tests, and extensive record-keeping.

5: How can the development process itself be improved? The same defects are encountered in project after project. A third goal of the quality process is therefore to improve the development process itself: by identifying and removing weaknesses in the development process, and by identifying and removing weaknesses in test and analysis that allow faults to remain undetected.

A four-step process to improve fault analysis and the process: (1) define the data to be collected and implement procedures for collecting them; (2) analyze the collected data to identify important fault classes; (3) analyze selected fault classes to identify weaknesses in development and quality measures; (4) adjust the quality and development process.
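Steps 2 and 3 amount to simple tallying once fault data is collected; this sketch uses an invented fault log (all class and phase names hypothetical):

```python
from collections import Counter

# Step 1 output, invented for illustration: each record is
# (fault class, phase in which the fault was introduced).
fault_log = [
    ("null-dereference", "coding"),
    ("missing-requirement", "requirements"),
    ("null-dereference", "coding"),
    ("interface-mismatch", "design"),
    ("null-dereference", "coding"),
    ("off-by-one", "coding"),
]

# Step 2: identify the most important fault classes.
by_class = Counter(cls for cls, _ in fault_log)
print(by_class.most_common(1))  # null-dereference dominates this log

# Step 3: trace a selected class back to the phase that introduced it,
# pointing at the development activity to adjust in step 4.
by_phase = Counter(phase for cls, phase in fault_log if cls == "null-dereference")
print(by_phase)
```

The output of step 3 tells step 4 where to act: here, a coding-phase countermeasure (e.g., a checklist item or static check) would target the dominant class.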

Summary. The quality process has three different goals: improving the software product, assessing the quality of the software product, and improving the quality process. We need to combine several A&T techniques throughout the software process. A&T depend on the organization and the application domain, and cost-effectiveness depends on the extent to which techniques can be re-applied as the product evolves. Planning and monitoring are essential to evaluate and refine the quality process.

Dimensions and tradeoffs of testing and analysis techniques

Verification and validation. Validation: does the software system meet the user's real needs? Are we building the right software? Verification: does the software system meet the requirements specifications? Are we building the software right?

[Figure: validation relates the actual requirements to the system and includes usability testing and user feedback; verification relates the software specifications to the system and includes testing, inspections, and static analysis.]

Verification or validation depends on the specification. Example: elevator response. An unverifiable (but validatable) spec: "if a user presses a request button at floor i, an available elevator must arrive at floor i soon". A verifiable spec: "if a user presses a request button at floor i, an available elevator must arrive at floor i within 30 seconds".
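The difference is that the 30-second version maps directly onto an executable check, while "soon" does not; the function and timestamps below are hypothetical:

```python
# The verifiable spec "an available elevator must arrive within 30 seconds"
# becomes a checkable predicate; the vague "soon" cannot.
MAX_RESPONSE_SECONDS = 30  # the explicit bound is what makes it verifiable

def meets_spec(request_time, arrival_time):
    """True iff the elevator arrived within the specified bound."""
    return arrival_time - request_time <= MAX_RESPONSE_SECONDS

# Hypothetical observations from a test run: (request, arrival) timestamps.
observations = [(0, 12), (40, 95), (100, 128)]
violations = [obs for obs in observations if not meets_spec(*obs)]
print(violations)  # the 55-second response violates the 30-second bound
```

Whether 30 seconds is the *right* bound remains a validation question: only users can say whether it matches their real needs.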

[Figure: validation and verification activities across the development artifacts. Actual needs and constraints are validated against the delivered package by user acceptance (alpha, beta test); system specifications are checked by analysis/review and by system test at system integration; subsystem designs/specs are checked by analysis/review and by integration test (subsystem verification); unit/component specs are checked by module test (unit/component verification). Validation relies throughout on user review of external behavior as it is determined or becomes visible.]

You can't always get what you want. Correctness properties are undecidable: the halting problem can be embedded in almost every property of interest, so no decision procedure can take an arbitrary program and property and always answer pass/fail.

Getting what you need. Perfect verification of arbitrary properties, by logical proof or exhaustive testing, would require infinite effort; practical techniques trade off along three dimensions. Theorem proving: unbounded effort to verify general properties. Model checking: decidable but possibly intractable checking of simple temporal properties. Testing: optimistic inaccuracy, i.e., we may accept some programs that do not possess the property, because not all violations are detected; this is typical of testing techniques. Automated program analysis (e.g., data flow analysis): pessimistic inaccuracy, i.e., the analysis is not guaranteed to accept a program even when the program does possess the property being analyzed. Simplified properties: reduce the degrees of freedom by simplifying the property to check, allowing precise analysis of simple syntactic properties.

Summary. Most interesting properties are undecidable, so in general we cannot count on tools that work without human intervention. Assessing program qualities comprises two complementary sets of activities: validation (does the software do what it is supposed to do?) and verification (does the system behave as specified?). There is no single technique for all purposes: test designers need to select a suitable combination of techniques.

Some important principles

Main A&T Principles. General engineering principles: partition (divide and conquer), visibility (making information accessible), feedback (tuning the development process). Specific A&T principles: sensitivity (better to fail every time than sometimes), redundancy (making intentions explicit), restriction (making the problem easier).

Sensitivity: better to fail every time than sometimes. Consistency helps. A test selection criterion works better if every selected test provides the same result, i.e., if the program fails with one of the selected tests, it fails with all of them (a reliable criterion). Run-time deadlock analysis works better if it is machine independent, i.e., if a program that deadlocks when analyzed on one machine deadlocks on every machine.

Redundancy: making intentions explicit. Redundant checks can increase the chances of catching specific faults early or more efficiently. Static type checking is redundant with respect to dynamic type checking, but it can reveal many type mismatches earlier and more efficiently. Validation of requirement specifications is redundant with respect to validation of the final software, but it can reveal errors earlier and more efficiently. Testing and proofs of properties are redundant, but they are often used together to increase confidence.
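A small, hypothetical illustration of redundancy: the assertion below restates an intention the code should already guarantee, so a violation is caught at the point of insertion instead of surfacing later as a wrong search result.

```python
import bisect

def insert_sorted(seq, value):
    """Insert value into a sorted list, keeping it sorted.

    The assert is redundant with what bisect.insort already guarantees,
    but it makes the sortedness intention explicit and catches any
    violation (e.g., a caller passing an unsorted list) immediately.
    """
    bisect.insort(seq, value)
    assert all(seq[i] <= seq[i + 1] for i in range(len(seq) - 1)), \
        "sortedness invariant violated"
    return seq

print(insert_sorted([1, 3, 7], 5))  # [1, 3, 5, 7]
```

Like static versus dynamic type checking, the check adds no new capability; its value is moving fault detection earlier and closer to the cause.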

Partition: divide and conquer. Hard testing and verification problems can be handled by suitably partitioning the input space. Both structural and functional test selection criteria identify suitable partitions of code or specifications (partitions drive the sampling of the input space). Verification techniques fold the input space according to specific characteristics, grouping homogeneous data together and determining partitions.
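Functional partition-based test selection can be sketched on an invented example: a hypothetical discount rule whose specification splits the (unbounded) input space into three equivalence classes, each sampled by a representative plus its boundaries.

```python
# Hypothetical discount rule: totals under 100 get no discount,
# 100-499 get 5%, 500 and up get 10%.
def discount_rate(total):
    if total < 100:
        return 0.0
    elif total < 500:
        return 0.05
    return 0.10

# One representative per partition, plus the boundary values where
# faults cluster: six tests sample the entire input space.
cases = {0: 0.0, 99: 0.0, 100: 0.05, 499: 0.05, 500: 0.10, 1000: 0.10}
for total, expected in cases.items():
    assert discount_rate(total) == expected
print("all partition representatives pass")
```

Testing any other value inside a class adds little: the partition is chosen so that, with respect to the specification, all members of a class are treated alike.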

Restriction: making the problem easier. Suitable restrictions can reduce hard (unsolvable) problems to simpler (solvable) problems. A weaker spec may be easier to check: it is impossible (in general) to show that pointers are used correctly, but the simple Java requirement that pointers be initialized before use is easy to enforce. A stronger spec may also be easier to check: it is impossible (in general) to show that type errors do not occur at run time in a dynamically typed language, but statically typed languages impose stronger restrictions that are easily checkable.

Visibility: judging status. Visibility is the ability to measure progress or status against goals: X visibility is the ability to judge how we are doing on X. For example, schedule visibility asks "are we ahead or behind schedule?", and quality visibility asks "does quality meet our objectives?". It involves setting goals that can be assessed at each stage of development. The biggest challenge is early assessment, e.g., assessing specifications and design with respect to product quality. Visibility is related to observability. Example: choosing a simple or standard internal data format to facilitate unit testing.

Feedback: tuning the development process. Learning from experience: each project provides information to improve the next. Examples: checklists are built on the basis of errors revealed in the past; error taxonomies can help in building better test selection criteria; design guidelines can help avoid common pitfalls.

Summary. The discipline of test and analysis is characterized by six main principles: sensitivity (better to fail every time than sometimes), redundancy (making intentions explicit), restriction (making the problem easier), partition (divide and conquer), visibility (making information accessible), and feedback (tuning the development process). They can be used to understand the advantages and limits of different approaches and to compare different techniques.

Test and Analysis Activities within a Software Process

Software Qualities and Process. Qualities cannot be added after development: quality results from a set of inter-dependent activities, of which analysis and testing are crucial but far from sufficient. Testing is not a phase, but a lifestyle: testing and analysis activities occur from early requirements engineering through delivery and subsequent evolution. Quality depends on every part of the software process, so an essential feature of software processes is that test and analysis are thoroughly integrated and not an afterthought.

The Quality Process. The quality process is the set of activities and responsibilities focused primarily on ensuring adequate dependability, and also concerned with project schedule and product usability. The quality process provides a framework for selecting and arranging activities, considering interactions and trade-offs with other important goals.

Interactions and tradeoffs. Example: high dependability vs. time to market. For mass-market products, it is better to achieve a reasonably high degree of dependability on a tight schedule than ultra-high dependability on a much longer schedule. For critical medical devices, it is better to achieve ultra-high dependability on a much longer schedule than a reasonably high degree of dependability on a tight schedule.

Properties of the Quality Process. Completeness: appropriate activities are planned to detect each important class of faults. Timeliness: faults are detected at a point of high leverage (as early as possible). Cost-effectiveness: activities are chosen depending on cost and effectiveness; cost must be considered over the whole development cycle and product life, and the dominant factor is usually the cost of repeating an activity through many change cycles.

Planning and Monitoring. The quality process balances several activities across the whole development process, selects and arranges them to be as cost-effective as possible, and improves early visibility. Quality goals can be achieved only through careful planning, so planning is integral to the quality process.

Process Visibility. A process is visible to the extent that one can answer the question "how does our progress compare to our plan?" Example: are we on schedule? How far ahead or behind? The quality process has not achieved adequate visibility if one cannot gain strong confidence in the quality of the software system before it reaches final testing. For this reason, quality activities are usually placed as early as possible: test cases are designed at the earliest opportunity (not "just in time"), and analysis techniques are applied to software artifacts produced before actual code. This also motivates the use of proxy measures. Example: the number of faults in design or code is not a true measure of reliability, but we may count faults discovered in design inspections as an early indicator of potential quality problems.

A&T Strategy. The A&T strategy identifies company- or project-wide standards that must be satisfied: procedures required, e.g., for obtaining quality certificates; techniques and tools that must be used; documents that must be produced.

A&T Plan. A comprehensive description of the quality process that includes: objectives and scope of A&T activities; documents and other items that must be available; items to be tested; features to be tested and not to be tested; analysis and test activities; staff involved in A&T; constraints; pass and fail criteria; schedule; deliverables; hardware and software requirements; risks and contingencies.

Quality Goals. Quality goals comprise process qualities (visibility, ...) and product qualities. Product qualities divide into internal qualities (maintainability, ...) and external qualities; external qualities divide into usefulness qualities (usability, performance, security, portability, interoperability) and dependability (correctness, reliability, safety, robustness).

Dependability Qualities. Correctness: a program is correct if it is consistent with its specification; seldom practical for non-trivial systems. Reliability: the likelihood of correct function for some "unit" of behavior, relative to a specification and usage profile; a statistical approximation to correctness (100% reliable = correct). Safety: preventing hazards. Robustness: acceptable (degraded) behavior under extreme conditions.

Example of Dependability Qualities (traffic signal control). Correctness and reliability: let traffic pass according to the correct pattern and central scheduling. Robustness and safety: provide degraded function when possible; never signal conflicting greens.

Relations among Dependability Qualities. Reliable but not correct: failures can occur, but rarely. Robust but not safe: catastrophic failures can occur. Correct but not safe: the specification is inadequate. Safe but not correct: annoying failures can occur.

Analysis. Analysis includes manual inspection techniques and automated analyses. It can be applied at any development stage, and is particularly well suited to the early stages of specifications and design.

Inspection. Inspection can be applied to essentially any document: requirements statements, architectural and detailed design documents, test plans and test cases, and program source code. It may also have secondary benefits, such as spreading good practices and instilling shared standards of quality. However, it takes a considerable amount of time, and re-inspecting a changed component can be expensive, so it is used primarily where other techniques are inapplicable or do not provide sufficient coverage.

Automatic Static Analysis. More limited in applicability: it can be applied to some formal representations of requirements models, but not to natural-language documents. It is selected when available, because substituting machine cycles for human effort makes it particularly cost-effective.

Testing. Tests are executed late in development, but test design should start as early as possible. Early test generation has several advantages: tests are generated independently from the code, when the specifications are fresh in the minds of the analysts; the generation of test cases may highlight inconsistencies and incompleteness in the corresponding specifications; and the tests may serve as a compendium of the specifications for the programmers.

Improving the Process. Long-lasting errors are common, so it is important to structure the process for identifying the most critical persistent faults, tracking them to frequent errors, and adjusting the development and quality processes to eliminate those errors. Feedback mechanisms are the main ingredient of the quality process for identifying and removing errors.

Organizational factors. Different teams for development and quality? Separate development and quality teams are common in large organizations, while indistinguishable roles are postulated by some methodologies (extreme programming). Different roles for development and quality? Test designer is a specific role in many organizations; mobility of people and roles, by rotating engineers between development and testing tasks across projects, is a possible option.

Example of Allocation of Responsibilities. Allocating tasks and responsibilities is a complex job. Unit testing goes to the development team (it requires detailed knowledge of the code), but the quality team may control the results (structural coverage). Integration, system, and acceptance testing go to the quality team, but the development team may produce scaffolding and oracles. Inspections and walk-throughs go to mixed teams; regression testing to the quality and maintenance teams; process-improvement activities to external specialists interacting with all teams.

Allocation of Responsibilities and rewarding mechanisms: case A. Allocation of responsibilities: the development team is responsible for development, measured in LOC per person-month; the quality team is responsible for quality. Possible effect: the development team tries to maximize productivity without considering quality, and the quality team does not have enough resources for bad-quality products. Result: a product of bad quality and overall project failure.

Allocation of Responsibilities and rewarding mechanisms: case B. Allocation of responsibilities: the development team is responsible for both development and quality control. Possible effect: the problem of case A is solved, but the team may delay testing in favor of development, leaving too few resources for testing. Result: delivery of a not fully tested product and overall project failure.

Summary. Test and analysis are complex activities that must be suitably planned and monitored. A good quality process obeys some basic principles: visibility, early activities, and feedback. It aims at reducing the occurrence of faults, assessing the product's dependability before delivery, and improving the process.