Disciplined Software Testing Practices

Disciplined Software Testing Practices
Dr. Magdy Hanna, Chairman, International Institute for Software Testing
Sponsored by: International Institute for Software Testing

Practice #1: Requirements Are Crucial for Good Testing
- You can't test what you do not know.
- Users will always change their minds.
- Requirement precision and completeness are a must.
- The use of prototyping or RAD is NOT an excuse for not documenting requirements.
- Keep in mind: the GUI describes only high-level functional requirements; no details, no business rules, no quality requirements.
- Rule of thumb: users do not know what they want until they do not get it.

Practice #2: Test for Both Functional and Quality Requirements
- Although functional requirements seem to be most important to the user, most software disasters result from poor quality software.
- Quality requirements are the least understood requirements by both customers and developers.
- Although never talked about, quality requirements seem to be an assumed customer expectation.
- You can't achieve quality unless you define it. You can't test for quality unless you quantify it.

The Sixteen Fitness-for-Use Factors (1)
1. Configurability - the ability to configure the software for the user's convenience. Examples are changing the system-user interface to use certain graphical symbols or changing the default use of directories.
2. Correctness - the degree to which the software conforms to the user's functional requirements.
3. Efficiency - the amount of resources required by the system to perform its intended functionality. This includes processing time, memory, disks, or communication lines.
4. Expandability - the ability to change the software to add more functionality or to improve its performance. This deals with the perfective maintenance of the software.

The Sixteen Fitness-for-Use Factors (2)
5. Flexibility - the ability to change the software to function in a different environment. This includes working under a different database structure or executing in a context different from that considered in the original implementation. This deals with the adaptive maintenance of the software.
6. Integrity - the ability of the software to protect itself and its data from overt and covert access.
7. Interoperability - the ability of the software to exchange data with other software.
8. Maintainability - the ability to change the software to fix errors. This deals with the corrective maintenance of the software.

The Sixteen Fitness-for-Use Factors (3)
9. Manageability - the ability to properly manage the administrative aspects of the software. This includes resource allocation, configuration management, etc. It also deals with the availability of tools to support manageability.
10. Portability - the ability of the software to function on different platforms.
11. Reliability - the rate of failures in the software that make it unable to deliver its intended functionality.
12. Reusability - the ability to reuse portions of the software in other applications.

The Sixteen Fitness-for-Use Factors (4)
13. Safety - the ability of the software to perform its functionality without causing any unsafe conditions.
14. Survivability - the ability of the software to continue execution, even with degraded functionality, after a software or hardware failure.
15. Usability - the ease with which the software can be learned and used.
16. Verifiability - the ease with which it can be verified that the software is working correctly.

Practice #3: Adopt Model-Based Requirements
Natural language text is not a reliable way of specifying requirements:
- Ambiguous
- Vague
- Incomplete
- Difficult to communicate and understand
- Not easily testable

Models for Specifying Requirements
Use any of the following models to augment natural language text:
- Data Models
- Process Models
- Object Models
- State Models
- User Interface Models
- Decision Tables
- Decision Trees
- Use Cases
These will be covered in other courses!

Practice #4: Formally Define Test Cases
For each requirement, identify all possible scenarios:
- Consider input data, internal state, and external events.
- Use techniques such as Equivalence Classes, Cause and Effect Graphs, Decision Tables, Decision Trees, Use Cases, and State Models.
For each scenario, determine all test cases using techniques such as Boundary Value Conditions.
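
As a rough illustration of two of these techniques, the sketch below derives test cases for a single input field from its equivalence classes plus the boundary values of the valid class. The field ("age") and its valid range are assumptions invented for the example, not taken from the slides.

```python
# Hypothetical input field: an "age" value whose valid equivalence class is
# assumed to be 18..65. Neither the field nor the range comes from the slides.

def accepts_age(age: int) -> bool:
    """Stand-in for the system under test: accept ages in the valid class."""
    return 18 <= age <= 65

# One representative value per equivalence class ...
equivalence_cases = {
    "below valid class": (10, False),
    "inside valid class": (40, True),
    "above valid class": (80, False),
}

# ... plus boundary-value conditions at each edge of the valid class.
boundary_cases = {
    "just below lower bound": (17, False),
    "at lower bound": (18, True),
    "at upper bound": (65, True),
    "just above upper bound": (66, False),
}

for name, (value, expected) in {**equivalence_cases, **boundary_cases}.items():
    assert accepts_age(value) == expected, name
```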

Ad Hoc (Informal) Testing
Unplanned testing is bad if it is your only strategy!
- Monkey testing, beating the keys, the "just try a little of this" mentality.
- Can you accept accidental quality?
- What functional areas did you cover?
- What is the risk of shipping this build?
- What is the general quality of the SW?
Remember that one output of the test effort is gaining knowledge [a measure of the quality] of the SW.

Practice #5: Perform Positive and Negative Testing
- Positive testing is any activity aimed at evaluating an attribute or a capability of a system and determining that it meets its requirements. The expectation is that the input data is valid and will be processed through the normal paths.
- Negative testing is the process of executing a program or system with the intent of finding errors. The expectation is that the input data is invalid and will be processed through the error-handling paths.
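
A minimal sketch of the two expectations, using a made-up withdraw() function as the system under test; the function and its rule are assumptions for illustration only.

```python
# Hypothetical system under test: a withdraw() function with a simple rule.

def withdraw(balance: float, amount: float) -> float:
    """Return the new balance; route invalid requests to the error path."""
    if amount <= 0 or amount > balance:
        raise ValueError("invalid withdrawal amount")
    return balance - amount

# Positive test: valid input, expected to flow through the normal path.
assert withdraw(100.0, 30.0) == 70.0

# Negative test: invalid input, expected to be handled by the error path.
try:
    withdraw(100.0, 500.0)
except ValueError:
    pass  # the error path was taken, as expected
else:
    raise AssertionError("negative test failed: invalid input was accepted")
```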

Practice #6: Trace Requirements to Test Components (Test Cases or Test Scenarios)
[Slide shows a traceability matrix: rows are Test Case 1 through Test Case n, columns are Req 1 through Req 4; each cell where a test case covers a requirement is marked (X) and records Passed/Failed.]
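
One lightweight way to keep such a matrix is a plain mapping from requirements to the test cases that cover them. The sketch below uses invented requirement names, test-case names, and results, and adds two checks the matrix makes easy: uncovered requirements and requirements whose covering tests are failing.

```python
# Requirement-to-test-case traceability kept as a dictionary (all data invented).
traceability = {
    "Req 1": {"Test Case 1": "Passed", "Test Case 3": "Failed"},
    "Req 2": {"Test Case 2": "Passed"},
    "Req 3": {"Test Case 2": "Failed", "Test Case 4": "Passed"},
    "Req 4": {},  # no covering test case: a coverage gap
}

# Two questions the matrix answers immediately.
uncovered = [req for req, tests in traceability.items() if not tests]
failing = [req for req, tests in traceability.items()
           if any(result == "Failed" for result in tests.values())]

print("Requirements with no test case:", uncovered)
print("Requirements with failing tests:", failing)
```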

Practice #7: Trace Test Cases to Database Tables
[Slide shows a matrix: rows are Table 1 through Table n, columns are TC 1 through TC n; each cell records how the test case touches the table, e.g. R/U.]
Legend: C = Create, R = Read, U = Update, D = Delete.
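
A sketch of the same idea in code, with invented test cases, tables, and C/R/U/D codes; the check at the end reports tables that no test case touches at all.

```python
# Test-case-to-table traceability using the C/R/U/D codes from the slide.
crud_matrix = {
    "TC 1": {"Table 1": "U", "Table 2": "R"},
    "TC 2": {"Table 2": "C", "Table 3": "U"},
    "TC 3": {"Table 3": "R", "Table 6": "R/U"},
    "TC 4": {"Table 2": "C"},
}

all_tables = {f"Table {i}" for i in range(1, 7)}
touched = {table for operations in crud_matrix.values() for table in operations}

# Tables that no test case creates, reads, updates, or deletes.
print("Untouched tables:", sorted(all_tables - touched))
```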

Practice #8: Perform More Thorough Regression Testing
Base your regression test suite on:
- Impact analysis
- Risk analysis
Use of capture/playback tools can help.
Remember: current users of the SW can be economically harmed if the new version does not give them the functionality they already depend on!

Perform Impact Analysis
- When a requirement is changed, what components are likely to be affected?
- When a component is changed, what requirements need to be retested?
Impact analysis starts in development by programmers and continues during system testing.

Impact Analysis
[Slide shows a matrix: rows are Component 1 through Component n, columns are Req 1 through Req n; an X marks each requirement that a component affects.]
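
Kept as a mapping, the same matrix answers both questions from the previous slide directly. The component and requirement names below are invented for the example.

```python
# The impact matrix as a mapping from components to the requirements they affect.
impact = {
    "Component 1": {"Req 1", "Req 3"},
    "Component 2": {"Req 2", "Req 4"},
    "Component 3": {"Req 1", "Req 2", "Req 3"},
}

def requirements_to_retest(changed_component):
    """When a component is changed, which requirements need to be retested?"""
    return impact.get(changed_component, set())

def affected_components(changed_requirement):
    """When a requirement is changed, which components are likely affected?"""
    return {c for c, reqs in impact.items() if changed_requirement in reqs}

print(requirements_to_retest("Component 3"))  # {'Req 1', 'Req 2', 'Req 3'} (set order may vary)
print(affected_components("Req 2"))           # {'Component 2', 'Component 3'}
```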

Practice #9: Define Testing as a Process, NOT as a Lifecycle Phase
[Slide shows the lifecycle phases: Requirements, Design, Coding, ????, System Testing, Maintenance.]

The Common Sense Model for Software Development and Testing
[Diagram showing development and testing activities side by side: The Requirement Management Process; Project Planning; Test Planning; Test Design; Test Execution; Architectural and Detailed Design; Programming and Unit Testing; System Integration; Integrated Builds; Bug Reports; Product Release.]

Test Planning Activities
- Specify the test environment
- Specify scope and objectives
- Specify your approach to systems test
- Perform risk analysis
- Determine staffing requirements and responsibilities
- Determine any hardware, software, and network resources required for systems test
- Decide on tools to be used and determine training needs
- Define tasks and their sequence

Test Design Activities
- Define test scenarios and test cases
- Identify and create test data
- Determine expected results for each test
- Define test procedures to conduct each test
- Determine the test cycles and their sequence
- Write any programs or command control language needed to download data to the test environment
- Set up the test environment
- Write and test any programs or command control language needed to compare expected results with actual results
- Write and test any programs or command control language needed to conduct the tests
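
As a toy example of the comparison programs mentioned in this list, the sketch below diffs an expected-results file against an actual-results file line by line; the function name and file-name arguments are placeholders, not part of the original material.

```python
from pathlib import Path

def compare_results(expected_file, actual_file):
    """Compare expected results with actual results and list mismatches."""
    expected = Path(expected_file).read_text().splitlines()
    actual = Path(actual_file).read_text().splitlines()
    mismatches = []
    for line_no, (exp, act) in enumerate(zip(expected, actual), start=1):
        if exp != act:
            mismatches.append(f"line {line_no}: expected {exp!r}, got {act!r}")
    if len(expected) != len(actual):
        mismatches.append(f"line count differs: expected {len(expected)}, got {len(actual)}")
    return mismatches  # an empty list means expected and actual results match
```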

Practice #10: Select Tools to Support Your Test Process
- Test planning tools
- Test management tools
- Test case design tools
- Test coverage tools
- Test execution and record/playback tools
- Static analysis tools
- Defect tracking tools

Practice #11: Perform Both Dynamic and Static Testing
- Dynamic testing is the process of executing a program or system with the intent of finding errors or making sure that the system meets its intended requirements.
- Static testing is any activity that aims at finding defects by inspecting, reviewing, and analyzing any static form of the software (requirement models, design models, code, test plans, and test design specifications).

Balance Between Dynamic and Static Testing
[Chart contrasting defects found during static testing with defects found during dynamic testing.]
Many defects that are normally left for dynamic testing can be found early through static testing.

Practice #12: Continuing Formal Education
- Testing SW is an engineering discipline.
- On-the-job training can only go so far.
- Tester certification is one way: Certified Software Testing Professional (CSTP)
- Covers the areas a well-rounded tester must know
- Develop a sense of professionalism
- Help change the perception of testers as non-professionals
- Meet tomorrow's challenges
See the body of knowledge document in the appendix.

CSTP Body of Knowledge
1. Principles of Software Testing
- Levels of testing
- Testing client/server applications
- Testing Internet and web applications
- Testing object-oriented applications
- Testing embedded systems
- The testing life cycle
2. Test Design
- Code-based test case design techniques
- Requirement-based test case design techniques
- Test design specification

CSTP Body of Knowledge
3. Managing the Testing Process
- Planning
- Scheduling
- Reporting
- Resources
- Risk management
- Measuring and improving the test process

CSTP Body of Knowledge
4. Test Execution and Defect Tracking
- Test scripting
- Reporting
- Defect tracking
5. Requirement Definition, Refinement and Verification
- Writing testable requirements
- Exploring requirements
- Refining requirements
- Defining requirements
- Requirement verification
- Requirement traceability

CSTP Body of Knowledge
6. Test Automation
- Tool evaluation and selection
- Architectures
- Automation standards and guidelines
- Planning the test automation process
- Automation team roles
7. Static Testing (Inspections, Reviews, and Walkthroughs)
- Types of static testing
- The process of static testing
- Defect data analysis
- Improving the process