Testing for Systems Validation - CONCEPT HEIDELBERG

CONCEPT HEIDELBERG
GMP Compliance, January 16-17, 2003, Istanbul, Turkey

Testing for Systems Validation
Dr.-Ing. Guenter Generlich
guenter@generlich.de

Testing: Agenda
- Techniques
- Principles
- Levels
- Types
- Data
- Tools
- Mistakes
- Planning
- Costs

Goals of Testing
- Show that a system meets its requirements and specifications
- Find and eliminate defects (not just bugs)
- Find differences between the actual system and its models
- Determine appropriate reliability of the system
- Decide when to release the system in a compliant state
- Use as few resources as possible
... and for those who like an abstract definition:
- Find cases where the program does not do what it is supposed to do
- Find cases where the program does things it is not supposed to do

Testing Is Not
- Proving that the system is error-free
- Proving that the system will work without errors
- Establishing that the system performs its functions correctly
- Establishing with confidence that the software does its job fully

Goals of Testing?

Once Again: The V-Diagram
[V-model diagram spanning Specification and Design, Build and Test, and Release and Operation (the operative environment), with ongoing evaluation. Each specification level is verified by a corresponding test level:
- Requirements <-> Acceptance Testing
- Functional Specification <-> System Testing
- Technical Specification <-> Integration Testing
- Program Development <-> Unit Testing]

Testing - Debugging - Verification
[Lifecycle diagram: System Requirements -> Software Requirements Analysis -> Design -> Coding -> Production / Deployment; verification and static testing accompany analysis and design, while testing by execution and debugging accompany coding through deployment.]

Good Testing Practices (GTP)
- Test Procedure
- Test Planning
- Test Scheduling
- Test Documentation
- Test Execution
- Test Recording
- Deviations
- Calibrated Tools

Test Procedure (GTP 1)
Tests are executed according to a pre-defined and pre-approved test procedure (or protocol).
The test procedure
- is established on the basis of the requirements and/or appropriate system / equipment specifications
- refers to the relevant specifications
- should enable repetition of the test
- should be based on formal and named documents held under version control according to Good Documentation Practice

Timing of Testing Tasks (GTP 2)
Testing should not start before the test procedure has been approved.
All test-related dates must be logically consecutive, e.g.:
- test procedure accepted: 2 May
- test execution accepted: 15 May
- test evaluation accepted: 1 June

Test Planning (GTP 3)
- Tests should cover all relevant areas of the relevant equipment or system
- Tests should be planned and executed by persons qualified for the tasks they are executing, e.g.
  - technically skilled
  - knowing the equipment/system
  - trained in the requirements of GTP
- Testers should be as independent as possible (exception: acceptance tests)
- Do not plan testing assuming that there are no errors

Test Documentation (GTP 4)
The test documentation should
- include name, position and date for authors, reviewers and approvers
- include the test procedure, described in sufficient detail
- show date and signature on each test by the tester and witness or reviewer
- be retained and properly archived

Test Execution (GTP 5)
- There should be pre-determined acceptance criteria or statements of expected results for each test
- During execution, test results should be recorded directly onto the test results sheet, or refer to printouts or computer-generated test execution files (e.g. screen printouts)
- Each test should be concluded with a statement of whether the test met its acceptance criteria
- Test execution should be audited on at least a sample basis by either the user representative or the supplier quality assurance function

Test Recording (GTP 6)
- Manual test recording should use permanent ink
- Shorthand notations such as tick marks should be avoided
- Actual values should be recorded where appropriate
- Any corrections should be crossed out with a single line, initialed and dated, with a brief explanation
- Correction fluid should not be used

Deviations (GTP 7)
- All deviations should be recorded and be traceable throughout correction and retest into final closure
- Deviation corrections may require regression testing to verify that the corrections did not introduce new problems in other tested areas

Calibrated Tools (GTP 8)
- Any critical instrument inputs and any test equipment should be calibrated, with documented evidence of such calibrations, traceable to international standards
- Calibration equipment should be certified, traceable to national standards, and referenced

Some Types of System Defects
- Software bugs
- Wrong algorithm
- Wrong functionality
- Missing design or functionality
- Interface-specific: internal, external
- Others: weak requirements, missing acceptance criteria, usability problems, ...

Testing Techniques
Black Box Testing
- Assesses how well a program meets the requirements
- Assumes the requirements are accepted
- Checks for missing or incorrect functionality
- Compares system results with predefined output
- Performance, stress, reliability, security
White Box Testing
- Reveals problems with the internal structure of a program
- Requires detailed knowledge of the structure
- Essentially path testing
- The structure can be tested even when the specification is vague or incomplete
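
As a minimal illustration of the two techniques (the discount function and its specification are invented for this sketch): the black-box tests compare outputs only against the specified results, while the white-box tests are derived from the internal branch structure so that every path is executed.

```python
# Hypothetical function under test (invented for this illustration).
def discount(total_cents):
    """Return the price in cents: 10% off orders of 10000 cents or more."""
    if total_cents >= 10000:
        return total_cents - total_cents // 10
    return total_cents

# Black-box view: compare outputs against the specified results,
# without any knowledge of the implementation.
def test_discount_against_specification():
    assert discount(20000) == 18000   # specified: 10% off large orders
    assert discount(5000) == 5000     # specified: no discount below 100.00

# White-box view: tests derived from the internal structure,
# chosen so that both branches of the if-statement are executed.
def test_discount_both_branches():
    assert discount(10000) == 9000    # boundary: takes the discount branch
    assert discount(9999) == 9999     # takes the no-discount branch
```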

Testing Techniques
[Diagram: black box (functional testing) compares inputs and outputs against a manual calculation, as in acceptance testing and system testing; white box (structural testing) tests all the little boxes (unit testing) and the linking paths (integration testing) individually.]

Testing the Requirements
Testing makes sense and/or becomes possible when the requirements are...
- business related
- verifiable and testable
- validatable
- complete and correct
- exact and unambiguous
- actual and/or future directed
- annotated and justified
- internally consistent
- prioritized and timed
- necessary and important
- below 100%
- uniquely identified
- traceable
- feasible
- manageable
- free of unwarranted design details
- modifiable
- implementation-free
- understood and accepted

Unit Testing
- A unit is an individual component, a function or a small library
- Small enough to test thoroughly
- Exercises one unit in isolation from others
- Easier to locate and remove bugs at this level of testing
- Structural testing in a test environment
- Done during code development/programming
- Designed, done and reviewed by the programmer
- White box

Integration Testing
- Units are combined and the module is exercised
- Units are tested together to check that they work together
- Focus is on the interfaces between units
- Shows feasibility of modules early on
- The tester needs to be unbiased and independent
- Caution: tends to reveal specification errors rather than integration errors
- White box with some black box
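
A minimal unit-testing sketch using Python's standard unittest module; the parse_temperature function is a hypothetical unit invented for illustration. One small unit is exercised in isolation, on normal and error inputs.

```python
import unittest

# Hypothetical unit under test: a single, small function.
def parse_temperature(text):
    """Convert a string such as '21.5 C' into a float in degrees Celsius."""
    value, unit = text.strip().split()
    if unit.upper() != "C":
        raise ValueError(f"unsupported unit: {unit}")
    return float(value)

class ParseTemperatureTest(unittest.TestCase):
    # One unit is exercised in isolation; no other components are involved.
    def test_normal_value(self):
        self.assertAlmostEqual(parse_temperature("21.5 C"), 21.5)

    def test_whitespace_is_tolerated(self):
        self.assertAlmostEqual(parse_temperature("  4.0 C  "), 4.0)

    def test_unsupported_unit_is_rejected(self):
        with self.assertRaises(ValueError):
            parse_temperature("70 F")

if __name__ == "__main__":
    unittest.main()
```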

System Testing
- The whole system (hardware, software, periphery, documentation, including manual parts) is tested in detail
- Verifies that the system correctly implements the specified functions
- Testers mimic the end use
- Independent testers, and formal approval by (another?) independent function (not developer, tester, or user)
- Ensures softer system features are accurately tested (performance, security, reliability, ...)
- Black box; alpha testing

Acceptance Testing
- The completed system is tested by end users
- More realistic test usage than in the system testing phase
- Confirms the system meets business/user requirements
- Determines whether the system is ready for deployment
- Performed in the productive environment
- Black box; beta testing

Extra Goals for Acceptance Testing
Provide confidence
- of reliability
- of (probable) correctness
- of detection (therefore absence) of particular faults

Testing of Standard Operating Procedures (SOPs)
- Business Change
- Documentation
- Archiving
- Change Management
- Testing
- Error Handling
- Job Description
- Training
- Ongoing / Periodic Review
- Logical & Physical Security
- Start-up & Shutdown
- Backup & Restore
- Maintenance & Repairs
- System Support incl. Help Desk
- Assessments & Auditing (internal and external)
- Service Level Agreements (SLAs)
All of these can be application independent!

Testing Rules
- Testing needs to be predefined and repeatable
- Complete testing is not possible and not required
- Balance test effort and expert judgement: system elements considered, degree of detail
- A programmer should not test their own code
- Thoroughly inspect the results of each test
- Special tests for each requirement, e.g. repeated key strokes
- Formal approval
- Base the test extent on the risks involved
- It is highly recommended to work in teams

Regression Testing
"Rerunning test cases which a program has previously executed correctly in order to detect errors spawned by changes or corrections made during software development and maintenance."
(FDA Glossary of Computerized System and Software Development Terminology)
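
The definition above can be made concrete with a minimal sketch: previously passed test cases are kept as plain data and rerun unchanged after every modification. All names here are invented for illustration; the commented usage at the end assumes the parse_temperature function from the unit-testing sketch earlier.

```python
# Hypothetical regression suite: previously passed cases are kept as data
# and rerun unchanged after every modification or bug fix.
REGRESSION_CASES = [
    # (input, expected output) pairs the old version handled correctly
    ("21.5 C", 21.5),
    ("0 C", 0.0),
    ("-10.25 C", -10.25),
]

def run_regression(function_under_test):
    """Rerun all recorded cases and collect any that no longer pass."""
    failures = []
    for text, expected in REGRESSION_CASES:
        actual = function_under_test(text)
        if actual != expected:
            failures.append((text, expected, actual))
    return failures

# After a bug fix, the whole list is executed again, e.g.:
#   failures = run_regression(parse_temperature)
#   assert not failures, f"regression detected: {failures}"
```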

Regression Testing (continued)
- Tests modified software
- Verifies that changes are correct and do not adversely affect other system components
- Selects existing test cases that are deemed necessary to validate the modification
- With bug fixes, four things can happen: three of them are bad

Types of Systems Testing (1)
- Facility: Does the system provide all the functions required?
- Communications: Are all communications links working?
- Performance: What are throughput rates and response times under peak and normal conditions?
- Volume: How much data can the system process normally/continuously?
- Load/Stress: What happens when we push the system with heavy loads to its limits?
- Configuration: (a) What (extreme/limit) configuration conditions exist? (b) Does the system work on all target hardware?
- Reliability: How reliable is the system over time with a typical workload?
- Serviceability: How maintainable is the system?

Types of Systems Testing (2)
- Compatibility: Are there areas where the system has incompatibilities?
- Recovery: (a) How do we handle hardware and software failures? (b) How well does the system recover from failure?
- Usability: (a) Is the system consistent during use? (b) Can the users use the system easily?
- Operations: Are the operators' instructions right, and can operators run the system without problems?
- Environment: How does the new system impact other, existing systems?
- Security: Can the system withstand attacks, or can we find ways to break security provisions?
- Endurance: Will the system continue to work for long periods?
- Storage: Are there any unexpected data storage issues?

Test Data
- Path analysis
- Boundary values
- Common error values
- "Impossible" values
- Random generation
- Actual data
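
These categories translate directly into table-driven test cases. The pytest-style sketch below uses an invented in_cold_chain_range check (accepting 2 to 8 degrees Celsius) to show boundary values next to common error and "impossible" values.

```python
import pytest

# Hypothetical range check used for illustration: accept 2..8 degrees C.
def in_cold_chain_range(temperature):
    return 2.0 <= temperature <= 8.0

# Boundary values: just inside and just outside each limit.
@pytest.mark.parametrize("value,expected", [
    (2.0, True), (1.9, False),     # lower boundary
    (8.0, True), (8.1, False),     # upper boundary
])
def test_boundary_values(value, expected):
    assert in_cold_chain_range(value) is expected

# Common error values and "impossible" values should all be rejected.
@pytest.mark.parametrize("value", [-273.16, 1e9, float("nan")])
def test_impossible_values_are_rejected(value):
    assert in_cold_chain_range(value) is False
```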

Test-Tools
Computer Aided Testing Tools (CATT)
- Test management: organization, structure and management of the various testing phases (e.g. planning, preparation, execution, consolidation and evaluation of results, error handling and rerun)
- Function and regression testing
- Load and performance testing
- New: web testing (function, load, links, ...)

Don't Forget
The following also require thorough testing:
- User and operations procedures
- Data conversion software and procedures
- Procedures for security, back-up and recovery, and all other SOPs
and:
- Involve the users as early as possible
- Train the users before testing
- Monitor the testing process
- Testing can never be complete for non-trivial programs

Metrics
A host of metrics can be used in testing. Three themes prevail:
- Product quality assessment: How many defects remain in the system?
- Risk management: What is the risk related to remaining defects?
- Test process improvement: How can we improve our testing results?

Classic Testing Mistakes
- Misunderstanding the role of testing
- Poor planning of the testing effort
- Personnel issues
- Poor testing methodology
- Too much technology
- (Mis)use of code coverage

Misunderstanding the Role of Testing
- The test team is responsible for assuring quality
- The purpose of testing is to find bugs
- Usability problems are not considered valid bugs
- No focus on an estimate of quality
- Reporting results without context

Poor Planning of the Testing Effort
- Starting testing too late
- Bias towards functional testing
- Incomplete configuration testing
- Delaying load and stress testing until the last minute
- No / incomplete testing procedures
- No / incomplete testing documentation
- Do not rely on beta testing
- Inflexible test plans

Personnel Issues
- Testing as a transitional job for new hires, or a haven for failed programmers
- No domain experts
- Requiring testers who understand functionality, programming and quality

Poor Testing Methodology
- Paying more attention to execution than to design
- Overly specified test design
- Not exploring irrelevant oddities
- Acceptance criteria vague, incomplete or missing
- Not ensuring that the product does not do what it is not supposed to do
- Poor bug reporting, poor error handling
- Failing to take notes for the next test effort

Too Much Technology
- Trying to automate all tests
- Expecting regression testing to find a lot of new bugs

(Mis)Use of Code Coverage
- Using coverage as a performance goal
- Selecting test cases based on coverage impact
- Abandoning coverage entirely
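
As a small invented illustration of the first point: the single test below executes every line of divide_ratio (100% statement coverage) yet never exercises the division-by-zero failure, so treating the coverage figure itself as the goal would give false confidence. Both names are hypothetical.

```python
# Hypothetical function with an unhandled failure case.
def divide_ratio(numerator, denominator):
    return numerator / denominator   # crashes when denominator is 0

# This one test reaches 100% line coverage of divide_ratio(),
# but the zero-denominator defect is never detected.
def test_divide_ratio():
    assert divide_ratio(10, 4) == 2.5
```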

Professional Testing
- Defined testing policies
- Test planning process
- Test lifecycle
- Test group
- Test process improvement group
- Test-related metrics
- Tools and equipment
- Controlling and tracking
- Product quality control
Systematic Testing / Structured Testing

Test Documentation
- Test plan: scope, approach, resources and schedule; updated when the system changes
- Test case specification: inputs, drivers / stubs and expected results; updated when the system changes
- Test incident report: results of each execution of each test; lists all failures discovered during tests
- Test summary report: evaluation of test results

Test Plan (Example: Acceptance Test)
- Scope and purpose
- Test environment
- Assumptions, exclusions, limitations
- Roles and responsibilities
- Test specifications
  - Data
  - Test cases
  - Expected results
  - Acceptance criteria
- Error handling
- Documentation
(see details at the end of the presentation)

Test Plan Checklist
- Are the goals of testing for the project defined?
- Is the scope of testing to be performed on the project defined? (e.g. which features to test, which features not to test, localization issues, usability issues)
- Are the testing techniques and strategies to be used on the project outlined?
- Are the artifacts to be produced by testing activities defined?
- Are the risks related to testing, and contingency plans for those risks, discussed?
- Are any assumptions which may affect the execution of the plan discussed?
- Are the entities or roles & responsibilities for testing described?
- Are the information resources upon which testing will depend outlined? (e.g. requirements specifications, design specifications)
- If appropriate, are the physical resources related to testing described? (e.g. test lab, software utilities, staffing, schedule)

Costs of a Defect
Cost factors for defect elimination, by the phase in which the defect is found:

  Phase              Relative cost
  Requirements       1x
  Design             3x
  Coding             15x
  Testing            45x
  After going live   200x

(after Barry Boehm, 1981 (!!), Software Engineering Economics)
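
Read as relative multipliers, the table implies that a defect which would cost, say, 100 units to remove at the requirements stage costs around 20,000 units once the system is live. A minimal sketch of that arithmetic (the base cost is an assumed example figure):

```python
# Illustrative calculation of the Boehm-style relative repair costs above.
# The base cost of 100 is an assumed figure for the example only.
base_cost = 100
multipliers = {
    "Requirements": 1,
    "Design": 3,
    "Coding": 15,
    "Testing": 45,
    "After going live": 200,
}
for phase, factor in multipliers.items():
    print(f"{phase:<18} {base_cost * factor:>7}")
# Requirements 100, Design 300, Coding 1500, Testing 4500, After going live 20000
```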