Validation of Commercial Computerised Systems using a Single Life Cycle Document (Integrated Validation Document)


R. D. McDowall
McDowall Consulting, 73 Murray Avenue, Bromley, Kent, BR1 3DJ, UK

Summary

A risk-based approach to the validation of low risk commercially available computerised systems is described. To determine if validation is required, the business process automated by the system is assessed to see if it is regulated. If validation is required, then the Good Automated Manufacturing Practice (GAMP) software category is mapped against the impact of the records generated by the system to determine if full or reduced validation is required. If reduced validation is indicated, the use of a single integrated validation document is proposed and illustrated with two case study examples. The use of a Validation Master Plan (VMP) to facilitate this validation process is also presented and described. Copyright © 2009 John Wiley & Sons, Ltd.

Key Words: Computerised System Validation; Risk Management; Integrated Validation Document; Validation of Low Risk Commercial Computerised Systems

Introduction

Risk assessment and risk management have had a high profile since the introduction by the Food and Drug Administration (FDA) of the Good Manufacturing Practices (GMPs) for the 21st Century [1]. These topics have been extended with the publication of ICH Q10 [2] on quality systems in the pharmaceutical industry and ICH Q9 [3] on risk management.

*Correspondence to: R. D. McDowall, McDowall Consulting, 73 Murray Avenue, Bromley, Kent, BR1 3DJ, UK. rdmcdowall@btconnect.com

Risk assessment has been highlighted in computerised system validation (CSV) and implementation of electronic records and electronic signatures regulations (21 CFR Part 11) with the publication of the FDA Guidance for Industry on Part 11 Scope and Application in 2003 [4].
Risk assessment will soon be incorporated in European Union (EU) regulations with the release of the draft proposal of EU GMP Annex 11 [5]. There are several risk analysis methodologies available that could be used for computerised system validation. A number of these were reviewed as to their usefulness for CSV [6].

Qual Assur J 2009; 12,

One of the conclusions drawn was that one risk methodology does not fit all situations for computer validation and the prudent professional should select the best methodology applicable for the problem at hand. That paper also referred to a single integrated validation document for lower risk computerised systems that are available commercially [6].

The aims of this paper are to:

1. Present and discuss risk assessment and risk management to determine if using an integrated validation document is justified for validation of low risk commercial systems.
2. Discuss the role of a Validation Master Plan (VMP) in managing the validation process, in the context of the integrated validation document.
3. Discuss a system-based risk assessment methodology and its linkage with the VMP inventory.
4. Present an outline of the content of an integrated validation document and demonstrate its use in two case studies where this approach has been used.

The practical risk management approach outlined in this article is based upon leveraging what a vendor has already undertaken during development of the application and, where appropriate, the whole system (e.g. specification, design, testing, maintenance, calibration, any installation qualification (IQ) and operational qualification (OQ) tasks and product support). The vendor's work should be coupled with the activities that users perform during the operational lifetime (e.g. routine performance checks, in-house calibration/qualification coupled with routine and preventative maintenance). The objective is to provide a defendable answer to the question: are you in control? Whilst computer validation is one mechanism for control, calibration, instrument/equipment qualification and maintenance are others. It is the combination of these that will provide the overall control for systems validated using the integrated validation document.
This risk-based approach to validation of computerised systems was developed in GMP and Good Laboratory Practice (GLP) environments. The principles outlined in this paper have not yet been applied to a Good Clinical Practice (GCP) environment.

GAMP Version 5 Software Categories and Risk Management

The Good Automated Manufacturing Practice (GAMP) guidelines version 5 [7], released in early 2008, have refined the definitions of software; appendix M4 of the guide discusses the classification of software as follows:

Category 3 Non-Configured Products: This category, renamed from GAMP version 4 [8], includes off-the-shelf products that either cannot be configured to automate a business process or that are used out of the box with the default configuration. There is still some configuration to operate in a specific business environment (run-time configuration), but the basic operation of the software does not change.

Category 4 Configured Products: Configured software provides a means for users to modify the application function to fit a specific business process. The options to do this range from simple configuration (selection of options) and graphical interfaces through to a macro language. Where the latter mode of configuration is used, GAMP recommends that these modules be handled as Category 5 software.

Category 5 Custom Applications: This software is typically unique and developed to meet the specific needs of an organisation. As such, the risk associated with custom software is high, as the organisation is responsible for the whole of the life cycle.

Appendix M4 suggests that validation efforts for these three categories of software

should be focussed as follows: the greatest effort should be for category 5 software (custom), then category 4 software (configured) and lastly category 3 software (non-configured). This practical advice is simple risk management based on the nature of the software being used to automate a business process [7]. Note that these software categories are not self-contained boxes into which each software application can be easily classified. The categories are indicative of a continuum of software and judgement needs to be applied when deciding where to place an application [7]. The use of the software and the way it is or is not configured could mean that the same software is category 3 in one organisation and category 4 in another.

Category 3 Software

In previous versions of the GAMP guide [8], category 3 was entitled standard software; in version 5 it has been renamed non-configured software [7]. In some respects this could be a misnomer as there is normally some configuration, but category 3 software should have the following four characteristics:

1. It is a commercially available product.
2. The software works out of the box. There is no configuration to change how the software works, i.e. the application functions cannot be changed to meet the way a business process works.
3. The software can still be configured to enable it to operate in a specific environment (run-time configuration, see the discussion below).
4. It is supported and maintained by a vendor.

Run-time configuration is the key concept that needs to be discussed and defined further, as it distinguishes category 3 from category 4 software (items 2 and 3 above). Upon installation of a category 3 application, the software is capable of operating without any modification. Run-time configuration is the definition of items in the software to enable the system to operate within the installed environment.
Some typical run-time configuration parameters are the definition of users and user types for authorised individuals, entry of the department or company name into report headers, selection of units to present or report data, the default data storage location (either a local or network directory) and the default printers. What characterises these run-time parameters is that they do not impact the operation of the software. This is in contrast to the tools that typify category 4 software, where the actual operation of the software can be changed, e.g. implementation of electronic signatures or the use of the system as a hybrid. This means in practice that category 3 software is a relatively low risk application. Furthermore, in section 4 of the main body of GAMP 5 there is a simplified life cycle model that can be applied to this software category; it consists of just three phases: define user requirements, install the application with configuration, and verify against the user requirements [7]. Verification is used in GAMP 5 to cover both testing and checking that a requirement has been implemented. However, in this paper the author will split verification into:

Testing (e.g. user acceptance tests against documented requirements), and

Verification of a requirement during an activity or presence of a document (e.g. installation qualification, writing a procedure, training or application configuration).

This paper will describe an approach to validation of low-risk systems that incorporates the GAMP 5 principles detailed above but achieves this in a single integrated document. The aim is to reduce the time and effort needed to validate lower risk commercial systems and to enable their release to operational use faster than with conventional validation approaches. This would allow reallocation of resources to higher risk systems, where more time could be spent on validation.
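To make the distinction concrete, the run-time parameters listed above could be represented for a hypothetical category 3 instrument application as follows. This is an illustrative sketch only; the parameter names and values are invented, not taken from any specific product.

```python
# Hypothetical run-time configuration of a GAMP category 3 application.
# These parameters adapt the software to its installed environment;
# none of them changes how the software itself operates.
RUN_TIME_CONFIG = {
    "users": {"asmith": "analyst", "bjones": "administrator"},  # authorised individuals and user types
    "report_header": "QC Laboratory, Example Pharma Ltd",       # department/company name on reports
    "units": "mg/mL",                                           # units used to present or report data
    "data_directory": r"\\server\share\instrument01",           # default data storage location
    "default_printer": "LAB-PRINTER-01",
}

# By contrast, parameters like these would change the operation of the
# software and indicate category 4 (configured) territory instead.
FUNCTIONAL_CONFIG_EXAMPLES = [
    "electronic_signatures_enabled",
    "custom_calculation_macro",
]
```

The test of category membership is therefore not whether any configuration exists, but whether the configuration alters the application's behaviour.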

System Risk Assessment: Determining the Need and Extent of Validation

Before starting a discussion of the integrated validation document, it is essential to describe the risk assessment of each system to define the extent of validation required. This needs to be a structured and documented process. In essence the approach should be based around asking and answering two questions: Do I need to validate the system? If I do need to validate, how much validation is required? The overall process is shown in Figure 1, which shows how the two questions are linked.

Figure 1. Process flow covering system risk assessment and determination of the extent of validation: a Do I Need to Validate? decision leads either to no validation or to an Extent of Validation? decision, in which high risk systems receive full validation and low risk systems receive reduced validation.

System Risk Assessment: Do I Need To Validate?

The first stage of this process is to define what the system does. What is its intended purpose and what is the business process to be automated? This is important because if this is not done correctly, the rest of the process and the system will not be assessed properly. When starting this assessment, either prospectively or retrospectively, you need an understanding of how the system will/does work and what records will be/are created by the system. The intended use of the system must be documented and approved by the system owner and quality assurance (QA). The questions to ask to determine if a system needs to be validated have been derived from a

checklist produced by the Computer Validation Initiative Committee (CVIC) of the Society of Quality Assurance (SQA) [9]. Although this checklist was produced some years ago, the questions are still valid today and some of these are presented in Table 1.

Table 1. Questions to determine if a system should be validated or not [9]

Is the system involved in:
- Manufacture, storage, distribution, return, salvage or re-processing of drug product?
- Manufacture or storage of Active Pharmaceutical Ingredients (APIs)?
- Testing of drug product or API for formal release?
- Drug packaging or labeling?
- Drug shipment?
- Non-clinical laboratory studies intended for submission to or review by regulatory authorities?
- Clinical investigations or studies?
- Generation of, submissions to, or withdrawal of an Investigational New Drug Application (IND)?
- Generation of, submissions to, or withdrawal of a New Drug Application (NDA)?
- Training records or qualifications of personnel involved in the manufacture of drug product or API, or in the conduct of non-clinical or clinical laboratory studies?
Is the system used to backup or store records supporting any of the above, in electronic format?
Is the system used for the transfer of electronic records supporting any of the above from one GxP system to another?

The questions are closed and therefore only allow a YES or NO response. If all responses to the questions are NO, then validation of the system is not required and the system can be documented as such. It will be added to the system inventory within the Validation Master Plan (VMP) as one not requiring validation. However, the users should follow company procedures for installing and testing applications; this demonstrates control but may not involve the full documentation or testing, or even any QA review. It is important to understand that assessments do not remain static: reorganisations and mergers, as well as new software releases, can all affect the way that a system is used after an initial assessment has been made. Therefore, the prudent organisation will always recheck the assessments at key stages of a system's life cycle. However, if the answer to any of the questions is YES, then the system needs to be validated and we can move to the second stage of the risk assessment.

What is the Extent of Validation Activities?

This is always a difficult question, as the answer can always be prefaced with 'it depends'. However, the regulations are the key sources for finding out what needs to be done and for defining the next stage of the system risk assessment:

European Union GMP Annex 11 [10] states in clause 2: The extent of validation necessary will depend on a number of factors including the use to which the system is to be put, whether the validation is to be prospective or retrospective and whether or not novel elements are incorporated.

ICH Q7 GMP for Active Pharmaceutical Ingredients [11] states in §5.40: Computerized systems should be validated. The depth and scope of validation depends upon the diversity, complexity, and criticality of the computerized application. And in §5.42: Commercially available software that has been qualified does not require the same level of testing.

FDA General Principles of Software Validation [12] states in Section 6.1: How much validation is needed? The extent of validation evidence needed for such software depends upon the device manufacturer's documented intended use of that software. For example, a device manufacturer who chooses not to use all of the vendor supplied

capabilities of the software only needs to validate those functions that will be used and for which the device manufacturer is dependent upon the software results as part of production or the quality system.

From the statements in the regulations and guidance, the main factors that determine the extent of validation are:

Identify the Intended Use of the System: This is based upon knowledge of what the system will automate or, for a retrospective validation, what it currently automates. During its operation, the system will also create and manage records as evidence of its activity. It is these records that need to be identified and their impact determined as part of this assessment, as we shall discuss below.

Criticality of Records Generated by the System: Identify the impact of the records generated by the system according to the GAMP Good Practice Guide for Part 11 compliant electronic records and signatures [13].

Nature of the Software Used to Generate the Records: Here we need to identify the software by the GAMP software categories. This will typically be software category 3, 4 or 5 [7], discussed in the previous section of this paper.

Computer validation is undergoing a complete revision of approach. Traditionally, there has been a systems approach (top down) which looked at the software and the process that was automated to define the extent of validation. Publication of the FDA Part 11 Guidance [4] and the GAMP Good Practice Guide on a Risk-Based Approach for Compliant Electronic Records and Signatures [13] moves validation to a records-based approach (bottom up). Therefore, the impact of the records generated by a system determines the extent of the validation activities.
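The first-stage assessment described earlier is a simple gate: closed questions with YES/NO answers, where any YES means the system must be validated and all NO answers mean it is recorded in the VMP inventory as not requiring validation. A minimal sketch of that gate, with question keys abridged and invented for illustration:

```python
def validation_required(answers: dict) -> bool:
    """Stage 1 of the system risk assessment: the system needs
    validation if ANY checklist question (Table 1) is answered YES."""
    return any(answers.values())

# Abridged, illustrative answers to the closed questions of Table 1 [9].
answers = {
    "manufacture_or_storage_of_drug_product": False,
    "testing_for_formal_release": True,
    "clinical_investigations": False,
}

validation_required(answers)  # True: proceed to stage 2, the extent of validation
```

If the function returns False, the system is still listed in the VMP inventory, in the section for systems not requiring validation, so that the documented decision is preserved.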
The impact of each record produced by a computerised system is assigned to one of three categories [13]: high impact, medium impact and low impact. These will be used in the system risk assessment that determines the extent of validation for a specific system.

High impact records are defined as those which have a direct impact on product quality or patient safety. Examples of high impact records are stability data, batch release of drug product, adverse event reporting, or records that are submitted directly to the FDA or included in a regulatory submission (e.g. IND/Clinical Trial Exemption (CTX) or NDA/Product Licence Application (PLA)) [13].

Medium impact records are defined as having an indirect impact on product quality or patient safety. Typically these are used as supporting compliance evidence, such as validation records, training records and other material not directly submitted to the FDA [13].

Low impact records are defined as having negligible impact on product quality or patient safety and are typified by supporting data that are not direct evidence of compliance activities. An example could be a schedule of calibration, qualification or validation activities and the dates these were carried out [13].

As befits a regulated industry, there are more high impact records than medium and low ones combined. Note that when carrying out these assessments, in some cases high impact records (for example, electronic signatures [13]) can be associated with medium impact ones (e.g. validation records or electronic SOPs which have been generated and signed electronically).

During the review of this paper, an alternative classification of records produced in the pharmaceutical industry was identified. The CSV risk management paper by Siconolfi and Bishop [14] identifies and lists high, moderate and low risk records. There are some differences in classification between this approach and GAMP [13]: e.g. electronic signatures are high in GAMP regardless of the record signed but

depend on the record being generated by the system in the approach outlined by Siconolfi and Bishop; similarly, training records are medium and low respectively. These differences simply highlight that it is important for any company to develop its own documented and justified approach to identifying the records created and their impact.

Therefore, for the system risk assessment it is the combination of the impact of the records created by the system coupled with the software that created them that should be used to determine the overall validation approach and extent of work. Based upon this conclusion, we need a simple tool that takes both of these aspects into consideration: the impact of the records generated by the system and the nature of the software used to generate them. This is achieved by plotting the three levels of record impact versus the three GAMP software categories, as shown in Figure 2.

Full and Reduced Validation Approaches

In defining the extent of validation work necessary to meet the risk assessment, a simple yardstick already exists: a validation of a computerised system based on a V-model [7]. This will be described as a full validation in this paper. The extent of the activities and documented evidence will be written in the validation plan for an individual system, and the content will usually depend on the nature of the system in question (category 3, 4 or 5 [7]) and implicitly the business process automated. At the other end of the validation scale is a simple or reduced validation for low risk systems. The GAMP guide [7] outlines a three stage V-model for category 3 software. However, this can be condensed into a single integrated validation document in which all of the essential elements of a validation are found within one document rather than several. Thus we have reduced and full validation approaches that represent two extremes.
However, in devising this approach to validation there was a conceptual problem: would a middle road also be advisable? After some consideration, the author decided that just two approaches were desirable, as the full validation approach could be varied in extent and depth as defined in a validation plan for a system, as outlined above. The two approaches are defined now and are shown as Full and Reduced in Figure 2.

Figure 2. Decision matrix to determine if full or reduced validation is required:

GAMP Software Category 5: Full (low impact), Full (medium impact), Full (high impact)
GAMP Software Category 4: Reduced or Full (low impact), Full (medium impact), Full (high impact)
GAMP Software Category 3: Reduced (low impact), Reduced (medium impact), Reduced (high impact)

Full Validation: apply a V-model validation as described in the GAMP guide version 5 [7]. The validation plan for an individual project

should determine the complexity of the life cycle to be followed and the extent of documented evidence required to support the validation of the system. Thus further risk management can be applied to an individual validation project.

Reduced Validation: use a single integrated validation document to record the intended purpose of the system and the associated testing and verification activities [6].

As shown in Figure 2, the reduced validation approach can be applied to any GAMP software category 3 system that generates records, regardless of their impact. It could also be applied to some category 4 systems that are used to generate low impact records; however, this approach would need to be justified on a case-by-case basis and would not apply to all category 4 software. For example, a category 4 application with no or very simple configuration could use the integrated validation document. However, for category 4 software requiring more extensive configuration a conventional V-model is more appropriate.

Control via a Validation Master Plan (VMP)

Before a detailed discussion of the integrated validation document and its application, a key aspect of computer validation that needs to be addressed in this paper is the control of the work. In a full validation, control is provided by a validation plan that is specific to the project and details the tasks and the documented evidence to be produced. At the end of the validation a summary report presents the work done, the deviations from the plan and the actual evidence produced to support the validation. This approach is not used with the integrated validation document. Control for the integrated validation document is provided via a Validation Master Plan (VMP) that is either written specifically for computerised system validation or is a section of a more general VMP.
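The Figure 2 decision matrix reduces to a simple lookup: category 3 always qualifies for reduced validation, category 4 may qualify only where the records are low impact (with case-by-case justification), and everything else receives a full validation. A sketch of that logic, assuming the GAMP category and record impact have already been assessed and documented:

```python
def validation_approach(gamp_category: int, record_impact: str) -> str:
    """Figure 2 decision matrix: full or reduced validation.

    gamp_category: 3 (non-configured), 4 (configured) or 5 (custom) [7].
    record_impact: 'low', 'medium' or 'high' per the GAMP Good Practice
    Guide classification of electronic records [13].
    """
    if gamp_category == 3:
        return "reduced"          # reduced validation regardless of record impact
    if gamp_category == 4 and record_impact == "low":
        return "reduced or full"  # must be justified case by case
    return "full"                 # category 5, or category 4 with medium/high impact records

validation_approach(3, "high")  # 'reduced'
validation_approach(5, "low")   # 'full'
```

Encoding the matrix this way is only an aid to consistency; the documented, approved risk assessment remains the controlling record.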
The VMP may be supplemented, if required, by an SOP that describes the risk assessment and the use of the integrated document in more detail. The VMP is written using the format described in the PIC/S guidance [15] rather than EU GMP Annex 15 [16], as the former reference contains more description of a VMP than the latter.

Why Use a VMP?

A VMP provides the elements of validation control and a framework for both the risk assessment and an inventory of computerised systems and their status. This is necessary as the integrated validation document does not have these elements of control, due to the reduction of documents to a single instance. The background of a VMP has been discussed by the author in a recent publication [17] and will not be repeated here. However, some discussion of the role of a VMP is necessary, specifically in the context of the integrated validation document. Computerised system validation and any associated instrument qualification or calibration involves multidisciplinary tasks drawing on various professions and skill sets. Therefore, a VMP is a way to organise and manage these activities across several departments or within a facility. A VMP can present a company's approach to validation and, via the inventory, the systems under its remit, their current validation status, and the timescales for any planned or ongoing work. To quote the PIC/S guide [15]: The VMP should present an overview of the entire validation operation, its organizational structure, its content and planning. The core of the VMP being the list-inventory of the items to be validated and the planning schedule. In writing a VMP you will help management, QA, users, any validation personnel, and inspectors. What you need to do is define the scope of the document: what is in and what is out?

Defining the VMP Scope

This is where you need to be smart but also careful when interpreting the PIC/S document for computerised system validation. The guidance mentions manufacturing and process equipment, so you'll need to define the scope with a slant towards computerised systems. With such a flexible document, there is nothing to stop you having a laboratory section within it, or indeed a VMP for the laboratory. The scope of the document will be defined by a number of factors:

Physical boundaries: site, division, building, department, or laboratories. Indeed, a VMP for some activities such as 21 CFR Part 11 can go across sites.

Driver for the plan: This can be a manufacturing process, site computerised systems, or laboratory systems (not forgetting the army of spreadsheets that will be hidden around the network and local drives of the organisation).

IT infrastructure qualification [18]: This can be included or excluded depending upon the size of the operation. Typically this is excluded in larger organisations, as it is the responsibility of corporate IT. However, in smaller organisations it can be included within the scope of a VMP. My inclination is to exclude this and have IT take care of the work; however, this must be documented in the exclusions section of the document.

Defining the qualification, validation, and calibration activities that will contribute to the overall control over computerised systems.

All activities covering prospective and retrospective validation tasks and any qualification activities of IT systems within scope.

Just as important as defining what is within the scope of the VMP is stating what is excluded from it. Therefore, there must be a section stating what is excluded from the plan, together with a rationale:

Systems with no regulatory impact (excluded from validation but listed in the appropriate section of the inventory).
IT infrastructure (if under the separate responsibility of the IT group). Furthermore, in-house IT could be under a separate VMP, and out-sourced IT or hosted applications excluded from the scope, as these come under the terms of a separate Service Level Agreement (SLA) with the external companies.

A VMP is also a formal and concise document that must be reviewed and released by management and QA. The individuals who should sign the document will depend upon the scope of the VMP. For example, a VMP for a site may be authorised by the site head, whereas a VMP for a department could be released by the department head.

The VMP Inventory: Setup and Maintenance

The heart of the VMP is the systems inventory. The inventory needs to be linked intimately with the process flow in Figure 1. When a system is assessed as having no need for validation, it should be entered into the VMP inventory section that lists systems that are not to be validated. As the assessment progresses there are high and low risk systems, corresponding to full and reduced validation respectively. In addition, more information needs to be appended to each system:

Who is responsible for the system (identify the system owner)?

Where is it located (specific room location or department name(s))?

What does it do (ideally a short description of the function of the system that can be copied from the system risk assessment)?

What is the validation status (being implemented, validated, upgrading, and so forth)?
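The core inventory fields above could be captured as a simple record per system. The field names and example values here are illustrative only; they are not prescribed by the PIC/S guidance.

```python
from dataclasses import dataclass

@dataclass
class InventoryEntry:
    """One system in the VMP inventory (illustrative field names)."""
    system_name: str
    system_owner: str       # who is responsible for the system
    location: str           # specific room location or department name(s)
    description: str        # short function summary from the system risk assessment
    validation_status: str  # e.g. 'being implemented', 'validated', 'upgrading'

entry = InventoryEntry(
    system_name="Temperature logger",
    system_owner="J. Smith",
    location="QC Laboratory, Room 101",
    description="Records cold-store temperatures for GMP storage areas",
    validation_status="being implemented",
)

# On completion of the validation, the inventory status is updated:
entry.validation_status = "validated and operational"
```

Whatever the format (spreadsheet, database or document table), the point is that each system carries these core fields and that the status field is maintained as validations complete.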

More information can be added depending upon the organisation, but these are the core requirements for each system in the inventory. As each validation is completed, the inventory status needs to be changed from being implemented to validated and operational.

Integrated Validation Document

Many GAMP software category 3 systems are typically stand-alone and may only have a few users. To undertake a validation following the V-model outlined in GAMP version 4 [8] is too onerous and unrealistic. To ensure overall control of the system, the integrated validation is coupled closely with other control mechanisms such as the performance checks carried out before starting any normal work, calibration, qualification and/or maintenance.

Document Structure

The integrated validation document has the overall structure outlined in Table 2 and consists of the main features expected in any computer system validation:

System description.
Definition of user requirements.
Listing of calibration, qualification or maintenance activities that contribute to system control.
Traceability matrix.
Installation of the software and other components as necessary.
Test preparation including any assumptions, exclusions and limitations of the testing.
Testing of the system versus the requirements.
Collation of documented evidence from the testing.
Documentation and resolution of any test incidents.
Summary report and operational release statement.

As with all validation projects, understanding how a system works and is used by the business is the key to success. As with any controlled document, the integrated validation document must be reviewed and approved before any testing is undertaken. Post execution, there is an independent review to check that the testing phase has been carried out correctly and that the conclusions reached by the tester are correct.
The document then undergoes a QA review before final sign-off and release of the validated system for operational use.

Intended Use Requirements

Typically such a document is short and focuses on the intended use of a system and demonstrating that this has been fulfilled by the system. This is a literal interpretation of the US GMP regulations [19] covering equipment design, size and location, which state: Equipment used in the manufacture, processing, packing, or holding of a drug product shall be of appropriate design, adequate size, and suitably located to facilitate operations for its intended use and for its cleaning and maintenance. The document therefore contains only intended use requirements to define what the organisation will use the system for; there are no additional requirements and it is intentionally a focussed section. Where appropriate, there will be one or more requirements to define the adequate size of the system, but these need to be documented carefully as they depend on the nature of the system being validated. For example, in the first case study of the Time of Flight mass spectrometer, the adequate size was defined as 5 samples, as this was the maximum number of samples analysed at any one time. In contrast, the temperature logger system did not have adequate size defined, as the operation of the software would only allow the set up or data download of one data logger at a time.
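Because each intended use requirement is individually numbered and traced to where it is tested, verified or excluded (see Table 2), the requirements section behaves like a small traceability table. A sketch with invented requirement texts and test references, assuming a numbering scheme of R1, R2, and so on:

```python
# Illustrative intended use requirements with traceability, in the spirit
# of the integrated validation document. All texts and references below
# are invented examples, not taken from the case studies.
requirements = [
    {"id": "R1", "text": "Analyse a maximum of 5 samples per run (adequate size)",
     "traced_to": "UAT test procedure 2"},
    {"id": "R2", "text": "Access restricted to authorised users",
     "traced_to": "IQ step 3 (user account configuration)"},
    {"id": "R3", "text": "Electronic records protected from deletion",
     "traced_to": "UAT test procedure 4"},
]

# The traceability check: every requirement must be tested, verified,
# or explicitly excluded with a documented rationale.
untraced = [r["id"] for r in requirements if not r.get("traced_to")]
assert not untraced, f"Requirements without traceability: {untraced}"
```

This is the principle behind Table 2's combined requirements and traceability section: the matrix is not a separate document, but every numbered requirement still points at the place where it is demonstrated.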

Table 2. Structure of the integrated validation document (Section: Content)

Introduction: Identification of the system that is being validated. Linkage to the validation master plan and inventory.

System Description: A brief overview of the system and its main components, with a link to a log book in the referenced documents section where the initial components of the system and any updates are maintained.

Referenced Documents: A section listing any of the documents used to support the validation effort, including vendor qualification materials.

Intended Use Requirements & Traceability Matrix: Only essential requirements for the system are documented in this section, focussed on the intended purpose of the system, security, and electronic record integrity, preservation and protection. Also included are any parameters that need to be configured. Each requirement is individually numbered, which allows traceability to where it is tested, verified or excluded in the validation effort. Whilst the traceability matrix is not a separate document, its principles are included within the integrated validation document, as shown in Table 3.

Test Preparation & Testing: This section documents any preparation necessary to carry out the testing phase of the validation.

Assumptions, Exclusions and Limitations: Documentation of any issues with the way testing was designed and any limitations that this may have, together with the functions in the software that have been excluded from testing, with a rationale for each.

Test Personnel: Any individuals involved in executing or reviewing any portion of the testing phase must sign the document here. The printed name of the individual plus their initials and signature are logged before any work is undertaken.

Installation Qualification: If not performed by the supplier, then, using information from the user manual, a simple IQ procedure for the software can be documented and undertaken in this section. This will include checking that the software has been correctly installed, that any default administrator accounts have been either renamed or disabled, and that the software boots up acceptably. If required, the run-time configuration occurs here as well, after successful installation of the software.

User Acceptance Testing: Defines the overall test procedures for the system. Within each test procedure there are:
- Predefined test steps.
- Predefined expected results.
- A test log to record observed results and identify the tester.
- A link to test execution notes to document and resolve any problems.
- Collation of documented evidence (in both paper and electronic format).
- The acceptance criteria for the test procedure.

Test Execution Notes: A log of any problems encountered during the execution of the test phase. A problem may be a simple issue that can be resolved by the tester and confirmed by the reviewer, or it may need to be resolved by the system owner and quality assurance. Regardless of severity, the issue is documented and can be tracked until resolved.

Test Summary Log Proforma: A table records the overall result of each test procedure (e.g. pass or fail). It is a simple mechanism to document the whole testing effort in a single table and enable an auditor or inspector to see a snapshot of the testing.

Report and Release Statement: A simple statement at the end of the document allows the tester to state whether the system has passed and is released for operational use or not. A reviewer will countersign the release statement; it will be down to company policy whether QA will need to sign the completed document.

For all sections of the executed document, together with any documented evidence collected, a second person will act as a reviewer to check and countersign that they agree with the testing and release of the system for operational use.
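The test procedure elements listed under User Acceptance Testing (predefined steps, predefined expected results, a test log of observed results, execution notes and acceptance criteria) can be modelled as simple records. This is an illustrative sketch, not part of the paper: the class and field names are assumptions, the acceptance criterion (all steps must pass) is one common convention, and in practice a human reviewer judges whether an observed result meets the expected result rather than a string comparison.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class TestStep:
    number: str
    action: str            # predefined test step
    expected: str          # predefined expected result
    observed: str = ""     # recorded in the test log by the tester
    tester: str = ""

    def passed(self) -> bool:
        # Simplistic check; in reality the reviewer judges equivalence
        return self.observed != "" and self.observed == self.expected

@dataclass
class TestProcedure:
    name: str
    steps: List[TestStep] = field(default_factory=list)
    incidents: List[str] = field(default_factory=list)  # test execution notes

    def result(self) -> str:
        """Assumed acceptance criterion: every step must pass."""
        return "pass" if all(s.passed() for s in self.steps) else "fail"

# Hypothetical executed procedure with one failing step
proc = TestProcedure("8.2 Shipping simulation", [
    TestStep("1", "Activate logger with GO button", "Logger starts recording",
             observed="Logger starts recording", tester="AB"),
    TestStep("2", "Download data after run", "All points retrieved",
             observed="Download failed", tester="AB"),
])
if proc.result() == "fail":
    proc.incidents.append("Step 2: download failed; referred to system owner")
```

The point of the structure is that every observed result and every incident is tied to a numbered step and a named tester, which is exactly what the test summary log and execution notes sections of the document capture on paper.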

Another factor influencing the size of the document is whether installation of the software needs to be included and whether default accounts need to be disabled or deleted, as was the case with the second case study example.

Linking Requirements with their Testing or Verification

Traceability of user requirements is a US regulatory expectation [12]; if the proposed revision of EU GMP Annex 11 is adopted, it will become a European regulatory expectation as well [5]. Regardless of the regulatory requirements, traceability is a very useful tool in the validation armoury, as it enables anybody to see where a user requirement has been either tested or verified. Testing is typically undertaken in the user acceptance testing or performance qualification phase of the life cycle, but verification of a requirement can occur in any part of the life cycle (installation of components, writing SOPs, service level agreements, etc.), as shown in Figure 3.

A simplified traceability matrix has been incorporated into the intended use requirements section of the integrated validation document (as there are relatively few requirements in this section) and is presented in Table 3. This is a section from the second case study example and covers the data loggers used by the system; the requirements are written in a table format. Each requirement is individually numbered and stated, as shown in the first and second columns of Table 3; Word auto-numbering is used for the requirements to allow ease of use. The third column of Table 3 is the traceability matrix. Although contained in a single integrated document, the principles of a traceability matrix have been adapted so that each requirement is traced to where it will either be tested in the document or verified, either within the document or outside of it.
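In essence, a traceability matrix such as the third column of Table 3 is a mapping from requirement numbers to the place where each requirement is tested or verified, and a completeness check reduces to finding unmapped requirements. The sketch below is illustrative only (the function name is an assumption); the mapping values are taken from Table 3, where "8.2" denotes the test procedure in the document and "C" the vendor calibration certificate outside it.

```python
# Requirement number -> where it is tested or verified (values from Table 3)
trace_matrix = {
    12: "8.2", 13: "C", 14: "C", 15: "C", 16: "8.2",
    17: "8.2", 18: "8.2", 19: "8.2", 20: "8.2",
}

def untraced_requirements(req_ids, matrix):
    """Return requirement numbers with no test or verification reference."""
    return [r for r in req_ids if not matrix.get(r)]

# Requirements 12-20 are all traced, so no gaps are reported
gaps = untraced_requirements(range(12, 21), trace_matrix)
```

Running such a check before the document is approved gives the same assurance as a reviewer scanning the third column by eye: no intended use requirement is left without a test or verification reference.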
[Figure 3. Traceability of user requirements to later life cycle phases. Requirements in the User Requirements Specification are traced via the traceability matrix either to where they are verified (vendor audit, IT service level agreement, SOP, system log book, IT IQ/OQ, calibration, vendor IQ or OQ, software configuration) or to where they are tested (user acceptance or PQ test plan and its test scripts 1 to n).]

Table 3. An example of intended use requirements with traceability for TempTale version 4.2 (Req No | Function | Test Proc)

12 | Battery powered TempTale 4 data loggers are designed to record the temperature in the range 2°C to 8°C for 72 hours during bulk product shipment. | 8.2
13 | Data loggers will be batch calibrated by the vendor, documented by a calibration certificate. | C
14 | Each data logger will only be used once for a single delivery of bulk product and not reused. | C
15 | The accuracy of measurement will be ±0.55°C with a resolution of 0.1°C. | C
16 | Sampling time interval will be 5 minutes, to be set by the TTMD software at data logger activation. | 8.2
17 | The data logger memory collects up to 1920 data points over 72 hours. | 8.2
18 | After configuration by the TempTale Manager Desktop (TTMD) software, a data logger will be started by pressing the green GO button for three seconds. | 8.2
19 | Activated data loggers will be equilibrated at a refrigerated temperature for a minimum of minutes prior to use. | 8.2
20 | The data logger will be stopped by pressing the red STOP button for three seconds. | 8.2

Notes to Table 3: This section of the user requirements refers to the data logger used with the TempTale Manager Desktop (TTMD) version 4.2 software, which is discussed in more detail in the second case study example. The traceability matrix is a simple example, as there are only a few intended use requirements within the integrated validation document. There are two traceability references shown in Table 3. The first is C, against requirements 13 to 15 inclusive; this refers to traceability to the calibration certificate provided by the vendor with each batch of data loggers.
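As a quick consistency check on the figures in Table 3: a 5-minute sampling interval (requirement 16) over a 72-hour shipment (requirement 12) produces 864 readings, which fits comfortably within the stated 1920-point logger memory (requirement 17). The arithmetic can be sketched as:

```python
interval_min = 5        # requirement 16: 5-minute sampling interval
duration_h = 72         # requirement 12: 72-hour shipment
memory_points = 1920    # requirement 17: logger memory capacity

readings = duration_h * 60 // interval_min  # readings taken during shipment
assert readings == 864
assert readings <= memory_points  # memory is sufficient for the shipment
```

Such a check confirms that the individually stated requirements are mutually consistent before testing begins.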
The remaining requirements will be tested in the test procedure contained in section 8.2 of the document, which covers the shipping of material from a primary to a secondary manufacturing site, as outlined in the next section of this paper. As shown in Figure 3, traceability can also be to other phases of the life cycle, such as writing Standard Operating Procedures or installation of components. One element of traceability shown in Figure 3 that is not used is a vendor audit; as these systems are relatively low risk, a vendor audit will not be performed.

Case Study Examples

Two examples of computerised systems that have been assessed and validated using this approach will be discussed: one is from the laboratory area and one is from supply chain management; the latter example will be presented in more detail.

Time of Flight (TOF) Mass Spectrometer

The single-user instrument is used for elemental analysis (high resolution for molecular formula confirmation) on APIs for IND and NDA regulatory submissions, for impurity identification to support process chemistry, and to support discovery chemistry in the identification of unknown and known compounds. Therefore, the records generated by the system are high impact. However, the software used is commercial off the shelf (GAMP category 3); therefore, the overall risk assessment of the system is low, as shown in Figure 2, and it would be validated using an integrated document. Furthermore, the software was qualified by the vendor upon installation, regular pre-analysis calibration checks are carried out before using the system, and the system is regularly maintained by the vendor. Assessing a system that submits records to a regulatory agency as low risk might seem strange, but why not? Look at the work that

the instrument does and the overall control elements in addition to validation. It is the holistic approach, integrating computer validation, instrument calibration and the overall maintenance programme, that provides the control and risk mitigation for many laboratory systems.

Temperature Logging for Supply Chain Management

TempTale Manager Desktop (TTMD) is a de facto standard solution for monitoring the temperature of shipments between sites and is used widely within supply chain management across many industries. The intended use of this system was to monitor the shipment of bulk vaccine between the primary and secondary manufacturing sites. The shipment needed to comply with the requirements contained in two general chapters of the USP (2007) [20]:

- Chapter <1118> Monitoring Devices: Time, Temperature and Humidity (see the section on the use of electronic time-temperature history recorders).
- Chapter <1150> Pharmaceutical Stability (calculation of mean kinetic temperature).

The system risk assessment determined that this software category 3 system should be validated using the integrated validation document. There were 44 requirements for the system that were verified or tested with the following test procedures:

- Installation of the application.
- Creation of a new system administrator account and disabling of the default one.
- Access control.
- Initiating a data monitor.
- Downloading a data monitor.
- Data processing, including verification of the mean kinetic temperature calculation.
- Data storage and security, including the integrity of the files generated by the data loggers.

The work described here was completed within 30 calendar days to meet a production deadline of the organisation and took approximately 20 man-days of effort, illustrating the effective use of the integrated validation document. The size of the document was 45 pages, plus the documented evidence generated by the test procedures.
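One of the test procedures above verifies the mean kinetic temperature (MKT) calculation against USP Chapter <1150>. MKT can be recomputed independently from the logged temperatures using the standard formula MKT = (dH/R) / (-ln(sum(exp(-dH/(R*Tk))) / n)), with dH/R conventionally taken as 10000 K (83.144 kJ/mol divided by the gas constant). The sketch below shows what such an independent check could look like; it is not the paper's actual test script.

```python
import math

def mean_kinetic_temperature(temps_c, dh_over_r=10000.0):
    """MKT per USP <1150>: the single temperature that gives the same
    degradation as the observed series, with Arrhenius weighting.
    dh_over_r is delta-H/R in kelvin (83.144 kJ/mol / 8.3144 J/mol/K)."""
    temps_k = [t + 273.15 for t in temps_c]
    mean_arrhenius = sum(math.exp(-dh_over_r / t) for t in temps_k) / len(temps_k)
    return dh_over_r / (-math.log(mean_arrhenius)) - 273.15

# Illustrative profile spanning the 2-8 degC shipping range; MKT exceeds
# the arithmetic mean because higher temperatures are weighted exponentially.
profile = [2.0, 2.0, 8.0, 8.0]
mkt = mean_kinetic_temperature(profile)
```

Comparing such an independently computed value against the TTMD output is one way to satisfy the "verification of mean kinetic temperature calculation" element of the data processing test procedure.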
Summary

An easy-to-understand system-level risk assessment process for computerised systems is described that documents the intended use and the impact of the records generated by a system. This allows the use of an integrated validation document for the validation of simpler commercial systems, which is justified and described in detail. The key benefits of using the integrated validation document are the simplicity and speed of the validation: in many cases the work can be completed within four to six weeks. This allows validation resources to be focussed on more critical projects, where more work is required to manage risk effectively and to ensure the correct operation of systems.

Acknowledgements

I wish to thank the following: Chris Burgess, Director, Burgess Analytical Consultancy Limited, for review of and comments on the manuscript; Jürgen Dietrich, Head of Proprietary Information Services, Bayer HealthCare AG, Berlin, for constructive comments that helped to develop the decision matrix further for the integrated validation document; and Virginia Picot, Principal, Goldoak Scientific Limited, for contribution and collaboration in the development of the risk assessment process and the integrated validation document.

References

1. GMPs for the 21st Century, FDA.
2. International Conference on Harmonization, Q10 Pharmaceutical Quality Systems, step 4, 2008.

3. International Conference on Harmonization, Q9 Quality Risk Management, step 4, 2005.
4. FDA Guidance for Industry, Part 11 Scope and Application, 2003.
5. Proposed revision to EU GMP Annex 11, 2008.
6. McDowall RD. Qual. Assur. J. 2005;9.
7. Good Automated Manufacturing Practice Guide, Version 5, ISPE, Tampa, FL, 2008.
8. Good Automated Manufacturing Practice Guide, Version 4, ISPE, Tampa, FL, 2001.
9. Computer Validation Initiative Committee, Society of Quality Assurance.
10. European Union GMP, Annex 11.
11. International Conference on Harmonization, Q7 Good Manufacturing Practice for Active Pharmaceutical Ingredients.
12. FDA Guidance for Industry, General Principles of Software Validation, 2002.
13. GAMP Good Practice Guide, Compliant Part 11 Electronic Records and Signatures, ISPE, Tampa, FL.
14. Siconolfi RM, Bishop S. Drug Inform. J. 2007;41.
15. Pharmaceutical Inspection Convention/Pharmaceutical Inspection Co-operation Scheme guidance document PI-006, Recommendations on Validation Master Plan, Installation and Operational Qualification, Non-Sterile Process Validation and Cleaning Validation.
16. European Union GMP, Annex 15.
17. McDowall RD. Spectroscopy 2008;23(7).
18. GAMP Good Practice Guide: IT Infrastructure Compliance and Control, ISPE, Tampa, FL.
19. 21 CFR 211, Current Good Manufacturing Practice Regulations for Finished Pharmaceutical Products.
20. United States Pharmacopoeia, US Pharmacopoeia Inc., Rockville, MD, 2007.