UK Standards for Microbiology Investigations


Quality Assurance in the Diagnostic Virology and Serology Laboratory

Issued by the Standards Unit, Microbiology Services, PHE

Quality Guidance Q 2, Issue no: 6.2
Crown copyright 2013

Acknowledgments

UK Standards for Microbiology Investigations (SMIs) are developed under the auspices of Public Health England (PHE) working in partnership with the National Health Service (NHS), Public Health Wales and with the professional organisations whose logos are displayed below and listed on the website. SMIs are developed, reviewed and revised by various working groups which are overseen by a steering committee. The contributions of many individuals in clinical, specialist and reference laboratories who have provided information and comments during the development of this document are acknowledged. We are grateful to the Medical Editors for editing the medical content.

For further information please contact us at:

Standards Unit
Microbiology Services
Public Health England
61 Colindale Avenue
London NW9 5EQ
standards@phe.gov.uk

UK Standards for Microbiology Investigations are produced in association with:

Contents

Acknowledgments
Amendment Table
UK Standards for Microbiology Investigations: Scope and Purpose
Scope of Document
Introduction
1 External Quality Assurance
2 Internal Quality Assurance
3 Internal Quality Control
4 Equipment Monitoring
5 Audit
Appendix 1: Documentation Used in IQA
Appendix 2: Documentation Used in IQC
Appendix 3: Equipment Control Sheets
Appendix 4: Documentation and Data Handling Associated with Audit of Specimen Turnaround Time
Appendix 5: Statistics Used with IQC
References

Amendment Table

Each SMI method has an individual record of amendments. The current amendments are listed on this page. The amendment history is available from the website. New or revised documents should be controlled within the laboratory in accordance with the local quality management system.

Amendment No/Date: 8/
Issue no. discarded: 6.1
Insert issue no: 6.2
Section(s) involved: Whole document.
Amendment: Document has been transferred to a new template to reflect the Health Protection Agency's transition to Public Health England. Front page has been redesigned. Status page has been renamed as Scope and Purpose and updated as appropriate. Professional body logos have been reviewed and updated. Minor textual changes throughout. Scientific content remains unchanged.

Amendment No/Date: 7/
Issue no. discarded: 6
Insert issue no: 6.1
Section(s) involved: Whole document. Glossary of Terms and Relevant NSM sections. References.
Amendment: Q 2 formerly QSOP 27. Document presented in a new format. Glossary of Terms and Relevant NSM sections removed. Some references updated.

UK Standards for Microbiology Investigations#: Scope and Purpose

Users of SMIs

SMIs are primarily intended as a general resource for practising professionals operating in the field of laboratory medicine and infection specialties in the UK. SMIs provide clinicians with information about the available test repertoire and the standard of laboratory services they should expect for the investigation of infection in their patients, as well as providing information that aids the electronic ordering of appropriate tests. SMIs provide commissioners of healthcare services with information about the appropriateness and standard of microbiology investigations they should be seeking as part of the clinical and public health care package for their population.

Background to SMIs

SMIs comprise a collection of recommended algorithms and procedures covering all stages of the investigative process in microbiology, from the pre-analytical (clinical syndrome) stage to the analytical (laboratory testing) and post-analytical (result interpretation and reporting) stages. Syndromic algorithms are supported by more detailed documents containing advice on the investigation of specific diseases and infections. Guidance notes cover the clinical background, differential diagnosis, and appropriate investigation of particular clinical conditions. Quality guidance notes describe laboratory processes which underpin quality, for example assay validation.

Standardisation of the diagnostic process through the application of SMIs helps to assure the equivalence of investigation strategies in different laboratories across the UK and is essential for public health surveillance, research and development activities.

Equal Partnership Working

SMIs are developed in equal partnership with PHE, NHS, Royal College of Pathologists and professional societies. The list of participating societies may be found on the website. Inclusion of a logo in an SMI indicates participation of the society in equal partnership and support for the objectives and process of preparing SMIs. Nominees of professional societies are members of the Steering Committee and Working Groups which develop SMIs. The views of nominees cannot be rigorously representative of the members of their nominating organisations nor the corporate views of their organisations. Nominees act as a conduit for two-way reporting and dialogue. Representative views are sought through the consultation process. SMIs are developed, reviewed and updated through a wide consultation process.

# Microbiology is used as a generic term to include the two GMC-recognised specialties of Medical Microbiology (which includes Bacteriology, Mycology and Parasitology) and Medical Virology.

Quality Assurance

NICE has accredited the process used by the SMI Working Groups to produce SMIs. The accreditation is applicable to all guidance produced since October. The process for the development of SMIs is certified to ISO 9001:2008.

SMIs represent a good standard of practice to which all clinical and public health microbiology laboratories in the UK are expected to work. SMIs are NICE accredited and represent neither minimum standards of practice nor the highest level of complex laboratory investigation possible. In using SMIs, laboratories should take account of local requirements and undertake additional investigations where appropriate. SMIs help laboratories to meet accreditation requirements by promoting high quality practices which are auditable. SMIs also provide a reference point for method development.

The performance of SMIs depends on competent staff and appropriate quality reagents and equipment. Laboratories should ensure that all commercial and in-house tests have been validated and shown to be fit for purpose. Laboratories should participate in external quality assessment schemes and undertake relevant internal quality control procedures.

Patient and Public Involvement

The SMI Working Groups are committed to patient and public involvement in the development of SMIs. By involving the public, health professionals, scientists and voluntary organisations the resulting SMI will be robust and meet the needs of the user. An opportunity is given to members of the public to contribute to consultations through our open access website.

Information Governance and Equality

PHE is a Caldicott compliant organisation. It seeks to take every possible precaution to prevent unauthorised disclosure of patient details and to ensure that patient-related records are kept under secure conditions. The development of SMIs is subject to PHE equality objectives. The SMI Working Groups are committed to achieving the equality objectives by effective consultation with members of the public, partners, stakeholders and specialist interest groups.

Legal Statement

Whilst every care has been taken in the preparation of SMIs, PHE and any supporting organisation shall, to the greatest extent possible under any applicable law, exclude liability for all losses, costs, claims, damages or expenses arising out of or connected with the use of an SMI or any information contained therein. If alterations are made to an SMI, it must be made clear where and by whom such changes have been made.

The evidence base and microbial taxonomy for the SMI is as complete as possible at the time of issue. Any omissions and new material will be considered at the next review. These standards can only be superseded by revisions of the standard, legislative action, or by NICE accredited guidance.

SMIs are Crown copyright, which should be acknowledged where appropriate.

Suggested Citation for this Document

Public Health England. (2013). Quality Assurance in the Diagnostic Virology and Serology Laboratory. UK Standards for Microbiology Investigations. Q 2 Issue 6.2.

Scope of Document

This SMI describes aspects of quality assurance in the diagnostic virology and serology laboratory. Quality assurance is the term used to describe the procedures used to monitor the performance of all aspects of work in the laboratory; these include external and internal quality assessment, internal quality assurance, monitoring of equipment and auditing. This SMI should be used in conjunction with other SMIs.

Introduction

ISO 9000:2005 defines quality as "the degree to which a set of inherent characteristics fulfils requirements". In microbiology a quality product or service can be defined as the right result on the right specimen from the right patient that is accurate, timely and properly interpreted. The objective of any test laboratory should therefore be to produce cost-effective, accurate, reproducible and timely results, comparable with the results obtained in a similar laboratory elsewhere, which are promptly, effectively and appropriately communicated to the users of the service. The results must be unchallengeable. In this way the quality of the product or service can be guaranteed.

The way laboratories achieve this quality of service is through quality assurance, which can be defined as the total process whereby the quality of laboratory reports can be guaranteed. Essentially it comprises all the different measures taken to ensure the reliability of investigations. It seeks to minimise any variability in test results arising from such variables as the quality and education of staff, the quality of reagents, apparatus and specimens, and the suitability of the techniques in use. Quality assurance therefore relates to the entire process of diagnosis of infection, which starts and ends with the patient.

It does not matter how well controlled the assay procedure in the laboratory is if an error has occurred at the pre- or post-examination phase: the wrong patient identified, the wrong specimen taken, the specimen mishandled during transportation to the laboratory, a data entry error at specimen reception, an incorrect interpretation of the results, or the result sent to the wrong address. We generally focus the majority of our effort on the laboratory aspects of quality assurance, even though it is generally agreed that most errors occur during the pre- and post-examination phases. This document concentrates on the laboratory aspects of quality assurance, specifically the examination phase. However, laboratories are urged to pay as much attention to ensuring that pre- and post-examination phases are reviewed and control measures put in place to minimise the risk of errors occurring.

Quality assurance in the examination phase is the collective term for several distinct procedures used to monitor the performance of all aspects of work in the laboratory. A quality assurance scheme should be used to identify procedural and technical problems, check the adequacy of current techniques, calculate the frequency of errors and, ultimately, to increase confidence in the procedures and techniques used and in the reports issued.

A comprehensive quality assurance programme should be an integral part of the procedures of a diagnostic microbiology laboratory and is necessary for compliance with the Clinical Pathology Accreditation (UK) Ltd (CPA) standards and to maintain accreditation. Quality assurance procedures include:

External quality assessment (EQA)1
Internal quality assessment (IQA)2
Internal quality control (IQC)3
Equipment monitoring
Audit

The use of EQA, QC and equipment monitoring procedures should be mandatory, and participation in IQA and audit schemes is regarded as good laboratory practice (GLP)4.

Figure 1. Overview of Quality Assurance
[Diagram: quality assurance in diagnostic virology and serology, divided into quality assessment (NEQAS, EQAS, ad hoc, IQAS), quality control (kit controls, internal QC), equipment monitoring and audit.]

The use of these procedures is designed to increase confidence in the handling and testing of a specimen, in the validity of the assay results and in the final report. However, a laboratory-based quality assurance scheme can only monitor the procedures over which the laboratory has control. Joint audits should be conducted to monitor all aspects of work associated with the diagnosis of infectious diseases.

Quality assurance should be an integrated system in which results obtained from one of its constituent parts are confirmed by another, eg:

The coefficient of variation of an assay determined in the QC scheme can be used to determine if there is a significant difference in the results obtained with anonymised samples submitted for testing as part of the IQA scheme

IQA is a useful procedure when the quality of the control material is in doubt through the test being repeatedly out of control

Violations of the warning Westgard rules (41SD or 10x rules) indicating reduced assay sensitivity may be confirmed through equipment monitoring, eg reduced incubator temperature indicating equipment failure (or conversely not confirmed, therefore indicating reagent deterioration)

Assays and the equipment used to perform them should have undergone thorough evaluation before their introduction into routine use in the laboratory. The use of EQA, IQA or QC procedures will not improve the performance of assays or equipment that exhibit poor performance.

It is important for the efficient conduct of quality assurance that data obtained in the scheme, comments on that data and action taken in consequence of results are recorded. This should be the designated responsibility of a senior and experienced member of staff.

1 External Quality Assurance

EQA schemes (EQAS) mainly provide comparisons between laboratories and between detection systems, and allow comprehensive discussion of results and discrepancies. Clinical specimens and spiked samples are distributed from external sources or reference laboratories to assess a broad range of techniques and assays performed in the clinical virology laboratory. The limited frequency of EQAS distributions and their clearly identifiable nature allow the potential for handling these samples in ways which exceed normal laboratory procedures, eg handling by senior staff, repeated testing etc. Therefore, efforts should be made to ensure EQAS samples are handled in a routine manner.

EQA is available for the majority of assays and organisms routinely tested. Table 1 gives details of a selection of EQA providers. Analysis of results and peer group comparisons will be performed externally. The results of distributions should be widely disseminated within the laboratory so as to encourage staff and to allow full discussion of any problems identified through the scheme.

Table 1. External Quality Assessment Schemes

UK NEQAS
QCMD
Labquality, Helsinki, Finland

2 Internal Quality Assurance

IQA is used to monitor all activities involved in the passage of specimens through the laboratory, starting from reception and ending in the dispatch of the final report. In the IQA scheme a number of specimens (representing approximately % of the workload) received in the laboratory are anonymised and resubmitted for testing (Figure 2). Specimen selection should be random in an IQA scheme for general serology but should reflect the proportion of the total workload submitted for each analyte.

IQA schemes in which random specimens are resubmitted for monitoring culture, antigen detection, genome detection or electron microscopy etc may be more difficult to construct. Poor viability of the micro-organism sought, loss of viability or degradation of nucleic acid during storage, inadequate specimen volume or a low frequency of positive specimens all complicate the analysis of results. Such IQA schemes are often supplemented by the use of spiked specimens.

Figure 2. Internal Quality Assessment Scheme

[Flowchart: a blood sample from "John Smith" arrives at laboratory reception and is divided into 2 aliquots*. One aliquot is tested routinely under the patient's name; for the other, a new request form is created with the name and sender replaced by an IQA number. Each aliquot is then numbered with its request form, allocated a test code, tested and reported. Finally the tests performed and the results obtained are compared, and discrepancies are analysed and acted upon.]

* Paired samples previously tested and stored under optimal conditions can be resubmitted for testing to monitor:
the effect of different individuals allocating the tests to be performed
testing of samples on different days by different operators
that samples are not easily paired with the named aliquot during testing

Discrepancies found between the results obtained with the 2 samples (original named sample and anonymised sample) should be recorded and a report sent to the senior member of staff in the relevant section of the laboratory, asking for comments on the discrepancy with a request to retest the two samples in parallel if appropriate. Again, there should be a wide circulation of the results, and staff should be encouraged to discuss the discrepancies found. Results obtained in the IQAS should be summarised each month so that recurring problems or trends can be identified. See Appendix 1 for examples of documentation used in the IQA scheme.

3 Internal Quality Control

Internal quality control (IQC) samples should be included in all assays performed in the laboratory and are used to validate test results. Results obtained with IQC samples are used in the decision making process to validate test kits and equipment. IQC samples can be international, national or local standard sera, or pools of sera, well characterised in previous assays and having values within clinically significant ranges. Acceptable limits are set using the standard deviation (SD) derived from 20 separate assay runs (see Appendix 2); daily values are plotted on Shewhart control charts and the Westgard rules (Table 2) are then applied to determine if the test is in or out of control5,6.

Select suitable control material
Test the control material in 20 separate assay runs
Determine the mean (target value) and SD of the control material
Establish a control chart with the mean and +1SD, +2SD and +3SD delineated
Include the control material in each assay run
Determine the validity of the assay using the Westgard rules after plotting the result obtained with the control material on the control chart

Table 2. The Westgard rules (A-C are warning rules, D-F are mandatory or alarm rules)

A 12SD: If one control measure exceeds the mean +2SD, control values in the previous run should be considered to rule out a trend.
B 22SD: This rule detects systematic error. The rule is violated when two consecutive control values exceed the same* (mean +2SD or mean -2SD) limit.
C 41SD: This rule detects systematic error. The rule is violated when four consecutive values exceed the same* (mean +1SD or mean -1SD) limit. The run need not be rejected if this rule is violated, but its violation should trigger recalibration or equipment maintenance.
D 13SD: This rule detects random error. Violation of this rule may also point to systematic error. The assay run is considered to be out of control when one control value exceeds the mean +3SD.
E R4SD: This is a range rule and it detects random error. The rule is violated when the difference between two control values exceeds 4SD.
F 10X: This rule detects systematic error. The rule is violated when the last 10 consecutive values are on the same side of the mean. Its violation often indicates the deterioration of assay reagents.

*Consecutive values must be on the same side of the mean.

The mean or target value, the coefficient of variation and the SD of the proposed control are calculated from the results obtained after testing the control material on 20 separate occasions (see Appendix 2). This process may be accelerated by testing 4 aliquots of the sample on each of 5 occasions; however, the mean and SD should be recalculated as a check after 20 assay runs.
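The decision rules in Table 2 can be sketched in code. The following minimal illustration (Python, not part of the SMI; function and variable names are ours) applies the 13SD, 22SD, 41SD, R4SD and 10X rules to a series of control values, given a target mean and SD established from the initial 20 assay runs. The 12SD warning rule (A) is omitted because it requires a judgement about the previous run rather than a fixed threshold.

```python
# Minimal sketch of the Westgard rules from Table 2 (illustrative only).
# `mean` and `sd` are the target value and SD from the initial 20 runs.

def westgard_violations(values, mean, sd):
    """Return the rules from Table 2 violated by the latest control value."""
    z = [(v - mean) / sd for v in values]  # deviations in SD units
    violations = []
    # 13SD: one control value beyond 3SD -> run out of control
    if abs(z[-1]) > 3:
        violations.append("13SD")
    # 22SD: two consecutive values beyond the same 2SD limit
    if len(z) >= 2 and (all(x > 2 for x in z[-2:]) or all(x < -2 for x in z[-2:])):
        violations.append("22SD")
    # 41SD: four consecutive values beyond the same 1SD limit
    if len(z) >= 4 and (all(x > 1 for x in z[-4:]) or all(x < -1 for x in z[-4:])):
        violations.append("41SD")
    # R4SD: range between two consecutive values exceeds 4SD
    if len(z) >= 2 and abs(z[-1] - z[-2]) > 4:
        violations.append("R4SD")
    # 10X: last 10 consecutive values on the same side of the mean
    if len(z) >= 10 and (all(x > 0 for x in z[-10:]) or all(x < 0 for x in z[-10:])):
        violations.append("10X")
    return violations
```

With a target of 66.9 AU/mL and SD of 3.33 AU/mL, for example, a control value of 80 AU/mL lies more than 3SD above the mean, so the 13SD rule is violated and the run is rejected.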

The concentration of specific antigen or antibody in the control material should be within the clinically significant range. For example, in an ELISA its OD value should lie within the linear part of the dose-response curve. Do not use control material that is strongly positive and therefore saturates the assay. The values obtained are then used to set acceptable limits for the results obtained subsequently with the assay control. Shewhart (or Levey-Jennings) plots should be drawn for each control used, with the target value (mean) and the limit values of +1SD, +2SD and +3SD delineated. Subsequent values obtained with the assay controls are plotted and the Westgard rules applied to determine the validity of each assay run. (See Appendix 2 for examples of documentation used with internal quality controls.)

The coefficient of variation (CV) is a measure of variability. This may refer to an assay, the equipment used to perform the assay or the assay operator.

4 Equipment Monitoring

The performance of equipment (eg temperature of water baths, incubators and freezers) should be checked at regular intervals and records kept of the data obtained (see Appendix 3). Spectrophotometers and balances should be checked against available standards and recalibrated if necessary. Adjustable and fixed-volume pipettes should be calibrated when new and the calibration checked at regular intervals.

The introduction of computer-controlled liquid handling devices capable of performing immunoassays offers the possibility of increased efficiency and reproducibility. As irregular or even intermittent use can lead to poor performance, equipment should be monitored on a day-to-day basis and serviced, preferably by a qualified service engineer, at the intervals recommended by the manufacturer. Procedures and routines should be established for the control and maintenance of computer-controlled liquid handling devices. The scope of these procedures will depend on the automated system used. Recommended schedules of maintenance are usually an integral part of the manufacturer's instructions, and QC procedures may be a part of the assay protocols, especially with random access or dedicated systems.

Closed and random access systems may have these procedures included in the programming to monitor the performance of the equipment and the assays used. Operators should be familiar with these procedures and carry out monitoring according to the manufacturer's instructions. Open systems may have in-built procedures for monitoring mechanical and electronic parameters, but procedures for validating assay results may have to be programmed by the operator.

In-house QA procedures should be devised for open access machines capable of running assays sourced from different manufacturers. Procedures can be created to measure the CV of sample dilution, sample addition and reagent addition by performing dummy runs with dye substituted for the clinical samples or reagents. Acceptable limits can be set after 20 runs in the same way as used for QC samples (+3SD limits). A minimum of 30 aliquots should be included in the QA procedure, constructed to mimic an assay currently performed on the equipment. A simple QA procedure can be created by copying a current assay protocol, renaming it as QA, and modifying the protocol to perform the tasks required (see Table 3 for an example of two processes amended for QA purposes).
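The CV measurement described above can be sketched as follows. This is a minimal illustration (Python, not part of the SMI) in which made-up absorbance readings stand in for a dye dummy run on a liquid handling device:

```python
# Hypothetical sketch: coefficient of variation of dye absorbance
# readings from a dummy run on a liquid handling device.
import statistics

def coefficient_of_variation(readings):
    """CV (%) = 100 * SD / mean of the replicate readings."""
    mean = statistics.mean(readings)
    sd = statistics.stdev(readings)   # sample SD, n-1 denominator
    return 100.0 * sd / mean

# eg absorbance of 5 dye aliquots dispensed by the sampler
cv = coefficient_of_variation([0.50, 0.52, 0.49, 0.51, 0.50])
print(round(cv, 1))  # → 2.3
```

In practice the CV would be calculated over the 20 dummy runs (with a minimum of 30 aliquots per run, as above) before setting +3SD acceptance limits.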

Error messages and faults occurring with automated equipment should be recorded in a log for each machine. The log should also include information on visits by maintenance engineers, routine servicing, remedial action and changes to software and hardware. Responsibility for monitoring equipment performance and carrying out cleaning and maintenance should be clearly defined in the SOP associated with the equipment. Action taken when a fault is detected should be recorded and reported to senior staff if the fault is recurring or repairs are required.

Table 3. QA Procedures for use with Automated Liquid Handling Devices

Test protocol: pick up sample; predilute in a tube; dilute in a plate; incubate; wash; add conjugate; incubate; wash; add substrate; add stop solution; read.
QA protocol (dilution): pick up dye; predilute in a tube; dilute in a plate; read.
QA protocol (reagent addition): add dye (as reagent); read.

5 Audit

Audit is the process used to examine, evaluate and improve procedures in a systematic way in order to enhance quality. It is often used to highlight differences in procedures or to identify bottlenecks. The choice of tests performed on specific categories of patients or clinical syndromes when compared to a standard, and specimen turnaround times, are examples of procedures chosen for audit in the clinical laboratory. Different audit types are performed as part of the internal audit process:

5.1 Horizontal Audit

A horizontal audit is one in which a single element of the quality system is assessed, for example staff training or equipment calibration. This type of audit is a good way of ensuring that individual elements of the quality system are in place and functioning properly. However, it is less valuable for assessing how the system fits together.

5.2 Examination Audit

In an examination audit, the assessor watches a test being performed. They can observe whether an SOP is being followed and whether the member of staff is able to work competently and safely. It is an opportunity to talk to a staff member to ascertain whether they are satisfied with their training, have the correct level of supervision and are aware of the bigger picture, for example the impact that their work has.

5.3 Vertical Audit

A vertical audit tracks a sample from receipt to the issue of a result (or the other way around), checking each operation associated with processing of the sample. As well as tracking the sample itself, a vertical audit should include aspects such as the training record of personnel involved in testing the sample, records of equipment and reagents used to perform assays, and IQA and IQC results relevant for the time the test was performed. The advantage of the vertical audit is that it covers all areas of work and shows how the process operates as a whole.

An audit of specimen turnaround time involves the following steps and is illustrated in Figure 3:

Select approximately 20 consecutive specimens requiring the same test or combination of tests
Record the time taken to perform each task as the specimen passes through the laboratory
Calculate the mean and median time for each task and identify the bottlenecks
If necessary, make changes to procedures or the frequency of testing
Repeat the audit several months after the changes have been instituted
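The turnaround-time calculation in these steps can be sketched as follows. This is an illustrative example (Python, not part of the SMI) with invented task names and timings; a real audit would use the times recorded for each of the ~20 specimens:

```python
# Hypothetical sketch of a turnaround-time audit: for each task,
# compute the mean and median time (minutes) over the audited
# specimens, then flag the task with the longest median as the
# likely bottleneck. Timings below are invented for illustration.
import statistics

task_times = {
    "reception to data entry": [12, 15, 9, 20, 14],
    "data entry to testing":   [55, 80, 60, 75, 70],
    "testing to report":       [30, 25, 35, 28, 32],
}

def summarise(times_by_task):
    """Return {task: (mean, median)} and the task with the longest median."""
    summary = {task: (statistics.mean(t), statistics.median(t))
               for task, t in times_by_task.items()}
    bottleneck = max(summary, key=lambda task: summary[task][1])
    return summary, bottleneck

summary, bottleneck = summarise(task_times)
for task, (mean, median) in summary.items():
    print(f"{task}: mean {mean:.1f} min, median {median:.0f} min")
print("bottleneck:", bottleneck)
```

The median is used to rank tasks because a single delayed specimen can distort the mean; comparing the two statistics for a task is itself a useful indicator of occasional long delays.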

Figure 3. Areas of responsibility that may be identified through joint audit

Area of responsibility: Ward, clinic, general practice etc
Audit area: Patient handling
Outcome: Correctly identified

Area of responsibility: Medical and nursing staff, transport
Audit area: Specimen collection
Outcome: Proper container, adequately labelled, transported and stored

Area of responsibility: Laboratory
Audit area: Analysis
Outcome: Reliable, results calculated correctly and expressed without ambiguity

Area of responsibility: Laboratory
Audit area: Report
Outcome: Destination correct, interpretation given when required

Area of responsibility: Transport, medical and nursing staff
Audit area: Delivery
Outcome: Time taken, report entered in notes

The documentation required for an audit will depend on the procedure being audited. An example of documentation associated with an audit of specimen turnaround time is shown in Appendix 4.

Appendix 1: Documentation Used in IQA

1 Record Sheet

1.1 Internal Quality Assessment - Virus Serology

[Table: paired results for three IQA pairs (Original/IQA 26, dated 12/6/94; Original/IQA 27, dated 18/6/94; Original/IQA 28, dated 20/6/94), each column recording laboratory number, date and results. Assays listed: CFT (Flu A, Flu B, Chlamydia species, Coxiella burnetii, Adenovirus, Mycoplasma pneumoniae, Measles), Legionella serology (IF, RMAT), rubella serology (SRH, IgG ELISA, IgM ELISA) and hepatitis serology (HAV IgM, HAV total, HBsAg, anti-HBs, eg 96 mIU/mL vs 106 mIU/mL). Negative CF results are recorded as <8 and negative RMAT results as <16.]

Examples of three discrepancies are shown in bold:

In IQA 26 a four-fold difference in Flu A titre between the named and anonymised samples
In IQA 27 results of the anti-HBs assay that might have led to different clinical management (booster dose of vaccine required compared with immune to HBV)
In IQA 28 a significant measles antibody titre was detected in only one of the duplicate samples

Discrepancy reports should be produced for all three samples.

2 Discrepancy Reports

2.1 Non-significant Discrepancy

INTERNAL QUALITY ASSESSMENT SCHEME DISCREPANCY REPORT
Date: 18/06/94    IQA Number: 26
Original named sample: Number .......; Test: Resp. CFT; Report: Influenza A 64
Anonymised IQA sample: Number .......; Test: Resp. CFT; Report: Influenza A 16
Comment: On repeat both samples had CF titres of 32.
Date: 02/07/94    Initials: JJG
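A discrepancy screen like the IQA 26 example above can be sketched in code. This illustrative snippet (Python, not part of the SMI; the function name is ours) flags pairs whose CF titres differ by four-fold or more between the named and anonymised aliquots:

```python
# Illustrative sketch: flag IQA pairs whose CF titres differ by
# four-fold or more between the named and anonymised aliquots.

def fourfold_discrepancy(named_titre, iqa_titre):
    """True when the two titres differ by a factor of 4 or more."""
    hi, lo = max(named_titre, iqa_titre), min(named_titre, iqa_titre)
    return hi >= 4 * lo

# IQA 26: Influenza A CF titre 64 (named) vs 16 (anonymised)
print(fourfold_discrepancy(64, 16))  # → True
```

A two-fold difference (one doubling dilution, as in 16 vs 32) would not be flagged, reflecting the usual allowance for one-dilution variation in titration assays.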

2.2 Diagnostically Significant Discrepancy

INTERNAL QUALITY ASSESSMENT SCHEME DISCREPANCY REPORT
Date: 24/06/94    IQA Number: 28
Original named sample: Number .......; Test: Rash group; Report: Measles CF 8 - Not significant
Anonymised IQA sample: Number .......; Test: Rash group; Report: Measles CF 128 - May indicate recent measles infection
Comment: On repeat both samples had CF titres of 8.
Date: 04/07/94    Initials: JJG

2.3 Discrepancy Leading to Different Advice

INTERNAL QUALITY ASSESSMENT SCHEME DISCREPANCY REPORT
Date: 24/06/94    IQA Number: 27
Original named sample: Number .......; Test: anti-HBs; Report: 96 mIU/mL - Give booster dose of vaccine
Anonymised IQA sample: Number .......; Test: anti-HBs; Report: 106 mIU/mL - Indicates immunity to HBV
Comment: Both results within CV for this assay.
Date: 04/07/94    Initials: JJG

Appendix 2: Documentation Used in IQC

1 Determining the Target Value and SD of a QC Sample and Setting Acceptable Limits for its Use

[Table: antibody concentration (arbitrary units/mL) of the QC sample in each of 20 assay runs; the individual run values are not reproduced here.]

QC data:
Number tested = 20
Mean (target value) = 66.9 AU/mL
SD = 3.33 AU/mL
Acceptable range (mean +3SD) = 56.9-76.9 AU/mL

Therefore, for subsequent assay runs to be valid, the result obtained with the QC sample should lie between 56.9 AU/mL and 76.9 AU/mL. A Shewhart plot should now be constructed with the mean or target value and the +1SD, +2SD and +3SD values delineated, and the result obtained with the QC sample should be plotted after each assay run.
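The calculation above can be sketched as follows (Python, illustrative only; the four run values in the example are invented, since a real laboratory would use its 20 QC results):

```python
# Sketch of setting QC limits from assay runs (illustrative, not
# part of the SMI). `runs` would hold the QC results from the
# initial 20 assay runs.
import statistics

def qc_limits(runs):
    """Return (target, sd, lower, upper) using mean +/- 3SD."""
    target = statistics.mean(runs)
    sd = statistics.stdev(runs)        # sample SD over the runs
    return target, sd, target - 3 * sd, target + 3 * sd
```

Applied to 20 runs with a mean of 66.9 AU/mL and an SD of 3.33 AU/mL, this reproduces the acceptable range of 56.9-76.9 AU/mL quoted in the worked example.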

2 Examples of Control Charts

[Shewhart chart: HAV IgM ELISA QC 1994, LMR assay — QC/CO ratio (y-axis, with SD limits marked) plotted against assay run]

The chart above shows examples of random errors or operator errors. The 1-3SD rule has been violated on two occasions. The results obtained in both these assay runs are invalid.

Note: The results of the QC sample are plotted as a ratio of the OD of the QC sample to the OD of the assay cut-off value. This compensates for the small differences in assay performance seen day to day. Assays incubated at room temperature are particularly susceptible to small, but acceptable, changes in performance.

The QC charts shown below are an example of systematic errors. This change in assay performance was associated with a new batch of reagents and was detected with both controls. Violations of the 10-x rule were detected with both controls, and of the 4-1SD rule with the intermediate control.

Note: The QC procedures indicated an increase in the sensitivity of the assay. If this is acceptable then the QC limits should be recalculated from control values obtained with this batch of reagents. This process can be speeded up by testing 4 aliquots of each control in 5 assay runs. A recalculation should be made after the results of 20 runs are available in order to check the accuracy of the values used.

[Shewhart charts: anti-HBs QC, intermediate control, and anti-HBs QC, low control, both showing systematic error — antibody concentration (with SD limits) plotted against assay run, with the point at which the new batch of reagents was introduced marked]
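The rule violations shown in the control-chart examples above can be checked programmatically. The sketch below implements two of the Westgard rules mentioned (1-3SD and 10-x) against QC results expressed in SD units from the target value; the function names and data are illustrative, not part of the SMI.

```python
def violates_1_3sd(z_scores):
    """1-3SD rule: any single QC result more than 3SD from the mean."""
    return [i for i, z in enumerate(z_scores) if abs(z) > 3]

def violates_10_x(z_scores):
    """10-x rule: ten consecutive QC results on the same side of the mean."""
    hits = []
    for i in range(len(z_scores) - 9):
        window = z_scores[i:i + 10]
        if all(z > 0 for z in window) or all(z < 0 for z in window):
            hits.append(i)
    return hits

# Example: run 4 is a random error (beyond -3SD); runs 5-14 show a
# systematic shift after a new reagent batch, all above the target value.
z = [0.2, -0.5, 0.8, -0.1, -3.4,
     1.1, 0.9, 1.4, 0.7, 1.2, 0.8, 1.5, 1.0, 1.3, 0.6]

print(violates_1_3sd(z))   # index of the invalid run
print(violates_10_x(z))    # start index of the 10-in-a-row shift
```

Runs flagged by the 1-3SD rule are invalid; a 10-x violation signals a systematic change such as the new-batch shift described above.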

Appendix 3: Equipment Control Sheets

Temperature control sheet

Inventory number: CAMB ....
Incubator type (CO2 or general purpose): CO2
Water bath: ....    Refrigerator: ....
Set temperature: 37°C ±2°C (5% CO2)

Date        Temp °C     Action/initials
....        ....        JJG
....        ....        JJG
....        ....        JJG
....        ....        replaced CO2 cylinder / JJG
....        ....        JJG
....        ....        JJG
....        ....        JJG
....        ....        JJG
....        ....        JJG
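The acceptance check implied by the sheet above (set temperature 37°C ±2°C) can be sketched minimally; the constant and function names are illustrative:

```python
SET_TEMP = 37.0   # set temperature, degrees C
TOLERANCE = 2.0   # acceptable deviation, +/- degrees C

def temp_in_range(reading: float) -> bool:
    """Return True if a logged temperature lies within the set range."""
    return SET_TEMP - TOLERANCE <= reading <= SET_TEMP + TOLERANCE

# An out-of-range reading would prompt corrective action, such as the
# 'replaced CO2 cylinder' entry recorded on the sheet above.
print(temp_in_range(36.8))  # True
print(temp_in_range(39.5))  # False
```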

Appendix 4: Documentation and Data Handling Associated with Audit of Specimen Turnaround Time

1 Auditing of Processing Time of Specimens in the Clinical Virology Laboratory

Complete this form for every sample enrolled in the audit.

Sample Lab No.: ....    Test code: ....

Stage                              Date    Time (24hr clock)    Time elapsed (days-hours)    Differential time (days-hours)
Sample taken
Arrival at lab
Coding for test
Registration
Work sheet generated
1st test started
Last result entered on computer
Result authorisation
Report sent out

Calculate the mean and median time taken to perform each task, and the range, for each test code. In the example given above the data entry fields are designed to reflect workflow. This may differ from laboratory to laboratory.
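The mean, median and range calculations described above can be sketched as follows. The test code and timestamps are hypothetical examples, not audit data, and for brevity only the overall sample-taken to report-sent interval is shown rather than every stage on the form:

```python
import statistics
from datetime import datetime

# Hypothetical audit records: (test_code, sample-taken timestamp,
# report-sent timestamp) for a handful of specimens.
audit = [
    ("RESP", datetime(1994, 6, 18, 9, 30), datetime(1994, 6, 20, 15, 0)),
    ("RESP", datetime(1994, 6, 18, 11, 0), datetime(1994, 6, 21, 10, 30)),
    ("RESP", datetime(1994, 6, 19, 8, 45), datetime(1994, 6, 20, 17, 15)),
]

# Turnaround time in hours for each specimen.
hours = [(sent - taken).total_seconds() / 3600 for _, taken, sent in audit]

print(f"mean   = {statistics.mean(hours):.1f} h")
print(f"median = {statistics.median(hours):.1f} h")
print(f"range  = {min(hours):.1f} to {max(hours):.1f} h")
```

The same approach extends to the differential time between any two consecutive stages on the form, grouped by test code.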

Appendix 5: Statistics Used with IQC

1 Mean

The mean is defined as the arithmetic average of a set of data points. It is expressed as:

    Mean = x̄ = (Σxᵢ) / n

where
    xᵢ = each data point
    n = the number of data points in the set

The mean identifies the target value of a set of QC data points.

2 Standard Deviation

The standard deviation (SD) is a measure of the dispersion of data points above and below the mean and is used to set acceptable limits for values obtained with IQC samples.

    SD = √[ (Σx² − (Σx)²/n) / (n − 1) ]

where
    Σx² = the sum of the squares of each value of x
    (Σx)² = the square of the sum of all data points
    n = the total number of data points in the set

Quality control data exhibit a normal distribution, therefore:

68.3% of values are within ±1SD of the mean
95.5% of values are within ±2SD of the mean
99.7% of values are within ±3SD of the mean

3 Coefficient of Variation

The coefficient of variation (CV) is a measure of the variability of an assay and is expressed as a percentage.

    CV = (SD / mean) × 100

The CV is useful for determining whether values obtained with duplicate samples, which lie either side of an arbitrary cut-off value, are within experimental error. For example, in anti-HBs antibody determination a value of 105 mIU/mL would indicate satisfactory immunity whereas a value of 98 mIU/mL would signal the need for a booster dose of vaccine. If the assay has a CV of 10%, both values would be within experimental error of each other.
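The CV calculation and the anti-HBs worked example above can be reproduced in a short sketch. The `within_cv` helper is an illustrative simplification (relative difference compared against the CV), not the only way a laboratory might make this comparison:

```python
def cv_percent(sd: float, mean: float) -> float:
    """Coefficient of variation as a percentage: (SD / mean) x 100."""
    return (sd / mean) * 100.0

def within_cv(value_a: float, value_b: float, cv: float) -> bool:
    """Crude check: are two duplicate results within experimental error,
    taking the relative difference between them against the assay CV?"""
    lo, hi = sorted((value_a, value_b))
    return (hi - lo) / hi * 100.0 <= cv

# anti-HBs duplicates either side of the 100 mIU/mL cut-off:
# with a 10% CV, 105 and 98 mIU/mL are within experimental error.
print(within_cv(105.0, 98.0, cv=10.0))  # True
```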

References

1. Snell JJS. External quality assessment. In: Snell JJS, Brown DFJ, Roberts C, editors. Quality Assurance: Principles and Practice in the Microbiology Laboratory. London: Public Health Laboratory Service; p.

2. Gray JJ, Wreghitt TG, McKee TA, McIntyre P, Roth CE, Smith DJ, et al. Internal quality assurance in a clinical virology laboratory. I. Internal quality assessment. J Clin Pathol 1995;48:

3. Gray JJ, Wreghitt TG, McKee TA, McIntyre P, Roth CE, Smith DJ, et al. Internal quality assurance in a clinical virology laboratory. II. Internal quality control. J Clin Pathol 1995;48:

4. Sharp IR. Quality audit and quality system review in the laboratory. In: Snell JJS, Brown DFJ, Roberts C, editors. Quality Assurance: Principles and Practice in the Microbiology Laboratory. London: Public Health Laboratory Service; p.

5. Westgard JO, Barry PL, Hunt MR, Groth T. A multi-rule Shewhart chart for quality control in clinical chemistry. Clin Chem 1981;27:

6. Westgard JO, Groth T, Aronsson T, Falk H, de Verdier CH. Performance characteristics of rules for internal quality control: probabilities for false rejection and error detection. Clin Chem 1977;23: