Florida Department of Environmental Protection Field and Lab Audits

What is an Audit?
- Check that data and processes are of sufficient quality for environmental decision-making
- Data Quality Objectives (DQOs) met?
- SOPs, methods and project-specific requirements followed?
- Documentation sufficient to recreate the event?
- Opportunity for learning and tweaking processes

Audit Overview
Program and project DQOs and Data Quality Indicators (DQIs) as audit criteria:
- QA rule requirements (62-160, F.A.C.)
- Program rules and requirements
- DEP SOP requirements (DEP-SOP-001/01, DEP-SOP-002/01 and DEP-SOP-003/11)
- Project QA plan requirements, if applicable (contracts and grants)
- Analytical method requirements for labs
- NELAC (or TNI) requirements for labs
- DEP data usability criteria (DEP-EA-001/07)

Audit Overview - Field
Field audits:
- System (evaluation of procedures, equipment and documentation)
- Performance (evaluation of the sampler's technique, knowledge of DEP SOPs and record-keeping)
- Project (tracking specific sample results through field records for data validation and usability assessment)

Audit Overview - Laboratory
Laboratory audits:
- System (evaluation of procedures, equipment and documentation); usually performed by DOH ELCP
- Project (tracking specific sample results through lab records for data validation and usability assessment)

Lab Audits - DOH vs. DEP
DOH looks at the big picture:
- Is there a quality system?
- Do processes and procedures meet NELAC (TNI) requirements?
- Lab certification sets the minimum bar
DEP looks at specific data:
- How effective are the system and processes in generating data for a specific purpose?
- Typically tracks selected data through documents
- Are data usable for a specific purpose?

Audit Overview - Project
Project audits, including data review:
- Tiered approach determines the depth or degree of evaluation
- Routine data review for compliance with permits and rules can incorporate lower-level tiers of audit criteria. Examples:
  - Date checks for holding times
  - Method number checks for approved methods
  - Result checks for applicable reversals
  - Data qualifier code checks for QC problems
  - MDL and PQL value checks for DQO attainment
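To make the lower-tier checks above concrete, here is a minimal Python sketch that runs a few of them against a single result record. It is illustrative only: the field names, the approved-method list and the holding-time values are assumptions for the example, not DEP criteria.

```python
from datetime import datetime

# Example-only criteria; real values come from the method, DEP SOP tables and project DQOs.
APPROVED_METHODS = {"EPA 353.2", "SM 4500-NH3 H"}
HOLDING_TIME_HOURS = {"Nitrate": 48, "Ammonia": 672}

def review_result(rec):
    """Apply routine data-review checks to one result record (hypothetical layout)."""
    findings = []
    held = (datetime.fromisoformat(rec["analyzed"])
            - datetime.fromisoformat(rec["collected"])).total_seconds() / 3600
    if held > HOLDING_TIME_HOURS.get(rec["analyte"], float("inf")):
        findings.append("holding time exceeded")
    if rec["method"] not in APPROVED_METHODS:
        findings.append("method not on approved list")
    if rec["result"] < rec["mdl"] and not rec["qualifiers"]:
        findings.append("result below MDL reported without a qualifier")
    if rec["pql"] > rec["dqo_pql"]:
        findings.append("PQL too high to meet the project DQO")
    return findings

print(review_result({
    "analyte": "Nitrate", "method": "EPA 353.2",
    "collected": "2018-06-18T10:00", "analyzed": "2018-06-21T09:00",
    "result": 0.02, "mdl": 0.05, "pql": 0.10, "dqo_pql": 0.20,
    "qualifiers": "",
}))
# -> ['holding time exceeded', 'result below MDL reported without a qualifier']
```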

Audit Overview - Internal vs. External
Internal audits:
- Required for all sampling organizations (FA 4200)
- System, performance and project audits recommended
- Coordinated by your QA Officer
- Assistance with audits offered by AEQAS
- Internal Quality of Science reviews conducted by AEQAS for DEP internal program quality system evaluation, if requested
External audits (conducted by DEP):
- Authorized by QA rule
- Applicable to any person or organization involved in data generation or support activities

Records to Reconstruct the History of Each Sample
Documentation is critical to assessing data usability:
- Can an independent auditor reconstruct how the sample was collected, received in the lab, processed and analyzed, and how the test results were evaluated, reported, etc.?
- Can the calibration condition of field-testing instruments used to measure sample characteristics be verified?

Audit Checklists
- Generic sampling checklists in FA 1000 (DEP SOPs) and on the QA Officer Resources page: http://www.dep.state.fl.us/water/sas/qa/officer.htm
- Lab checklists for selected analytes on the QAO webpage
- Other checklist versions available from AEQAS
- Serve as on-site reminders of audit criteria
- Provide detailed feedback as part of the audit report
- May need a project-specific checklist, derived from project DQOs and DQIs

Field Checklists
DEP SOPs are the basis for a typical audit:
- Sample collection
- Sampling equipment
- Sample preservation
- Field testing calibrations
- Documentation of field-related activities
Audits of project-specific data records will track the selected samples or data points. Documentation per the QA rule must be available.

Lab Checklists
- Analytical methods, NELAC (TNI) standards, DEP-EA-001/07, program-specific DQOs, contract requirements, etc. are the basis for checklists
- Checklists require tailoring to the above for the given audit and the data selected
- Data records audits track individual data or samples through the lab records
- Systems audits may also inspect available equipment & supplies and the record-keeping structure, and include analyst interviews
- Documentation per QA rule requirements must be available for audits

http://www.dep.state.fl.us/water/sas/qa/dr-audits.htm

Audit Report
- Complete relevant checklists
- Send a preliminary report of audit findings that clearly outlines major and minor deficiencies and recommendations
  - Include checklists (if appropriate)
  - Send to the audited and/or responding entity
  - Copy to associated DEP staff
  - Ask for a response and a strategy for addressing issues
  - Indicate mandatory corrective actions
- Consider responses and prepare the final audit report
  - Note acceptable corrective actions
  - Document any data usability recommendations

Corrective Action
- Work with the audited party to develop corrective actions. Examples:
  - Additional staff training
  - Procedure changes
  - Record-keeping changes
- Auditor and audited party should agree to the actions (detailed in the audit report)
- Follow up with the audited party to ensure compliance and understanding
  - An additional audit may be required

Usability Recommendation Process
- Evaluate each sample-analyte combination per applicable criteria
- Assess the criticality of any failures
- Identify mitigating data and facts
- Determine the qualification status of each analytical result
- Tally the number and percentage of qualified sample results for the analyte
- Characterize the analyte (field and) laboratory system with respect to the selected audit data
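The tally step above is simple arithmetic; the sketch below shows one way to compute the percentage of qualified results for an analyte. The record layout and the qualifier codes shown are illustrative.

```python
# Minimal sketch: percentage of audited results for an analyte that ended up qualified.
def qualification_rate(results):
    qualified = sum(1 for r in results if r["qualifiers"])
    return 100.0 * qualified / len(results)

audited = [
    {"sample": "S1", "qualifiers": "J"},   # e.g., estimated value
    {"sample": "S2", "qualifiers": ""},
    {"sample": "S3", "qualifiers": "Q"},   # e.g., holding time exceeded
    {"sample": "S4", "qualifiers": ""},
]
print(f"{qualification_rate(audited):.0f}% of audited results qualified")  # 50%
```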

Usability Recommendation Process (cont'd.)
- Evaluate the response from the audited entity
  - Usually documented in the final audit report
- Audit more data records as necessary
  - Repeat the evaluation process for the additional data
- Recommend qualified or unqualified use of the analyte data for the audited calendar ranges or project, permit, etc.
- Document the usability recommendation for individual sample results or for a larger set of data
- Resolve data management issues (database records)

Data Records Selection Criteria
- Driven by program, project or site, permit, contract, problem, etc.
- What is the purpose of the audit? Examples:
  - Assess overall usability for a dataset?
  - Determine usability for specific data points?
  - Evaluate compliance data for a permit?
  - Investigate a known problem with the data?
  - Respond to a fraud allegation or other complaint?
  - Follow up to verify corrective actions?

Data Records Selection Criteria - Example
- Identify major data generators (or another category of generators) of ambient monitoring data used for surface water impairment assessment
- Choose relevant analytes specific to the audited entity
- Pick sample records at low, medium and high concentrations for each audit year, including nondetects
- Include data records for a 5-year range or other date span
- Target a minimum of 15 audit samples
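As a rough illustration of that selection logic (not a DEP procedure), the following sketch picks a low, a medium and a high concentration record for each audit year and checks the 15-sample target. The record layout is assumed.

```python
import random

def select_audit_records(records, years):
    """Pick low / mid / high concentration records per year; warn if fewer than 15 overall."""
    picks = []
    for year in years:
        yearly = sorted((r for r in records if r["year"] == year),
                        key=lambda r: r["result"])
        if yearly:
            picks += [yearly[0], yearly[len(yearly) // 2], yearly[-1]]
    if len(picks) < 15:
        print(f"only {len(picks)} audit samples selected; widen the date span")
    return picks

# Demo with fabricated records: 5 audit years x 3 picks per year meets the 15-sample target.
random.seed(1)
demo = [{"year": y, "result": round(random.uniform(0.0, 5.0), 2)}
        for y in range(2014, 2019) for _ in range(10)]
print(len(select_audit_records(demo, range(2014, 2019))), "records selected")  # 15
```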

Florida Department of Environmental Protection
General Audit Topics & Problems Encountered During Audits

Field Records Audit Topics
- Sample collection information: location, date, time, analytes, preservation, matrices, collection equipment & procedures
- Groundwater purging information
- Field testing information: measurement results, calibration verification
- DEP SOPs are the audit criteria and the basis for checklists for performance and documentation

Lab Records Audit Topics
- Sample preservation & holding time
- Instrument calibration & verification
- Analytical sensitivity (MDL & PQL)
- Analytical precision & accuracy
- Blank contamination control
- Sample concentration bracketing
- Method-specific procedures
- Data reduction, reporting & transcription

Holding Times
- Criteria established in the DEP SOP tables per the DEP QA rule & 40 CFR Part 136
- The field record is audited to verify collection date and time
- The lab analytical record is used to verify sample prep and analysis date and time, as applicable

Evaluation of Sample Preservation
- Criteria established in the DEP SOP tables per the DEP QA rule & 40 CFR Part 136
- Field records should indicate the type(s) of preservation for each sample container
  - Includes a field check of required preservation pH
- Field information transmitted to the lab should also indicate any other specific preservations

Evaluation of Sample Preservation (cont'd.)
- Lab records are evaluated independently of field records: was pH preservation and chilling verified (or not) upon laboratory sample receipt?
- Records are checked for any indication of verification of sample pH and temperature
  - Check with pH paper (pH value recorded or noted within the specified range)
  - "Received on ice" is acceptable only for same-day delivery
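Where the lab receipt record does include pH and temperature checks, the evaluation is mechanical. The sketch below is a minimal illustration with assumed thresholds (pH ≤ 2 for acid preservation, ≤ 6 °C for chilling); the actual limits come from the DEP SOP tables and the method.

```python
def check_receipt(rec):
    """Flag missing or failed preservation checks in a (hypothetical) lab receipt record."""
    issues = []
    if rec.get("ph") is None:
        issues.append("no record that preservation pH was checked")
    elif rec["ph"] > 2.0:                                             # assumed acid-preservation limit
        issues.append("pH above preservation limit")
    if rec.get("temp_c") is None:
        issues.append("no receipt temperature recorded")
    elif rec["temp_c"] > 6.0 and not rec.get("same_day_delivery"):    # assumed thermal limit
        issues.append("received warm; 'on ice' is acceptable only for same-day delivery")
    return issues

print(check_receipt({"ph": 1.5, "temp_c": 9.2, "same_day_delivery": False}))
# -> ["received warm; 'on ice' is acceptable only for same-day delivery"]
```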

Calibration Evaluation - Examples
- Minimum number of standards used
- Correlation coefficient ≥ 0.995, where applicable
- Second-source standard used to verify the initial calibration
- All verifications within the specified range (method or other criterion)
- Sample and QC results bracketed between acceptable calibration verifications
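Two of these checks are easy to show numerically. The sketch below computes the correlation coefficient of a linear calibration and tests the 0.995 criterion, then checks a second-source verification against an assumed ±10% window (the actual acceptance window comes from the method); the standard concentrations and responses are made-up example values.

```python
import statistics

def corr_coeff(x, y):
    """Pearson correlation coefficient of the calibration points."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

std_conc = [0.1, 0.5, 1.0, 2.0, 5.0]            # calibration standards (example)
response = [0.012, 0.051, 0.101, 0.198, 0.502]  # instrument responses (example)
r = corr_coeff(std_conc, response)
print(f"r = {r:.4f}:", "OK" if r >= 0.995 else "fails the 0.995 criterion")

true_val, measured = 1.0, 1.04                  # second-source standard (example)
ok = 0.90 <= measured / true_val <= 1.10        # assumed +/-10% window
print("second-source verification", "OK" if ok else "outside the acceptance window")
```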

Spike Evaluation
- LCS usually takes precedence over the matrix spike
- Matrix spike evaluated where required by the method or other criterion
  - Matrix spike evaluated for the parent sample
- Recovery calculations verified by the auditor using raw data
- Example default recovery targets:
  - 90% - 110% for nutrients
  - 85% - 115% for metals (LCS)
  - 70% - 130% for metals (matrix spike)
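The recovery calculation the auditor repeats from raw data is straightforward; the following sketch uses made-up numbers and the default matrix-spike target for metals quoted above.

```python
def percent_recovery(spiked_result, parent_result, spike_added):
    """Spike recovery as a percentage of the amount spiked."""
    return 100.0 * (spiked_result - parent_result) / spike_added

rec = percent_recovery(spiked_result=1.82, parent_result=0.40, spike_added=1.50)
low, high = 70, 130   # default matrix-spike target for metals (from the slide above)
print(f"{rec:.0f}% recovery:", "within target" if low <= rec <= high else "outside target")
# -> 95% recovery: within target
```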

Auditing of Data Reduction & Reporting
- Final results are verified according to method-specified calculation formulas
- Raw sample responses from the instrumentation are compared to standard responses
- Specific QC evaluations are applied to the analyses, for example:
  - Metals, microbiology and BOD QC per method requirements
- Reports or databases are compared to audit results

Field Issues - Field Sampling Performance
- Not having proper, functioning equipment out in the field
- Not following DEP SOPs
- Sloppy contamination control and sample handling (e.g., cross-contamination of equipment between samples, agitation/aeration of VOCs)
- Not documenting required information

Field Issues - Field Testing
- DEP SOP calibration verification protocols for test instruments were not followed
- Pre-deployment checks on field test and sampling equipment were not conducted
  - Equipment did not work during the sampling event
- Standards used for calibrating field measurements did not bracket sample concentrations
- Field meter calibration criteria not met
- Calibration acceptance criteria were not being assessed (at all)

Field Issues - Documentation
- Various calibration records incomplete
  - Including linkage between meter calibration records and sample measurement data
- Sources for field calibration standards were not documented
- Calibration verifications not documented

Field Issues - General Documentation
- Sample preservation and verification (including filtration) not recorded
- Groundwater purging records incomplete
- Lack of clear, unique identification of samples
- No documentation of sampling equipment used to collect samples

Lab Issues - Procedural & Documentation
- Lab doesn't check received field samples for proper preservation (i.e., doesn't document the check)
- Lab QC data generated at concentrations not relevant to the project sample concentrations (spikes and duplicates)
- Lab doesn't evaluate the results of analytical blanks for impact on sample results
- Critical aspects of sample preparation are not recorded (e.g., filtration times for chlorophyll, method-required processing steps for other tests)

Lab Issues - Data Reporting
- Reported MDLs and PQLs not low enough to meet project or program DQOs
- Lab QC data is selectively used to report unqualified sample results
- Sample results whose concentrations are outside of the calibration range are reported without qualification
- Sample results reported with missing or inappropriate data qualifier codes

Problems With Calibration Range
- Samples outside of the calibration range reported without qualification
- The standards used for the calibration curve define the calibration range
  - The PQL (LOQ) concentration should be bracketed by the curve
  - The lowest calibration standard may be at the lab PQL (LOQ)
  - Results above the highest standard are estimated values
- QC samples are evaluated per the above
  - Blanks, spikes and duplicates outside of the range are estimated results
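A minimal range check, with assumed field names, might look like the sketch below; anything above the highest calibration standard should be qualified as estimated (or diluted and rerun), and anything below the lowest standard is an estimated value below the PQL/LOQ.

```python
def range_check(result, low_std, high_std):
    """Compare one result against the calibration range defined by the standards."""
    if result > high_std:
        return "above the calibration range: qualify as estimated (or dilute and rerun)"
    if result < low_std:
        return "below the lowest standard: estimated value below the PQL/LOQ"
    return "within the calibration range"

print(range_check(result=7.4, low_std=0.1, high_std=5.0))
# -> above the calibration range: qualify as estimated (or dilute and rerun)
```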

Blank Contamination
- Blanks not adequately evaluated
  - Sample results reported without qualification
  - Blank results selectively used or ignored
- Blanks are evaluated based on chronology and batching
  - Before and after the sample result (relative to analytical run times)
  - Prepared blanks in the prep batch
- 10% criterion applied to the contamination level
- Sample results are evaluated against any applicable field-QC blank results
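One common reading of the 10% criterion is that a sample result is qualified when the associated blank contamination exceeds 10% of that result; the sketch below applies it that way for illustration only, since the governing criterion comes from the DEP rule and the method.

```python
def blank_impact(sample_result, blank_result):
    """Apply a 10%-of-sample-result screen to an associated blank (illustrative)."""
    if blank_result <= 0:
        return "no blank contamination detected"
    if blank_result > 0.10 * sample_result:
        return "blank exceeds 10% of the sample result: qualify the sample result"
    return "blank contamination below 10% of the sample result"

print(blank_impact(sample_result=0.25, blank_result=0.04))   # blank is 16% of the result
# -> blank exceeds 10% of the sample result: qualify the sample result
```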

MDL & PQL Problems
- Lab MDL or PQL is too high for the indicated use
  - Data not usable for non-detect samples
  - Estimated values for sample results < PQL may be problematic
- Elevated MDL/PQL from unnecessary sample dilutions: not usable

Spiking Problems
- LCS analyzed concentration too high relative to the sample concentration range or desired PQL
  - Applicable for non-detect results
  - Evaluation of recovery near the PQL (if an appropriate QC check is used)
- Spike concentration not appropriate to the samples
  - Matrix spike analyzed concentration too high or too low relative to the unspiked parent samples
  - Evaluated on a case-by-case basis for the parent sample and analytical run
- Spike not digested or processed with the samples

Precision Evaluation & Problems
- When the lab controls precision using duplicate results < PQL, the RPD or %RSD may not be representative
- LCS duplicates take precedence over MS duplicates and sample duplicates
- Sample duplicates evaluated for the parent sample
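For reference, the two precision statistics named above are simple to compute; the sketch below shows relative percent difference (RPD) for a duplicate pair and percent relative standard deviation (%RSD) for a replicate set, using made-up values.

```python
import statistics

def rpd(x1, x2):
    """Relative percent difference of a duplicate pair."""
    return 100.0 * abs(x1 - x2) / ((x1 + x2) / 2.0)

def percent_rsd(values):
    """Percent relative standard deviation of a replicate set."""
    return 100.0 * statistics.stdev(values) / statistics.fmean(values)

print(f"RPD  = {rpd(0.52, 0.47):.1f}%")                  # duplicate pair  -> 10.1%
print(f"%RSD = {percent_rsd([9.8, 10.1, 10.4]):.1f}%")   # replicate set   -> 3.0%
```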

Selective Use of Calibration & QC Results
Blanks, spikes, duplicates and standards are audited against the run-time and batch relationships to samples:
- Lab ignores failed results; uses good results to pass verifications or QC measures
- Lab selectively uses chronologically related data to pass verifications or QC measures
- Lab selectively uses batch-related QC results
- Lab uses data NOT justifiably related to samples to pass verifications or QC measures; e.g., re-runs on different days

Qualified Data Reporting
- Laboratory fails to report results with appropriate qualifiers (DEP QA rule data qualifier codes)
- Sample results are independently qualified by auditors based on audit results
- Audit-qualified results are compared to reported data or database records

Audits - Examples and Resources
- DEP Audit Reports page: http://www.dep.state.fl.us/labs/cgi-bin/reports/search.asp
- QAO Resources page: Data Review & Audits
- DOH certification inspection checklists: http://www.dep.state.fl.us/labs/dohforms.htm
- Additional checklists from AEQAS