Assessment of Processes for Implementation of Automated Data Analysis Systems


1 Assessment of Processes for Implementation of Automated Data Analysis Systems
James Benson, Technical Executive, EPRI
Ratko Vojvodic, Technical Consultant, AREVA
35th EPRI SG NDE and Tube Integrity Workshop, July 18-20, 2016

2 Introduction and Background
EPRI SGMP began a project in 2015 to develop a technical basis for industry guidance on the development, configuration, implementation and verification of single-party automated data analysis. The project is not system specific; it considers:
- Generic architecture and essential functions of the automated analysis process, applicable to any automated system
- Roles and responsibilities in the data analysis process
- Vocabulary, implementation modes and the role of human factors
- Potential common mode failures and unanticipated signals
- Potential benefits of system redundancy
- Conventional and unconventional detection/classification techniques
- Performance requirements and metrics
- Auto analysis preparation and implementation

3 Expected Project Benefits
- Provide information on all aspects of automated data analysis to allow a full understanding of the challenges and available options
- Provide a technical basis for development and implementation of general and site-specific guidance and procedures related to automated data analysis, with emphasis on a single-party data analysis process
- Assemble a database that will help when developing, configuring, verifying and testing automated systems
- Provide practical guidance and instructions in the form of a checklist to be used by utility personnel and independent oversight when verifying automated system configuration and assessing performance

4 Project Status
Phase 1 complete as of June 2016:
- Identify fundamentals of the automated data analysis process
- Provide a vocabulary of terms to establish common ground
- Identify and assess possible technical solutions for single-party / single-platform analysis
- Evaluate the need for redundancy and diversity
- Identify common mode failures and means to eliminate them
- Establish a basis for performance demonstration and verification
- Analyze the feasibility and impact of raising/adding requirements
- Recommend a performance demonstration process

5 Project Task Phase 1 - FUNDAMENTALS
Identify the fundamentals and a path toward single-party / single-platform automated analysis:
- Standardization and a structured approach are needed for auto analysis to meet the technical and organizational requirements to successfully complement and/or replace manual analysis
- Generic architecture and essential functions of the automated data analysis process are identified and analyzed
- Information is assembled to help in understanding the auto analysis process, its components, their interactions, and the tasks to be performed in preparation and implementation of auto analysis
- Roles and responsibilities in the data analysis process are identified and analyzed, focusing on automated analysis
Standardization and a structured approach are needed

6 Project Task Phase 1 - VOCABULARY
Provide a vocabulary of terms to establish common ground in the analysis architecture and process:
- Terminology used in the data analysis process is evaluated and an initial vocabulary is provided
- Terminology used within vendor organizations is covered to a certain level:
  - Functional architecture differs from one system to another
  - Analysis methods that exist within organizations differ
- Further work will refine the meanings and definitions of some terms and will reconcile terminology used by vendor organizations and within automated systems
Vocabulary establishes common ground and eliminates ambiguities

7 Project Task Phase 1 - TECHNICAL SOLUTIONS
Assess possible technical solutions for challenges associated with single-party auto data analysis:
- Implementation modes identified in current industry guidance, as well as modes not currently allowed, are identified and evaluated, disclosing the actual technical characteristics of each mode
- Common terminology, along with a technical basis, enables recognizing the technical and organizational characteristics of a particular data analysis option and its intent, while removing the perception that may be associated with a particular term
- This study revealed the human factor to be the result of a much larger and significantly more complex set of aspects than typically considered
Implementation modes clarified / human factor aspects identified

8 Project Task Phase 1 - REDUNDANCY
Evaluate the need for redundancy and diversity, assuring a POD that is equal to or better than that of the current two-party analysis process:
- The technical basis of redundancy, diversity and duality in general is evaluated
- Pros, cons and adequacy of diverse redundancy related to detection are evaluated, with particular attention to the impact on POD
- Methods for the verification and demonstration of redundancy embedded in the data analysis process in its different phases (e.g., detection, classification) may be complex and include review of performance obtained from AAPDD, SSPD, field experience and available documentation
Diverse redundancy improves the data analysis process

9 Project Task Phase 1 - REDUNDANCY
- The benefits of and the need for duality in the data analysis process appear to be recognized, resulting in duality being kept as one of the keystones of the examination guidelines
- Multiple algorithms, complementarity, different processes, independence: these are some of the terms and concepts included in this study as well as in current industry guidance
- The main intention is to assure diverse redundancy, which tends to have a positive impact on detection/classification/reporting performance, to give better overall behavior in the presence of unexpected and/or unknown signals/degradation, and to result in less stress applied to each individual system/tool
Criteria for the evaluation of diverse redundancy need to be developed; simple duplication is not beneficial
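The claim that diverse redundancy helps while simple duplication does not can be made concrete with a toy detection model. This is an illustrative sketch only; the interpolation between the two limiting cases is an assumption for illustration, not a model from the project:

```python
# Toy model (illustrative, not from the EPRI report): combined POD of
# two detection algorithms as a function of how correlated their misses
# are. Fully correlated misses (simple duplication) give no improvement;
# independent misses (diverse redundancy) give 1 - (1-p1)(1-p2).

def combined_pod(p1: float, p2: float, miss_correlation: float = 0.0) -> float:
    """POD of an OR-combination of two detectors.

    miss_correlation = 0 -> misses are independent (diverse redundancy)
    miss_correlation = 1 -> misses always coincide (simple duplication)
    """
    independent = 1.0 - (1.0 - p1) * (1.0 - p2)
    duplicated = max(p1, p2)
    # Linear interpolation between the two limiting cases (an assumption).
    return (1.0 - miss_correlation) * independent + miss_correlation * duplicated

print(round(combined_pod(0.80, 0.80, miss_correlation=0.0), 2))  # diverse: 0.96
print(round(combined_pod(0.80, 0.80, miss_correlation=1.0), 2))  # duplicate: 0.80
```

Two fully independent 80%-POD detectors would jointly reach 96% POD, while two copies of the same detector stay at 80%, which is the sense in which simple duplication is not beneficial.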

10 Project Task Phase 1 - PERFORMANCE DEMONSTRATION
Establish principles for performance demonstration, documentation and verification of the data analysis process:
- A detailed analysis of the performance demonstration requirements for the analyst, the configurator and the automated data analysis is performed
- A distinction between the automated data analysis system and the automated data analysis process is identified, resulting in different performance demonstration requirements being appropriate
- Generic (AAPDD) and specific (SSPD) performance demonstration is evaluated for existing, potential, unexpected and unknown mechanisms, providing recommendations for preparing, administering and grading
The auto analysis process may need a modified performance demonstration

11 Project Task Phase 1 - COMMON MODE FAILURES
Common mode (or common cause) failures in data analysis refer to misses present in multiple algorithms, processes, configurations or systems as the result of events or malfunctions that make the misses statistically dependent.
- Technical origin: use of techniques that are based on the same mathematical or physical principles while claiming multiple and independent behavior (detection); non-redundant technical segments of the process not confirmed as fail-safe (common setup, locating); the same latent error in multiple configurations or systems (incorrect ROI definition)
- Organizational origin: lack of training to recognize a particular type of signal; analysts not trained in the use of the analysis software; involuntary sharing of information; misinterpreting the guidelines or ETSS

12 Project Task Phase 1 - COMMON MODE FAILURES
Identify data analysis processes prone to common mode failures and propose means to eliminate them:
- Automated setup and automated locating are functions that are typically shared between multiple systems, multiple configurations or multiple teams
- Common mode failure events or factors are identified as:
  - Deficiencies in the training and/or testing programs
  - Lack of separation, enabling involuntary sharing of information
  - Similarities and non-redundant technical solutions in declared-to-be-redundant methods (detection, classification); non-redundant technical processes not confirmed as fail-safe (setup, locating)
- Methods to eliminate common mode failures are identified
Common mode failures are preventable technically and organizationally
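Whether two "redundant" methods actually fail independently can be checked empirically from test results. A minimal sketch of such a check, assuming per-flaw hit/miss records for two algorithms on the same test set are available (the function name and data layout are illustrative, not from the report):

```python
# Hypothetical check (illustrative, not from the EPRI report): given
# per-flaw hit/miss results for two detection algorithms on the same
# test set, estimate how strongly their misses coincide. A miss
# correlation near 0 supports a claim of diverse redundancy; a value
# near 1 points at a common mode failure.

def miss_correlation(hits_a: list, hits_b: list) -> float:
    """Phi coefficient between the two algorithms' miss indicators."""
    n = len(hits_a)
    miss_a = [not h for h in hits_a]
    miss_b = [not h for h in hits_b]
    pa = sum(miss_a) / n                      # miss rate of algorithm A
    pb = sum(miss_b) / n                      # miss rate of algorithm B
    pab = sum(a and b for a, b in zip(miss_a, miss_b)) / n  # joint misses
    denom = (pa * (1 - pa) * pb * (1 - pb)) ** 0.5
    # pab == pa * pb is exactly statistical independence of the misses.
    return (pab - pa * pb) / denom if denom else 0.0
```

For example, two algorithms that each miss 2 of 10 flaws score a miss correlation of 1.0 if they miss the same two flaws (common mode) and a slightly negative value if their misses are disjoint (fully diverse).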

13 Project Task Phase 1 - POD/CL REQUIREMENTS
Analyze the feasibility and possible impact of raising the POD/CL requirements and/or adding further requirements:
- The technical basis for the 80/90 POD/CL is evaluated, including the technical and historical circumstances under which these requirements were established
- The adequacy of 80/90, particularly for the case of automated analysis, is evaluated, pointing to the benefits of raising these requirements
- Training and testing sample size and content are evaluated, pointing to the need for a larger sample with a well-defined distribution
- Several alternatives to 80/90 POD/CL and their possible impact on both the manual and the automated analysis SSPD process are analyzed, with particular emphasis on 90/50 POD/CL
Auto analysis may need a larger sample and higher POD requirements

14 Project Task Phase 1 - POD/CL REQUIREMENTS
A POD/CL of 90/50 may be considered as an alternative to the currently applied 80/90, based on the following:
- 90/50 represents a more stringent requirement, particularly when a larger test sample is used (less impact of the CL, so the demonstrated POD gets closer to the raw detection percentage)
- Larger test samples can be handled efficiently by automated systems
- There is no fundamental difference between 80/90 and 90/50 when smaller samples are considered, so 90/50 can also be applied to manual analysis, which for practical reasons (time available for testing) can handle only smaller samples
- Automated systems already perform significantly better than 80/90 and typically outperform 90/50 as well
- 90/50 is already applied elsewhere (see MIL-HDBK-1783B, paragraph A requirement guidance)
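The interaction between sample size and confidence level noted above can be reproduced with a short exact (Clopper-Pearson) calculation. A stdlib-only sketch; the 28-of-30 test result is a hypothetical example, not data from the project:

```python
# Illustrative sketch (hypothetical numbers, not from the EPRI report):
# exact one-sided lower confidence bound on POD from a detection test
# with n flawed grading units and k detections (Clopper-Pearson).
import math

def binom_sf(k: int, n: int, p: float) -> float:
    """P(X >= k) for X ~ Binomial(n, p)."""
    return sum(math.comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

def pod_lower_bound(k: int, n: int, confidence: float) -> float:
    """Lower confidence bound on POD: the p solving
    P(X >= k | n, p) = 1 - confidence, found by bisection."""
    if k == 0:
        return 0.0
    lo, hi = 0.0, 1.0
    for _ in range(60):
        mid = (lo + hi) / 2
        if binom_sf(k, n, mid) < 1.0 - confidence:
            lo = mid  # observing k detections is too unlikely: bound is higher
        else:
            hi = mid
    return lo

# With 28 of 30 detections (93.3% raw), the 90%-confidence bound sits
# well below the 50%-confidence bound, showing how a high CL penalizes
# small samples; as n grows, both bounds approach the raw detection rate.
print(pod_lower_bound(28, 30, 0.90))
print(pod_lower_bound(28, 30, 0.50))
```

This is the sense in which 90/50 is the more stringent requirement for large samples: at 50% confidence the demonstrated POD tracks the raw detection percentage closely, so the full 90% must come from actual detections rather than from the confidence margin.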

15 Project Task Phase 1 - POD/CL REQUIREMENTS
[Chart: comparison between 80/90 and 90/50]

16 Project Task Phase 1 - OTHER
Phase 1 work resulted in recommendations on several aspects of the data analysis process:
- Use of the EPRI-recommended raw data format and final report format
- Standardization of the vocabulary used in the industry
- Value of diverse redundancy (independence) in the process
- Possible improvements in the AAPDD and SSPD processes
- Consider raising the existing POD/CL requirements, making them more appropriate for automated analysis without negative impact on manual analysis
- Standardize the auto analysis preparation, implementation and verification process
The auto analysis process may be improved

17 Project Tasks Phase 2 ( )
- Review industry experience with automated analysis and identify the main reasons and scenarios that can lead to important indications not being reported
- Identify and analyze other processes and functions, in addition to the detection algorithms, that may require redundancy to reduce the potential for unknown errors in the process
- Identify practical means to improve automated data analysis system performance
- Assemble examples from industry experience that illustrate the challenges for automated data analysis systems
- Initiate work on generating a representative dataset suitable for demonstrating challenges and unusual situations

18 Project Tasks Phase 3 ( )
- Complete work on an assessment of processes for implementation of automated data analysis systems
- Assemble a comprehensive library of data that can be used for configuration development, verification and testing
- Provide practical means to verify the expected performance of the automated system
- Provide guidance and instructions in the form of a checklist that can be used when verifying automated system configuration and assessing performance
Note: this will be in addition to the standard performance demonstration tools (i.e., AAPDD, SSPD)

19 Planned Deliverables
- Technical report on Phase 1 results
  - Planned title: Assessment of Processes for Implementation of Auto Data Analysis Systems
  - Expected date of issuance: September 2016
- Technical report on Phase 2 and 3 results, including a database of eddy current test data for developing, configuring, verifying and testing automated systems

20 Together Shaping the Future of Electricity