Reactor core comprising small-diameter fuel channels rather than one large pressure vessel.



Reactor core comprising small-diameter fuel channels rather than one large pressure vessel.
Allows on-power refueling; extremely high capacity factors are possible.
The moveable fuel bundles in the pressure tubes allow maximum burn-up of all the fuel in the reactor core.
Extends the life expectancy of the reactor, because major core components such as fuel channels are accessible for repairs when needed.


1) Program Specific Governance / Procedures / Tools
GREEN: Governance and governance support document reviews are consistently conducted well in advance of the review cycle. All external requirements are included in procedures. Procedures are aligned with program governance. The peer team, program owner or SPOC actively participates in the regulatory and industry committees responsible for setting these requirements. No gaps in governance identified by external review, benchmarking or assessment activities over the past two years. Self-assessment of compliance with governance and external requirements performed in the last three years.
WHITE: No overdue governance and governance support document reviews based on the review cycle. All external requirements (i.e., bases) are established and maintained in the supporting documents. Self-assessment of compliance performed in the last three years.
YELLOW: Up to 10% of governance and governance support document reviews are overdue. Not all external requirements (i.e., bases) have been established or are currently maintained in the program and supporting documents; however, plans are in place to correct deficiencies and are on track for completion.
RED: Over 10% of governance and governance support document reviews are overdue, or external requirements (i.e., bases) are not established and maintained in program and supporting documents.

2) Operating Experience (OPEX) and Internal / External Lessons Review
GREEN: Internal / external OPEX reviewed and incorporated into the program in a sustained manner for the last 24-month reporting period. Assessment completed to confirm effective implementation. No gaps in the effective use of OPEX and lessons learned identified by external reviews, benchmarking or assessment activities over the past two years.
WHITE: Internal / external OPEX and lessons learned reviewed via the CAQ process and incorporated into the program. Specific examples identified.
YELLOW: Internal / external OPEX reviewed, but not fully or effectively incorporated, or awaiting implementation into the program.
RED: Internal / external OPEX and lessons learned not reviewed or not effectively incorporated into the program.

3) Benchmarking
GREEN: Internal / external OPEX reviewed and incorporated into the program in a sustained manner for the last 24-month reporting period. Assessment completed to confirm effective implementation. No gaps in the effective use of OPEX and lessons learned identified by external reviews, benchmarking or assessment activities over the past two years.
WHITE: Internal / external OPEX and lessons learned reviewed via the CAQ process and incorporated into the program. Specific examples identified.
YELLOW: Internal / external OPEX reviewed, but not fully or effectively incorporated, or awaiting implementation into the program.
RED: Internal / external OPEX and lessons learned not reviewed or not effectively incorporated into the program.
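The percentage thresholds in item 1's governance criteria can be sketched as a small rating function. This is an illustrative sketch only: the parameter names are assumptions, and GREEN's qualitative evidence (proactive reviews, committee participation, a clean two-year external-review history) is collapsed into a single boolean.

```python
def governance_rating(pct_overdue: float, bases_maintained: bool,
                      recovery_plan_on_track: bool,
                      sustained_excellence: bool) -> str:
    """Map the governance-review criteria to a colour (illustrative only)."""
    if pct_overdue > 10 or (not bases_maintained and not recovery_plan_on_track):
        return "RED"      # over 10% overdue, or bases missing with no plan
    if pct_overdue > 0 or not bases_maintained:
        return "YELLOW"   # up to 10% overdue, or deficiencies with a plan on track
    if sustained_excellence:
        return "GREEN"    # no backlog plus the qualitative GREEN evidence
    return "WHITE"        # no overdue reviews, bases established and maintained
```

The ordering matters: the disqualifying RED conditions are checked first, so a program cannot rate WHITE or GREEN while bases are unmanaged.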

4) Peer Team Status
GREEN: Program Owner, SPOC or a peer team member actively participates in program industry working groups. Program area benchmarked in the past 24 months. Program includes a defined set of indicators consistent and/or aligned with industry best standards. No gaps in the effective use of benchmarking identified by external reviews, benchmarking or assessment activities over the past two years.
WHITE: Program industry benchmarking completed in the past 24 months. Recommendations fully implemented via self-assessment and/or CAQ. Actions taken deemed effective. Program includes a defined set of indicators consistent and/or aligned with the industry standard as determined via benchmarking.
YELLOW: Program industry benchmarking completed in the past 24 months. Recommendations awaiting implementation, or in progress via self-assessment and/or CAQ. Actions not yet deemed effective. Program includes a defined set of indicators, but not aligned with industry.
RED: Program industry benchmarking not completed during the last 24 months. A controlled set of performance indicators aligned with industry standards does not exist for the program.

5) Change Management Plan
GREEN: Program change effectively planned, managed and implemented as per the Standard. Post-effectiveness review completed and no adverse conditions identified for ineffective implementation. No gaps in program change management identified by external reviews, benchmarking or assessment activities over the past two years.
WHITE: Program change planned, managed and implemented as per the Standard (i.e., vision documented, documents completed, change reviewed and approved by site change management committees). Post-effectiveness review completed and no adverse conditions identified for ineffective implementation.
YELLOW: Program change planned and in the process of being implemented via the Standard. Or, vision documented in accordance with the Standard, but no communication plan developed at the time of this report. Post-effectiveness review not yet completed.
RED: Significant program change implemented without use of the Standard. Adverse conditions resulting from the lack of change management evident.

6) Significant Level 1 and 2 Conditions Adverse to Quality
GREEN: Program free of significance level 1 and 2 CAQs initiated over the last 24-month reporting period. All previous Recurrence Control (RC) actions complete and deemed effective through EOER. No recurrence of previous adverse conditions in the last 24-month reporting period.
WHITE: No program or performance related Level 1 or 2 CAQs initiated in the last 12 months. All previous RC actions complete and deemed effective through EOER. No RC action due date extensions.
YELLOW: One (1) program or performance related Level 1 or 2 CAQ initiated in the last 12 months. RC actions complete or in progress, and on track for timely / quality completion. Effectiveness review (EOER) outstanding. RC action due date extensions approved by the EO manager.
RED: Two (2) program or performance related Level 1 or 2 CAQs initiated in the last 12 months, with RC actions in progress, delayed or not started. Or, one (1) Level 1 or 2 CAQ initiated in the last 12 months with actions not on track or deemed ineffective. RC action due date extensions approved by the EO manager.
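The CAQ-count thresholds in item 6 lend themselves to a similar sketch. The function below is illustrative only; the parameter names are assumed, and the recurrence-control (RC) conditions are collapsed into two booleans.

```python
def caq_rating(caqs_12m: int, caqs_24m: int,
               rc_on_track: bool, rc_effective: bool) -> str:
    """Map Level 1/2 CAQ counts to a colour (illustrative only)."""
    if caqs_12m >= 2 or (caqs_12m == 1 and not rc_on_track):
        return "RED"      # two or more CAQs, or one with actions off track
    if caqs_12m == 1:
        return "YELLOW"   # one CAQ, RC actions on track for completion
    if caqs_24m == 0 and rc_effective:
        return "GREEN"    # CAQ-free over the full 24-month period
    return "WHITE"        # none in 12 months, prior RC actions effective
```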

7) Internal / External Audits and Areas For Improvement
GREEN: During the last two internal / external audits / inspections conducted, the program is free of findings. Improvement recommendations have been assessed and dispositioned or implemented and deemed effective.
WHITE: Most current internal / external audit / inspection conducted rated as WHITE. Actions complete, or on track for completion, with no due date extensions. Or, audit / inspection rated YELLOW, all actions complete and deemed effective through an Evaluating Organization Effectiveness Review (EOER) self-assessment. Program performance results meeting the expected target.
YELLOW: Most current audit / inspection conducted rated as YELLOW. Actions in progress, with due date extensions or at risk of being ineffective. Or, current audit rated RED, all actions complete and must be deemed effective through EOER self-assessment. Program performance results improving and approaching the expected target.
RED: Current audit / inspection performance rating RED or escalated. No action plan in progress, or action plan ineffective in closing the gap. Findings not effectively being addressed and resolved.

8) Observation and Coaching
GREEN: O&C (Observation and Coaching) has been completed and analyzed, actions to improve performance have been implemented, and performance improvement has been identified in self-assessments or performance indicator results. Actions taken deemed fully effective. No gaps in O&C effectiveness identified by external reviews, benchmarking or assessment activities over the past two years.
WHITE: O&Cs have been conducted and analyzed, and actions to improve performance are completed or on track for completion, with no due date extensions. Program results as expected.
YELLOW: O&Cs have been conducted and analyzed. Actions to improve performance are either completed or on track, yet program results are not as expected.
RED: No O&Cs conducted or analyzed on the program during the last 12 months.

9) Self-Assessment
GREEN: High quality self-assessments of the program have been completed per the approved self-assessment schedule, incorporating a cross-functional team and an external industry peer. Recommended actions implemented and deemed effective.
WHITE: Self-assessments completed with appropriate team composition in a quality manner, as per schedule. Recommended actions implemented and deemed effective.
YELLOW: Self-assessments completed with appropriate team composition in a timely manner. Actions not all complete and no initial improvement evident.
RED: Self-assessments not completed. No established self-assessment cycle for the program, or actions to improve performance were deemed ineffective (as determined via follow-up self-assessment or CAQ).

10) Conditions Adverse to Quality (CAQ or SCR) Trending
GREEN: CAQ trending has been completed in a quality and timely manner to proactively identify potential emerging program trends and drive performance improvement in the execution of the program. Cognitive trends analyzed, actions implemented and deemed fully effective. Program trending recognized as an industry best practice as evidenced by external reviews, benchmarking and assessment activities.
WHITE: CAQ trending has been completed on time, with no due date extensions. Actions to improve performance are complete, or CAQ trending has been completed with no adverse / emerging trends identified.
YELLOW: CAQ trending not completed on time, with an approved extension to the due date. Actions not developed or not all completed, and no initial improvement evident.
RED: No established CAQ trending schedule for the program, or actions to improve performance were deemed ineffective (as determined by follow-up self-assessment or CAQ).

11) Qualified Personnel
GREEN: The program execution organization is fully staffed with the correct staff and competencies; succession plans and replacement strategies are in place and fully effective. No gaps in program execution organization elements identified by external reviews, benchmarking or assessment activities over the past two years.
WHITE: The program execution organization is fully staffed with the correct staff and competencies. Succession plans and replacement strategies are in place and effectively used to staff the organization appropriately. Program is being maintained at the expected level.
YELLOW: The program execution ownership organization is fully staffed, but with sufficiently junior staff that competency levels are at risk.
RED: The program execution organization does not meet minimum staffing or competency requirements and is utilizing prolonged use of overtime to manage workload.

12) Training Program
GREEN: Program training and qualification requirements are of high quality and fully Systematic Approach to Training (SAT) based. Several examples over the last 24-month reporting period where training has improved performance as evidenced by self-assessment or a step-change improvement in performance indicator results. Program training and qualification requirements are recognized as industry best practices as evidenced by external reviews, benchmarking and assessment activities.
WHITE: Training and qualification requirements are in place. SAT based, with an up-to-date Job and Task Analysis revalidated in the past 24 months. Training has improved performance as evidenced by self-assessment, and program results are as expected. Actions to sustain performance deemed effective.
YELLOW: Training and qualification requirements are in place. Program not fully SAT based. Job and Task Analysis requirements have not been revalidated in the past 24 months. Actions to improve performance, as a result of training feedback or self-assessment, are in progress but not yet deemed effective.
RED: There are no training and/or qualification requirements in place for the program, or training is not SAT based. There are currently no actions in place to improve performance through training. Several known gaps to excellence are not addressed.


DNGS Q4 PHR PDF File


Fleet View Part A PDF File
Fleetview Part B and C Microsoft Word Document
Exec Summary Microsoft Word Document


The Program Personnel section has the KPI of Owner Proficiency, where the EQ Program Owner's experience with the program is measured. To rate Excellent, the Program Owner must have at least three years (or two outages) of experience with the EQ Program; Marginal is less than one year of experience. The EQ Program Owner is also expected (and measured here) to have met the EQ Program Qualifications (training). OPG does not measure this or require the EQ Program Owner to possess any of the four (4) EQ Program Qualifications.

The Program Personnel section also measures Bench Strength, where backup staff (for the primary EQ Engineering staff and succession management) are measured by years of EQ experience and attainment of the EQ Training Qualifications. OPG does not measure this; however, a measure of staffing is currently done as an EQ Health Report KPI. That measure is not specifically for backup staff, but for current staff engaged primarily in EQ activities. A measure of who is in the queue / succession for additional Bench Strength in the OPG EQ Program is not known. For the last three years, the OPG KPI in the Program Health Report for EQ sustaining staffing has remained Yellow, and it does not measure backup staff strength.

The Program Personnel section also measures Peer Interaction, where participation within the utility's EQ Working Group meetings and interfaces with other program groups (valve group, procurement / supply chain, training committees, utility QA audits of manufacturers, system engineers, etc.) are measured. The OPG EQ Health Report does measure Benchmarking; however, a numerical value or scope is not defined, only an interval of at least one (O&C) benchmark in the last twenty-four (24) months. Internal benchmarking alone is acceptable for meeting the OPG EQ KPI, and no requirement to interface with specific peers / programs is known.

The Program Personnel section also measures Industry Interaction, where participation outside the utility (as opposed to Peer Interaction above) is measured: formal benchmarking at other utilities, and inclusion and participation in EQ industry groups (e.g., the Nuclear Utility Group on Equipment Qualification, IEEE and nuclear industry standards, INPO committees, etc.). Again, the OPG EQ Health Report does measure Benchmarking; however, the criteria of who and what outside the utility was interfaced with are not part of the acceptance criteria.
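The Owner Proficiency thresholds above can be expressed as a short function. This is a sketch with assumed parameter names; the band between Marginal and Excellent is not named in the source, so an intermediate label is assumed.

```python
def owner_proficiency(years_experience: float, outages: int) -> str:
    """Rate EQ Program Owner experience per the benchmark KPI (sketch)."""
    if years_experience >= 3 or outages >= 2:
        return "Excellent"     # at least 3 years, or two outages
    if years_experience < 1:
        return "Marginal"      # less than one year of experience
    return "Intermediate"      # band not named in the source; label assumed
```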

For the KPI concerning Program Documentation, to obtain an Excellent rating no outstanding changes for program documents (e.g., DCRs, required revisions, late reviews, etc.) greater than one fuel cycle (18 months) old are allowed. Two (2) to five (5) backlog changes within one fuel cycle rates Marginal performance. The EQ document backlog at OPG remains Yellow, with hundreds of outstanding DCRs greater than 18 months in age.

The KPI concerning Program Procedures includes specific EQ Program procedures under primary ownership. To a lesser extent, it includes vigilance in maintaining awareness of procedures not governed by the EQ Program, but that govern other programs and processes interacting with the EQ Program, or that provide inputs to, or outputs from, the EQ Program.

In the Program Infrastructure section, the KPI of Corrective Action extension (re-scheduling) of due dates is measured. Corrective action completion timeliness is also measured, e.g., how long it takes to resolve and close an EQ condition adverse to quality. To achieve an Excellent rating, only one (1) extension fleet-wide is allowed; three (3) extensions within a year merit Marginal performance. The OPG EQ Program Health Report does address a quantitative value for SCRs for Level 1 and Level 2, and any SCR trend identified. The health report does not measure what the EB KPI provides, i.e., how many times a corrective action is extended and how long it is taking to close corrective actions.
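The documentation-backlog and corrective-action-extension thresholds described above can be sketched as follows. Parameter names and the labels below Marginal are assumptions; one fuel cycle is taken as 18 months per the text.

```python
def doc_backlog_rating(aged_changes: int, backlog_in_cycle: int) -> str:
    """Rate the program-document backlog (sketch).

    aged_changes: outstanding changes older than one fuel cycle (18 months);
    backlog_in_cycle: backlog changes within one fuel cycle.
    """
    if aged_changes == 0 and backlog_in_cycle < 2:
        return "Excellent"       # nothing older than one fuel cycle
    if aged_changes == 0 and backlog_in_cycle <= 5:
        return "Marginal"        # two to five backlog changes in one cycle
    return "Below Marginal"      # label assumed; not named in the source


def ca_extension_rating(extensions_per_year: int) -> str:
    """Rate corrective-action due-date extensions fleet-wide (sketch)."""
    if extensions_per_year <= 1:
        return "Excellent"       # only one extension fleet-wide allowed
    if extensions_per_year < 3:
        return "Intermediate"    # band not named in the source; label assumed
    return "Marginal"            # three extensions within a year
```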

External Stakeholder Findings is a heavily weighted KPI in the program performance criteria of the NEI benchmark reports. For example, to achieve an Excellent rating: no findings in the last two years. External findings include regulator findings (e.g., a CNSC cited or non-cited violation), INPO AFIs, INPO / WANO review visits, actionable recommendations identified through NEI publications, outside O

In the Program Implementation section, for Clock Resets Attributable to EQ, deliverables include items such as work requests / work orders, location sketches, evaluations, walk-downs, etc. Peer Reviewed and Optimized indicates that the plan has been checked by another EQ Engineer, approved by management, and submitted to / reviewed by a challenge board or similar.

In the Program Implementation section, the KPI of Preventative Maintenance (PM) deferrals is measured. To obtain an Excellent rating, no deferred PMs are allowed. To obtain an Acceptable rating, late PMs must be less than or equal to one percent (1%).

In the Program Implementation section, Work Management addresses adherence to the work schedule, outage readiness, outage completion milestones, material and parts availability, work order documentation, and end-of-outage walkdowns and field inspections with system engineering.
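The PM-deferral thresholds reduce to two simple checks. A minimal sketch, with assumed parameter names and an assumed label for performance below Acceptable:

```python
def pm_deferral_rating(deferred_pms: int, late_pm_pct: float) -> str:
    """Rate preventive-maintenance deferrals (sketch)."""
    if deferred_pms == 0 and late_pm_pct <= 1.0:
        return "Excellent"        # no deferred PMs, late PMs within tolerance
    if late_pm_pct <= 1.0:
        return "Acceptable"       # late PMs at or below one percent
    return "Below Acceptable"     # label assumed; not named in the source
```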

In the Program Infrastructure section, the KPI of Long Range Strategy is measured. This includes indicators for items requiring significant resources, infrastructure updates, and projects to address program needs, including obsolescence studies, training upgrades, outside consultant and benchmarking expenses (for other visiting utilities and for making benchmarking visits), and overall budget needs including staff augmentation and in-house staff travel and training. The KPI requires a five (5) year plan, updated in the last year, to attain an Excellent rating. This program attribute is not presently measured in the OPG EQ Program Health Reports.

In the Program Infrastructure section, the KPI of Self-Assessment & Benchmarking is measured. The criteria are self-explanatory. The OPG EQ Program maintains a three-year self-assessment schedule encompassing areas such as:
Training
Catalog & Inventory
Procurement Shelf Life Program
Maintenance Procedures
Bills of Materials
Engineering Change Control
EQ Design Basis Alignment
Cable Surveillance
Room Conditions Environmental Monitoring
EQ List and Assessments
Program Health Reporting

For the KPI concerning Operating Experience (OPEX), all OPEX items require closure within sixty (60) days to obtain an Excellent rating. To obtain a Marginal rating, no OPEX issues may be outstanding greater than ninety (90) days. The OPG EQ Program Health Report does address OPEX incidents, but the age at completion of the resolution is not tracked, only the quantity of OPEX.
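The OPEX-closure timeliness criterion above could be tracked with a helper like the following. This is a sketch under stated assumptions: the input is a list of item ages in days, and the label below Marginal is not named in the source.

```python
def opex_rating(item_ages_days: list[int]) -> str:
    """Rate OPEX closure timeliness from item ages in days (sketch)."""
    if all(age <= 60 for age in item_ages_days):
        return "Excellent"    # every item closed within sixty days
    if all(age <= 90 for age in item_ages_days):
        return "Marginal"     # nothing outstanding beyond ninety days
    return "Below Marginal"   # label assumed; not named in the source
```

Note that an empty list rates Excellent, since `all()` over an empty iterable is true; a program with no open OPEX items has nothing overdue.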
