ERO Effectiveness and Stakeholder Perceptions Survey


Agenda Item 14.h.ii
ERO Effectiveness and Stakeholder Perceptions Survey
Report to the NERC Board of Trustees by the Compliance and Certification Committee
April
Peachtree Road NE, Suite 600, North Tower, Atlanta, GA

Table of Contents
Executive Summary
Recommendations
Introduction
Survey Composition and Response Rates/Demographics
  Rating Scale
  Survey Items
  Response Rate and Demographics
Summary of Survey Responses
  Topic Area Analyses
  Highest- and Lowest-Rated Items
  Topic Analyses by Region
  Favorability Analyses
  Regional Favorability Analysis
  Year-over-Year Analyses
Comment Analysis
  Compliance Monitoring
  Enforcement Actions
  Organization Registration
  Organization Certification
  Compliance and Reliability Impact
  Recommendations: ERO CMEP
  Recommendations: Regional Entities
  Inconsistencies across Regions
Conclusion and Recommendations

Executive Summary

In the third quarter of 2013, consistent with its 2013 Annual Plan, the NERC Compliance and Certification Committee (CCC) conducted its fifth annual survey measuring stakeholder perceptions of the policies, practices, and effectiveness of NERC and the Regional Entities with respect to the Compliance program, Registration program, and Certification program. This periodic review is an essential element of the CCC Charter endorsed by the Board of Trustees (Board) and the Federal Energy Regulatory Commission (FERC). The CCC survey is intended to be a source of industry feedback to the NERC Board on NERC's and the Regions' effectiveness and performance. To execute this year's survey, the CCC partnered with third-party provider TalentQuest, a talent management software and human capital management consulting firm, to drive survey enhancements. The CCC also reformulated survey content to create a more discerning, rigorous, and valid evaluation; captured demographic data (3-year vs. 6-year audit cycle and registered function); examined responses across Regions; employed favorability analyses to interpret response polarity; conducted a year-to-year comparison of perceptions to identify trends; and incorporated thematic analysis of written responses. The NERC Board is encouraged to provide feedback and insight to the CCC that would be helpful for future enhancements to this survey. The 2013 survey employed both a qualitative and a quantitative approach to measure perceptions of industry stakeholders. Perceptions were measured across 6 Topic Areas: Compliance Monitoring, Enforcement Actions, Organization Registration, Organization Certification, Compliance and Reliability Impact, and Other (Appendix A). With the objective of more robust analysis and greater interpretability, a CCC project committee identified, developed, and validated a set of 26 items that measured the constructs of the 6 Topic Areas.
The CCC also updated the five rating-scale descriptors, transitioning to an agreement-driven scale of Strongly Disagree to Strongly Agree. This rating scale captured the degree to which respondents perceived NERC and Regional performance against the referenced program aspect. Survey participants included all individuals listed on the NERC Primary Compliance Contact (PCC) Roster. For those respondents representing entities in multiple Regions, the 2013 survey form contained a section on each Region represented. NERC and the Regions were evaluated separately in all Topic Areas except Compliance and Reliability Impact and Other, where perceptions were measured in the aggregate. The results provide a range of views from all sectors represented on the CCC and all registered entity functions. The overall responses and comments were insightful in delineating stakeholders' relative perceptions of program strengths and opportunity areas. Industry feedback on strengths included: Compliance Monitoring components involving audit teams, audit processes, and self-certification. Positive progress by NERC and the Regions across all Topic Areas between 2012 and 2013 is reflected by increases in perceptions ranging from .05 to .29. Industry feedback for improvement included the themes: Increased consistency and standardization (in processes across Regions, in penalty application and the settlement process, and in communications from NERC and the Regional Entities). Communication of clearer, more applicable definitions (in the recertification processes, in Joint Registration Organization (JRO) and Coordinated Functional Registration (CFR) processes, in the BES, and in valid reliability measures).

Greater efficiencies (via increased process timeliness and feedback, increased return on investment of the compliance program, continued streamlining of standards, and increased transparency). In addition, following the preliminary presentation of survey results, including the thematic analysis of all comments, NERC leadership requested additional analysis to uncover more specificity and clarification around any comments that may suggest inappropriate or unprofessional conduct.

Recommendations

Through analysis and interpretation of the 2013 Stakeholder Perceptions Survey, the CCC recommends that NERC continue to move forward with the 2012 recommendations, which are tied to the three themes that have prevailed over the past three years the survey has been conducted:
1. Return on Investment (ROI) of the compliance program;
2. inconsistencies within and across Regions; and
3. transparency of the enforcement and penalty processes.
In this context, ROI means reliability benefits versus resources expended on compliance. In addition to the recommendations below, NERC should address the specific conduct concerns that were presented in some of the comments. These concerns have been presented to NERC under separate cover:
1. The CCC should continue to work with NERC staff to communicate the survey results in a productive way to NERC and the Regions to facilitate performance improvement and positively impact stakeholder perceptions.
2. The recommendations provided in the 2011 Stakeholder Perceptions Survey specifically included items to address ROI (recommendation number one) and transparency (recommendation number four). NERC should compile a new list of action items based on this survey and those provided in the 2011 survey.
The CCC should have a standing item on its agenda for the next year to review NERC's progress on its action plan, offer assistance where requested, and work with NERC to communicate its successes to improve the results of future surveys.
3. NERC should review key post-audit processes (audit reports, mitigation plans, and settlements) with the objective of standardizing the process across Regions, streamlining components, and reducing the total process time.
4. NERC staff should review the free-form written responses to the survey to identify the predominant consistency issues and develop an action plan to address them. NERC should implement a Multi-Region Registered Entity (MRRE) process by the end of 2014 (this was also addressed in the 2011 survey recommendations).
The CCC thanks the industry for its participation and the feedback provided in the survey. The CCC also wants to recognize NERC's support in facilitation of the CCC Stakeholder Survey, and its provision of an outside consultant to drive survey enhancement. The CCC requests that the Board consider releasing a public version of the report.

Introduction

The CCC Electric Reliability Organization Monitoring Subcommittee (EROMS) conducted a survey of stakeholder perceptions in accordance with the NERC CCC Charter.1 Primary Compliance Contacts in the NERC database (1,201 PCCs) were polled to measure stakeholder perceptions of NERC and Regional Entity effectiveness in implementing and executing components of the continent-wide Compliance, Registration, and Certification programs since June 18, 2007. The 26-item survey measuring the constructs of the 6 Topic Areas was transmitted to the NERC PCC Roster in order to solicit responses from the broad stakeholder spectrum within registered entities. The CCC received a 29.3 percent response rate to the 2013 survey, a 3 percent increase compared to 2012. The Topic Areas evaluated in the 2013 survey included the following: Compliance Monitoring; Enforcement Actions; Organization Registration; Organization Certification; Compliance and Reliability Impact; and Other. To facilitate greater interpretability and more robust analysis, a CCC project committee identified, developed, and validated the set of 26 items measuring the constructs of the 6 Topic Areas (Appendix A). These items were evaluated via a five-point, agreement-based rating scale (Strongly Disagree to Strongly Agree). Stakeholders were asked to evaluate NERC and the Regions separately on the first four Topic Areas (Compliance Monitoring, Enforcement Actions, Organization Registration, and Organization Certification), and for a combined NERC/Region evaluation in the Compliance and Reliability Impact and Other Topic Areas. Qualitative data were gathered at the Topic Area level and through three open-ended questions. The qualitative data were interpreted through a thematic analysis to quantify open-ended perceptions. Demographics gathered included Region, Registered Function, and Audit Cycle.
In addition to Regional topic and item analyses, the survey reported item favorability (item response distribution) and year-over-year trend analyses.

1 Section 2.2.a: Provides comments to NERC with respect to stakeholders' perception of the policies, practices, and effectiveness of the Compliance program, Registration program, and Certification program.

Survey Composition and Response Rates/Demographics

Rating Scale
1 Strongly Disagree
2 Disagree
3 Neither Disagree nor Agree
4 Agree
5 Strongly Agree

Survey Items

Compliance Monitoring [NERC, Regions evaluated separately]
1. Our most recent audit process (including pre- and post-audit activities) was conducted in a timely manner.
2. The audit team conducts itself in a professional and credible manner.
3. The audit team is prepared and organized to conduct the audit process.
4. The audit team adheres to established rules and processes.
5. The spot check process is well defined, organized, and timely.
6. The self-reporting process is well defined, organized, and timely.
7. The self-certification process is well defined, organized, and timely.
Please provide any recommendations regarding Compliance Monitoring:

Enforcement Actions [NERC, Regions evaluated separately]
8. The audit report identifies clear, definitive, and actionable items to address.
9. All communications of violations clearly and specifically describe the manner in which a requirement was violated.
10. All communications of violations clearly and specifically describe the risk of the PV, both actual and potential.
11. The mitigation plan submission and approval process is efficient and effective.
12. The settlement process is transparent, consistently applied, and clearly communicated.
13. The penalty process and penalties are transparent, consistently applied, and clearly communicated.
Please provide any recommendations regarding Enforcement Actions:

Organization Registration [NERC, Regions evaluated separately]
14. The functional model is clear and minimizes overlaps and gaps.
15. The Organization Registration process minimizes overlaps and gaps.
16. The Organization Registration process is well defined and efficient.
17. The rules for assigning registered functions to entities are clear and accurately applied.
18. The Joint Registration Organization (JRO) and Coordinated Functional Registration (CFR) processes are clear and consistently implemented.

19. Inquiries about the registration process are responded to in a timely and effective manner.
20. There is a timely and effective response during the Registry Appeals Process.
Please provide any recommendations regarding Organization Registration:

Organization Certification [NERC, Regions evaluated separately]
21. Organization Certification rules and procedures are clear and consistent.
Please provide any recommendations regarding Organization Certification:

Compliance and Reliability Impact
22. NERC Compliance programs and corresponding industry efforts have enhanced reliability of the bulk power system.
23. Regional Entity Compliance programs and corresponding industry efforts have enhanced reliability of the bulk power system.
24. The Standards Driven Index (SDI), which includes the Compliance Index (CI), coupled with the Adequate Level of Reliability (ALR) demonstrates enhanced reliability for the industry.
25. The reliability benefits experienced are worth the degree of compliance effort (time and resources) our entity has expended.
Please provide any recommendations regarding Compliance and Reliability Impact:

Other
26. The Compliance Monitoring and Enforcement processes are applied consistently across NERC and the Regions.
If applicable, please provide specific examples of inconsistencies across Regions and/or NERC.
Please provide any recommendations regarding the ERO CMEP:
Please provide any recommendations regarding the Regional Entities:

Response Rate and Demographics

There were 1,201 PCCs invited to respond to the 2013 survey. There were 352 individual survey respondents (a 29.3 percent response rate). PCCs representing entities in multiple Regions completed the survey for each applicable Region, resulting in more Regional data points than individual respondents. The response rate increased from 26.4 percent in 2012 to 29.3 percent despite a decrease of 78 in the number of respondents.

Regional Responses (frequencies) by Registered Function
While the Registered Function demographic was not utilized in response analysis, the data more fully describes the respondent sample.
[Table: response frequencies by Region (FRCC, MRO, NPCC, RFC, SERC, SPP, TRE, WECC; TOTAL) and Registered Function (BA, DP, GO, GOP, IA, LSE, PA, PSE, RC, RP, RSG, TO, TOP, TP, TSP)]

Regional Responses (frequencies) by Audit Cycle
While Audit Cycle demographics were not utilized in response analysis, the data more fully describes the respondent sample.
[Table: response frequencies by Region (FRCC, MRO, NPCC, RFC, SERC, SPP, TRE, WECC; TOTAL) and Audit Cycle (3-Year, 6-Year)]

Summary of Survey Responses

Topic Area Analyses
The first level of analysis was conducted at the Topic Area level and considered all survey responses in aggregate (NERC and Regions combined). Compliance Monitoring was the highest-rated Topic Area (mean = 4.10), with all five of the survey's highest-rated items emerging from Compliance Monitoring. The lowest-rated Topic Area, Other (mean = 2.93), consisted of a single item, "The Compliance Monitoring and Enforcement processes are applied consistently across NERC and the Regions." The four remaining lowest-rated items were distributed across the Topic Areas of Compliance and Reliability Impact, Organization Registration, and Enforcement Actions.

[Figure: Topic Area Means]

Highest- and Lowest-Rated Items
All five of the survey's highest-rated items were in the Compliance Monitoring Topic Area, with means of 4.08 and above. The lowest-rated item means ranged from 2.56 to 3.29 and were distributed across the Topic Areas of Other, Compliance and Reliability Impact, Organization Registration, and Enforcement Actions. All item means can be found in Appendix A.

[Figure: Highest-Rated Items]

[Figure: Lowest-Rated Items]

Topic Analyses by Region
For the 4 Topic Areas where ratings were gathered for NERC and the Regions, analysis was conducted for both the Regions and NERC. Those Topic Areas were: Compliance Monitoring, Enforcement Actions, Organization Registration, and Organization Certification. The following analyses depict differences across the individual Regions, the combined Regions, and NERC. t-tests and ANOVA were used to identify statistically significant differences. In this analysis, the calculated Overall Average of all items (the 21 items contained in these Topic Areas) identified a slight difference between NERC (3.70) and the Regions (3.74). Individual Region Overall means ranged from 3.40 to 4.02. The 4 Topic Area analyses reflected little variation between the Regions (combined) and NERC, with differences ranging only from .01 to .16. Conversely, means between Regions varied both in the ranking order of Regions across Topic Areas and in within-Topic differences (ranging up to .79).

Overall Averages
Between the Regions, a .62 difference exists between the highest mean (4.02, NPCC) and the lowest mean (3.40, SPP), with statistically significant differences found between SPP and NPCC (p < .05).
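The pairwise testing described above can be sketched as follows. The ratings below are hypothetical (the survey's raw responses are not reproduced in this report), so only the mechanics of the comparison are illustrative:

```python
import math
from statistics import mean, variance

def welch_t(a, b):
    """Welch's t statistic for two independent samples with unequal variances."""
    return (mean(a) - mean(b)) / math.sqrt(variance(a) / len(a) + variance(b) / len(b))

# Hypothetical per-respondent overall averages for two Regions (values are
# invented for illustration; they are not the survey data).
npcc = [4.2, 4.0, 3.9, 4.3, 4.1, 3.8, 4.0, 4.2]
spp = [3.5, 3.2, 3.6, 3.1, 3.4, 3.3, 3.7, 3.4]

t = welch_t(npcc, spp)
# With samples this size, |t| above roughly 2.1 corresponds to p < .05,
# i.e., a statistically significant difference between the two Region means.
significant = abs(t) > 2.1
```

In practice a one-way ANOVA across all eight Regions would precede such pairwise tests, flagging whether any Region mean differs before individual pairs are compared.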

Compliance Monitoring
Individual Region averages ranged from 3.61 (SPP) to 4.36 (NPCC) within Compliance Monitoring, and significant differences were found between SPP and NPCC, MRO and NPCC, and FRCC and NPCC (all at p < .05).

Enforcement Actions
Individual Region averages ranged from 3.09 (SPP) to 3.88 (NPCC and RFC), with significant differences found between SPP and NPCC, and between SPP and RFC (p < .05).

Organization Registration
Individual Region averages ranged from 3.21 (WECC) to 3.82 (NPCC), with a significant difference present between WECC and NPCC (p < .05).

Organization Certification
Individual Region averages ranged from 3.44 (WECC) to 3.88 (RFC), with a significant difference present between WECC and RFC (p < .05).

Favorability Analyses
Favorability analysis allowed for additional insight into the strength of stakeholder perception through examination of the rating distribution. Responses across all Regions and NERC were re-coded as Unfavorable (rating of 1, Strongly Disagree, or 2, Disagree), Neutral (rating of 3, Neither Disagree nor Agree), and Favorable (rating of 4, Agree, or 5, Strongly Agree). Rank-ordering of all items by those Most Favorable and by those Least Favorable highlighted items for further interpretation. Across all Regions, the five items with the greatest number of favorable ratings were in the Compliance Monitoring Topic Area. Conversely, the 5 items with the greatest number of unfavorable ratings were in the Topic Areas of Compliance and Reliability Impact, Other, Enforcement Actions, and Organization Registration.

[Figure: Most Favorable]
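The re-coding step above can be sketched in a few lines; the function names and the sample ratings are invented for illustration:

```python
def favorability(rating: int) -> str:
    """Collapse a 1-5 agreement rating into the report's three buckets."""
    if rating <= 2:          # 1 Strongly Disagree, 2 Disagree
        return "Unfavorable"
    if rating == 3:          # 3 Neither Disagree nor Agree
        return "Neutral"
    return "Favorable"       # 4 Agree, 5 Strongly Agree

def favorability_shares(ratings):
    """Percent of responses falling in each favorability bucket."""
    counts = {"Unfavorable": 0, "Neutral": 0, "Favorable": 0}
    for r in ratings:
        counts[favorability(r)] += 1
    return {k: round(100 * v / len(ratings), 1) for k, v in counts.items()}

# Hypothetical responses to one survey item.
shares = favorability_shares([5, 4, 4, 3, 2, 5, 4, 1, 4, 3])
```

Reporting the share of each bucket, rather than the mean alone, is what lets the analysis distinguish a polarized item from a uniformly lukewarm one.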

[Figure: Least Favorable]

Regional Favorability Analysis
For the 5 least favorable items, a Region breakdown (where applicable) identified further variation.

"The reliability benefits experienced are worth the degree of compliance effort (time and resources) our entity has expended": For this item, the average overall unfavorability is 49 percent.

"The rules for assigning registered functions to entities are clear and accurately applied": For this item, Regional comparisons identify WECC's 27 percent unfavorability as the highest and NPCC's 0 percent unfavorability as the lowest.

"The Organization Registration process is well defined and efficient": For this item, Regional comparisons identify WECC's 22 percent unfavorability as the highest and NPCC's 0 percent unfavorability as the lowest. Average overall unfavorability is 14 percent.

"The penalty process and penalties are transparent, consistently applied, and clearly communicated": For this item, Regional comparisons identify SPP's 44 percent unfavorability as the highest and NPCC's and RFC's 8 percent unfavorability as the lowest. Average overall unfavorability is 22 percent.

"The Organization Registration process minimizes overlaps and gaps": For this item, Regional comparisons identify WECC's 22 percent unfavorability as the highest and NPCC's 0 percent unfavorability as the lowest. Average overall unfavorability is 11 percent.

Year-over-Year Analyses
New rating scale descriptors and extensive survey item updates in 2012 previously constrained year-over-year analyses. However, the 2013 survey items are an exact match to the 2012 survey items, allowing a comparative year-over-year exploration of stakeholder perception.

[Figure: Response Demographics: Region]

Overall Averages by Topic Area
Consistent with 2012, Compliance Monitoring remains the highest-rated Topic Area (4.10), increasing by .05. Other remains the lowest-rated Topic Area (2.93), increasing by .29 as compared to 2012. Year-over-year analysis did not yield any statistically significant differences between Topic Areas (p > .05).
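The year-over-year deltas reported above can be checked arithmetically; the 2012 means below are the ones implied by the 2013 means and the stated increases, not figures quoted directly from the 2012 report:

```python
def delta(mean_2013: float, mean_2012: float) -> float:
    """Year-over-year change in a Topic Area mean, rounded as in the report."""
    return round(mean_2013 - mean_2012, 2)

# Compliance Monitoring: 4.10 in 2013, up .05, implying a 2012 mean of 4.05.
cm_change = delta(4.10, 4.05)
# Other: 2.93 in 2013, up .29, implying a 2012 mean of 2.64.
other_change = delta(2.93, 2.64)
```

Note that a positive delta alone says nothing about significance; as stated above, none of the Topic Area changes cleared the p < .05 threshold.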

Compliance Monitoring
*Differences are not statistically significant (p > .05)

Enforcement Actions
*Differences are not statistically significant (p > .05)

Organization Registration
*Differences are not statistically significant (p > .05)

Organization Certification
*Differences are not statistically significant (p > .05)

Compliance and Reliability Impact
*Differences are not statistically significant (p > .05)

Other
*Differences in means are not statistically significant (p > .05)

Overall Averages by Region
*Statistically significant difference (p < .05)
At the Regional level, 4 of the 8 Regions (NPCC, RFC, SERC, and TRE) averaged higher ratings overall as compared to 2012, with differences ranging from .01 to .33. Similarly, NERC received ratings .08 higher than reflected in 2012. However, the overall Regional average remains consistent, reflecting only a slight increase of .01 when compared to 2012 ratings. The RFC and TRE Regions both show statistically significant differences in their year-over-year overall response averages (p < .05).

Compliance Monitoring
*Statistically significant difference (p < .05)

Enforcement Actions
*Statistically significant difference (p < .05)

Organization Registration
*Statistically significant difference (p < .05)

Organization Certification
*Statistically significant difference (p < .05)

Comment Analysis
In the 2013 survey, open-ended responses underwent thematic qualitative analysis, creating objective output for interpretation. For each Topic Area and for the directed open-ended questions, all comments were coded thematically. Stakeholder feedback focused on the following themes:

Increased Consistency and Standardization:
- In processes across Regions (Audit, Self-Reporting and Certification, Mitigation, the Functional Model, etc.)
- In penalty application and the settlement process
- In communications from NERC and REs (portals, web links, etc.)

Communication of Clearer, More Applicable Definitions:
- In recertification processes
- In JRO and CFR processes
- In the BES
- In valid reliability measures

Greater Efficiencies via:
- Increased process timeliness and feedback
- Increased return on investment of the compliance program
- Continued streamlining of standards
- Increased transparency
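A thematic tally of the kind used in the following sections can be sketched as below; the coded comments, theme labels, and target field are invented for illustration and do not reproduce the survey's actual coding:

```python
from collections import Counter

# Each open-ended comment is assigned one or more theme codes and a target
# (NERC or a Regional Entity). The entries below are invented examples.
coded_comments = [
    {"target": "RE", "themes": ["Audit Inconsistencies"]},
    {"target": "NERC", "themes": ["Self-Reporting/Certification"]},
    {"target": "RE", "themes": ["Audit Inconsistencies", "Untimely Audits"]},
    {"target": "RE", "themes": ["Positive and Professional"]},
]

def theme_counts(comments, target):
    """Frequency of each theme among comments aimed at one evaluated body."""
    return Counter(t for c in comments if c["target"] == target
                   for t in c["themes"])

re_counts = theme_counts(coded_comments, "RE")
```

Tallying NERC and the Regional Entities separately is what produces the paired NERC/RE frequencies shown in the theme tables that follow.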

Compliance Monitoring
In both NERC and the Regions, leading themes in Compliance Monitoring referenced Auditor Inconsistencies and Self-Reporting/Certification.

Theme (NERC / RE):
*Audit Inconsistencies
*Self-Reporting/Certification
Positive and Professional: 2 / 11
Untimely Audits: 2 / 9
Audit Feedback: 2 / 4
Inappropriate and Unprofessional: 1 / 3
Petty Fines: 2 / 3
Standard Inconsistencies: 2 / 3
Compliance Monitoring: 2
Portal Consistency: 2 / 2
Process Inefficacies: 2
Accountability: 1
Limited to Regional Entity: 1 / 1
Standards Development Process: 1 / 1
Unjustified Costs: 1 / 1
Contribution Clarity WECC: 1
Evidence Statements: 1
Increased Transparency and Consistency: 2
NERC Audit Guidance: 1

Auditor Inconsistencies Sample Comments:
"Overall our audit experiences have been favorable; however, in the past we have found inconsistencies, even within the same Region from auditor to auditor. We are in hopes that this will be eliminated by the auditor handbook."
"This question will vary, depending on audit type. The 693 audits are well organized, well planned, and well executed. The CIP audits are not well organized, seemed to lack sufficient planning, and resulted in a somewhat chaotic environment. Also, the ever-changing CIP interpretations made compliance challenging."

Self-Reporting/Certification Sample Comments:
"The self-reporting process should be organized as a learning process for others instead of a punitive process."
"The self-certification process has improved slightly, but needs to focus more on the BES that has a valid potential risk."
"Recommend consideration of less frequent Self-Certifications and Data Submittals for smaller entities."

Enforcement Actions
In both NERC and the Regions, leading themes in Enforcement Actions referenced Settlement and Penalty Process and Consistency/Standardization.

Theme (NERC / RE):
*Settlement and Penalty Process
*Consistency/Standardization
Mitigation: 8 / 17
Transparency: 5 / 16
Communications: 4 / 11
FFT/FFR Implementation: 3 / 1
Untimely Approval: 1 / 7
Feedback: 5
Positive and Professional: 2
Accountability: 1
Audit Report: 1
Compliance: 1

Settlement and Penalty Process Sample Comments:
"The settlement and penalty processes are not transparent. When more than one requirement is included in the settlement agreement, the penalty assessment is not defined and the calculation is not clear."
"The audit report is overloaded with legal boilerplate, which may obscure improvement opportunities. Rules of Procedure are clear on mitigations, but mitigation of the immediate problem vs. the underlying causes is variable. Settlements on the surface are transparent and end up being well documented, but never sure how they are going to turn out. Penalties seem to drift between the NERC VSL/VRF Penalty Table and the FERC Sanction Guidelines; never sure which way an issue is going to lean, but prefer the structured approach of the FERC guidelines."

Consistency/Standardization Sample Comments:
"Audit team findings are not necessarily identical to Enforcement team findings."
"Enforcement varies from Region to Region. WECC has applied more penalties and stiffer fines than the other Regions."
"The audit process should be an open book test. It should have clear expectations of compliance. If I don't have clear expectations of compliance, how can the enforcement process be fair and impartial? If I don't understand what I've violated, I can't fix it or prevent reoccurrence."

Organization Registration
In both NERC and the Regions, leading themes in Organization Registration referenced Definitions/Clarity, Functional Model, and Consistency/Standardization.

Theme (NERC / RE):
*Definitions/Clarity
*Functional Model: 9 / 9
*Consistency/Standardization: 10 / 5
Feedback: 3 / 4
Deregistration: 4 / 4
Clear and Efficient: 1 / 3
Overlaps/Gaps: 5 / 3
Burdensome: 5 / 2
Communications: 2 / 2
GO/GOP/TO/TOP: 3 / 2
Accountability: 1 / 1
Process Improvement: 1 / 1
Applicability: 2
BES: 2

Definitions/Clarity Sample Comments:
"The Planning Authority, Planning Coordinator function is not well defined. There are gaps in registration, and smaller entities are registered but should not be, as they should be included in a larger Planning Coordinator Area."
"The rules for assigning registered functions are not clear, with information sprinkled in various documents. It is difficult to discern what is needed. The only saving grace is that inquiries on the process are quickly responded to. Discussion with the Regional Entities is necessary to understand the rules."

Functional Model Sample Comments:
"We believe the functional model is clear; however, it can create some confusion when you begin to apply it to certain standards, particularly the earlier versions of the Standards."
"The functional model was created back in the early 2000s. Needs revamping. The original effort was to develop functions that an entity could perform. However, the standards cross so many of these functions that confusion occurs within a utility over which function is connected to which requirement."

Consistency/Standardization Sample Comments:
"Registration criteria are not consistently applied in comparison with other Regions."
"Request for transfer of registration function from one entity to another has not been addressed after multiple contacts."
"Inconsistencies by RE personnel in different Regions."
"Complicated processes make registered entities look for ways not to use ROP options for ORC."

Organization Certification
In both NERC and the Regions, the leading theme in Organization Certification referenced Definitions/Clarity.

Theme (NERC / RE):
*Definitions/Clarity: 6 / 7
Re-Certification: 2 / 3
Feedback: 2
Deregistration: 1 / 1
Efficient: 1 / 1
BES: 1
Consistency/Standardization: 1
Functionality: 2
Process Improvements: 1

Definitions/Clarity Sample Comments:
"The re-certification efforts need to be defined, e.g., what are the criteria that NERC uses to conduct a recertification of an RC, TOP, or BA."
"At an RRO meeting, the Organization Registration function gave global examples of when to notify the Organization Registration Department of changes applied by the entity. The examples were too broad, not clear, and not concise. The group should provide a procedure/program defining for the entity when the Organization Registration Department should be contacted for changes in the entity's function."
"The Certification process is seemingly unknown even to the Regional Entity. Re-certification triggered by the NERC Rules of Procedure has apparently been in place for years, so it is unclear why the Regional Entity has no more understanding of what is entailed than the registered entity seeking re-certification. Perhaps some more detailed guidance from NERC would be beneficial."

Compliance and Reliability Impact
Comments regarding Compliance and Reliability Impact were gathered at the aggregate level, and leading themes included Return on Investment and Reliability Standards.

Theme: count
Return on Investment: 38
Reliability Standards: 18
Administrative Issues: 8
Compliance and Enforcement: 7
Small Entities: 6
Reliability Assurance Initiative: 5
Risk-Based vs. Zero Tolerance: 3
Reliability Improvements: 3
CIP Standards/Auditing: 3
Consistency Complications: 2
Reliability Metrics: 1
Quantitative Results: 1
Penalties: 1
Independent Reliability Monitor: 1
Deregulation: 1
Cultural Alignment: 1
BPS Reliability: 1
BES Reliability Metrics: 1

Return on Investment Sample Comments:
"While we agree that reliability of the bulk electric system has been enhanced to some degree, the enormous effort and associated costs that have been expended on things with no incremental benefit to the BES have drained resources from areas that are resource-poor to begin with."
"Agencies are spending much more time on Compliance Documentation, and it is taking time from operators actually performing reliable operations of the Bulk Electric System."
"The effort to ensure adequate documentation is excessive for our entity. The reliability effort is the same, but documenting such is a burden."

Reliability Standards Sample Comments:
"The main focus of most conversations around the Reliability Standards now is what the compliance implications are and how compliance will be measured, not what the reliability benefit of a particular standard will be."
"Our entity has been pushed to some 'best practices' that were not previously done, but the overall impact on reliability is negligible at best."
"While there have been improvements with regard to the reliability benefits experienced overall, there is still much work to be done to ensure that the Standards in effect truly do (or may) have a material impact on the operations of the BES. Those Standards that do not have a material impact on the reliability of the BES should be eliminated."

Recommendations: ERO CMEP
Recommendations regarding the ERO CMEP were gathered at the aggregate level, and leading themes included Reliability Assurance Initiative and Focus on Reliability/Security.

Theme: count
Reliability Assurance Initiative: 9
Focus on Reliability/Security: 5
Audit Efficiency: 4
Consistency: 4
Risk-Based vs. Zero Tolerance: 4
Compliance Monitoring/Transparency: 3
Target Due Date Consistency: 2
FFT Processing: 1
Focus on Compliance and Enforcement: 1
Process Improvement: 1
Reliability Standards Program: 1
Reporting Efficiency: 1
Identify, Assess, Correct: 1
Training: 1

Reliability Assurance Initiative Sample Comments:
"Recommend diligent completion of development and implementation of the Reliability Assurance Initiative (RAI), particularly with regard to the impact on smaller entities, since the RAI appears to be headed in the right direction."
"The RAI should try to maintain the focus on managing/mitigating reliability risk, providing consistency across the N.A. footprint and distinguishing the reliability standards that maximize reliability assurance from those that are not as critical to that assurance."
"The concern is with the level of effort around implementation of the RAI, specifically the Internal Controls. The White Paper guidance and the Internal Controls Working Guide appear inconsistent with statements by NERC that the RAI process would be less onerous. A clearer direction needs to be provided."

Focus on Reliability/Security Sample Comments:
"We need to facilitate a culture of reliability/security excellence, where we all work together to keep this machine and all its parts rolling."
"The process is too punitive and not necessarily geared toward improving reliability. We can't openly share potential reliability gaps for fear of retribution."
"Provide better information about cyber security violations in the form of lessons learned around specific compliance issues that will enhance reliability."

Recommendations: Regional Entities

Recommendations regarding the Regional Entities were gathered at the aggregate level; the leading themes were Collaboration/Consistency and Audit Standardization.

THEME                            #
Collaboration/Consistency        17
Audit Standardization            4
Transparency                     1
SPP RE Trustee Effectiveness     1
Reporting Improvements           1
Region Principles                1
RE Understanding                 1
RE Praise                        1
Focus on Reliability/Security    1
FFT Improvements                 1
CDAA Improvements                1
Board Expansion                  1
Audit Training                   1

Collaboration/Consistency Sample Comments:
- There should be more collaboration and consistency between the Regions.
- The Regions need to be consistent regarding JROs, audit practices, and compliance requirements relative to the documentation required, method of submittal, etc. It would be helpful if the Regions settled on one portal.
- Recommend continued emphasis on achieving uniformity across Regions under the direction of NERC.

Audit Standardization Sample Comments:
- More standardization in the audit approach would be very welcome.
- Regional Entities should work to standardize expectations, timelines, and interpretations of standards. This will minimize the effort entities spend proving compliance versus focusing on what will make the system more reliable and safe for all customers of the bulk electric power system.
- We would like to have a single audit where SERC is the lead to cover us for audits in RFC. Our last audit for RFC consisted of a single requirement that really didn't apply to us. It seemed extremely unnecessary.

Inconsistencies across Regions

Comments regarding inconsistencies across Regions were gathered at the aggregate level; the leading themes were Regional Inconsistencies and Audit Inconsistencies.

THEME                          #
Regional Inconsistencies       23
Audit Inconsistencies          15
Standards                      4
Compliance Monitoring          3
Portal Inconsistencies         3
Self Certification             3
CMEP Schedules                 1
Interpretation Clarity         1
CIP                            1
Department Inconsistencies     1
Settlement Process             1

Regional Inconsistencies Sample Comments:
- Hearing from other entities that operate in other Regions or in multiple Regions, they indicate there is a definite inconsistency within the Regions regarding application of the NERC Reliability Standards.
- There continue to be inconsistencies across the Regions regarding the enforcement process. Just look at PRC-005 and VAR-002 violations across the Regions. You will see varying amounts of penalties going back to [year]. In defense of the Regions, it is difficult to be consistent with so many variables to consider. Some degree of inconsistency is to be expected, but the gap is too wide today.

Audit Inconsistencies Sample Comments:
- There are significant differences in the way CIP audits are being conducted in the SPP Region versus other Regions. The SPP CIP audits are going above and beyond the purview of the Standards; the SPP CIP auditors are auditing to best practices and not to the letter of the Standards.
- We have been told during workshops and audits that different auditors may reach different conclusions. That is an admission of inconsistency.
- While there are many areas of consistency across the Regions, each audit takes on its own persona based on the makeup of the team, especially if there are NERC and FERC observers.

Conclusion and Recommendations

Conclusion

The 2013 Stakeholder Perception Survey illustrated positive trends in all year-over-year evaluations, indicating industry's view of continued enhancements to ERO effectiveness. Industry's relative perceptions of program strengths and opportunity areas were delineated through quantitative and qualitative analysis.

Industry feedback on strengths included:
- Compliance Monitoring components involving audit teams, audit processes, and self-certification.
- Positive progress by NERC and the Regions between 2011 and 2012 across all program areas, with a 38 percent increase in perceptions of NERC around Compliance Monitoring and Enforcement Actions.

Industry feedback for improvement included the themes:
- Increased consistency and standardization (in processes across Regions, in penalty application and the settlement process, and in communications from NERC and the Regional Entities).
- Communication of clearer, more applicable definitions (in the recertification processes, in JRO and CFR processes, in the BES, and in valid reliability measures).
- Greater efficiencies (via increased process timeliness and feedback, increased return on investment of the compliance program, continued streamlining of standards, and increased transparency).

Recommendations

Through analysis and interpretation of the 2013 Stakeholder Perception Survey, the CCC recommends that NERC continue to move forward with the 2012 recommendations, which are tied to the three themes that have prevailed over the past three years the survey has been conducted:

1. ROI of the compliance program;
2. Inconsistencies within and across Regions; and
3. Transparency of the enforcement and penalty processes.

In addition to the recommendations below, NERC should address the specific conduct concerns that were presented in some of the comments. These concerns have been presented to NERC under separate cover.

1. The CCC should continue to work with NERC staff to communicate the survey results in a productive way to NERC and the Regions to facilitate performance improvement and positively impact stakeholder perceptions.
2. The recommendations provided in the 2011 Stakeholder Perceptions Survey specifically included items to address ROI (recommendation number one) and transparency (recommendation number four). NERC should compile a new list of action items based on this 2012 survey and those provided in the 2011 survey. The CCC should have a standing item on its agenda for the next year to review NERC's progress on its action plan, offer assistance where requested, and work with NERC to communicate its successes to improve the results of future surveys.
3. NERC should review key identified post-audit processes (audit reports, mitigation plans, and settlements) with the objective of standardizing the process across Regions, streamlining components, and reducing the total process time.
4. NERC staff should review the free-form written responses to the survey to identify the predominant consistency issues and develop an action plan to address them.
5. NERC should implement a Multi-Region Registered Entity (MRRE) process by the end of 2013 (this was also addressed in the 2011 survey recommendations).

The CCC thanks the industry for its participation and the feedback provided in the survey. The CCC also wants to recognize NERC's support in facilitating the CCC Stakeholder Survey, and its provision of an outside consultant to drive survey enhancement. The CCC requests that the NERC Board consider releasing a public version of the report.