Energy Efficiency Impact Study

Energy Efficiency Impact Study for the Preferred Resources Pilot
February 2016
For further information, contact PreferredResources@sce.com

1. Executive Summary

Southern California Edison (SCE) is interested in measuring the savings from energy efficiency projects at the individual customer level that may also be discernible at the grid level in the Preferred Resources Pilot (PRP) area. SCE contracted FirstFuel to analyze the energy use of a sample of customers in that area that installed energy efficiency measures (EEMs), using an algorithm developed to detect significant and sustained changes in building energy use independent of the date of EEM installation. These detected changes indicate that other factors, such as occupancy, are affecting the customer's energy use, and the impact of the EEM may be masked or heightened by these other changes.

The analysis found that the energy use patterns (or "baselines") of 54 out of 62 customers could be modeled with statistical confidence; the remaining 8 (13%) could not. Of the 54 customers for which a confident baseline could be modeled, 30 appear to have behavior and building use changes that affect their net energy savings as calculated from the customer's meter. Among the customers with confident baseline models in this sample, offices, retail stores, and schools appeared least affected by behavior and building use changes and generally saw the most savings at the customer meter.

2. Project Background and Goals

Southern California Edison's PRP is a multiyear study designed to determine whether clean energy resources can be acquired and deployed to offset increasing customer demand for electricity in the central Orange County region. Measuring the performance and persistence of these resources is important to determining their dependability, and measuring the impact of energy efficiency (EE) at the customer site aligns with the pilot's bottom-up approach.

Over the past several decades, SCE has identified EEMs and developed EE programs to encourage customers across its service territory to install them. A long-standing challenge has been isolating the specific impacts of these measures because of the difficulty of appropriately accounting for changes in behavior, productivity, and building use. In this paper, building-level impact refers to the change in energy (or power) use due to EEMs ("recorded changes"), adjusted for occupancy, behavior, or other non-EEM changes ("detected changes"). Grid-level impact refers to the net change in energy (or power) use from pre-EEM installation to post-EEM installation, including behavioral, occupancy, or other changes (the net total of detected and recorded changes).

Previous EE program evaluations have used a combination of building energy models, engineering estimates, billing analyses, surveys, and audits to estimate EEM savings extrapolated across a portfolio. These traditional estimation methods have limitations in attributing kWh and kW savings to a particular installation or customer site because of the complex, idiosyncratic nature of buildings and dynamic operational influences driven by equipment efficiency, weather, and non-recurring events (e.g., changes in occupancy and usage patterns). This Energy Efficiency Impact Study measures the individual building-level savings from EEMs by adjusting the baseline to incorporate detected changes. This approach is statistically rigorous and relevant for subsequent determination of the overall grid-level impact. To calculate the EE savings estimates, FirstFuel used machine learning algorithms to generate a baseline model for each building from customer meter data.
These models represent a building's energy use and behavior under the varying conditions encountered by that building throughout the year. The resulting baseline models are used to evaluate the effects of energy efficiency interventions within a 62-building sample during the ex-post (i.e., evaluation) period.
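FirstFuel's production models are proprietary and not described in detail in this report; the sketch below only illustrates the general idea of a weather- and schedule-adjusted baseline fit on hourly meter data. The column names (`kwh`, `temp_f`) and the use of scikit-learn are illustrative assumptions, not the study's actual implementation.

```python
# Minimal illustrative sketch of a weather- and schedule-adjusted baseline
# model fit on hourly meter data. The real FirstFuel algorithms are
# proprietary; column names and the choice of scikit-learn are assumptions.
import pandas as pd
from sklearn.linear_model import LinearRegression

def time_of_week_features(index: pd.DatetimeIndex) -> pd.DataFrame:
    """One indicator per hour-of-week (168 bins) to capture the operating schedule."""
    return pd.get_dummies(index.dayofweek * 24 + index.hour, prefix="tow", dtype=float)

def fit_baseline(df: pd.DataFrame):
    """Fit metered kWh against hour-of-week indicators and outdoor temperature."""
    X = time_of_week_features(df.index)
    X["temp_f"] = df["temp_f"].to_numpy()
    model = LinearRegression().fit(X, df["kwh"].to_numpy())
    return model, list(X.columns)

def predict_bau(model, columns, df: pd.DataFrame) -> pd.Series:
    """Business-as-usual prediction over a post-installation period."""
    X = time_of_week_features(df.index).reindex(columns=columns, fill_value=0.0)
    X["temp_f"] = df["temp_f"].to_numpy()
    return pd.Series(model.predict(X[columns]), index=df.index, name="bau_kwh")
```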

Using the baseline model, FirstFuel constructed a weather-normalized, business-as-usual (BAU) prediction for each subject building during the post-baseline assessment period. The net difference over time between the BAU prediction and the actual (metered) consumption estimates the change in usage over that period. This difference is an estimate of the grid-level impact observed upon implementing the EEMs in the building. This paper is a short summary of the combined project work and highlights of the results.

3. Analysis Approach

3.1 Building a Baseline

Commercial buildings use energy in a myriad of ways, the typical primary drivers being heating, cooling, and ambient and task lighting. A building's energy use responds to outside weather conditions and to occupancy (the presence and activities of people); weather data is widely available, while detailed occupancy data is more difficult to capture. Most commercial buildings exhibit repeatable operating patterns. FirstFuel developed weather-adjusted baseline models depicting these operating patterns for a full year. Once created, these models are used to evaluate the effects of EE interventions.

In this analysis, approximately 2,500 commercial and industrial customers in SCE's Preferred Resources Pilot region were screened to find a sample whose EE impact would be large enough to remain detectable through the noise in the data. The parameters used to identify the subset for analysis were loosely set as customers who had installed EE measures after September 30, 2012 and had demand savings claims of at least 15 kW, or reported savings of at least 5% of their total peak kW, as stated in their ex-ante savings.

3.2 Energy Use Pattern Change Events: Recorded and Detected

Over time, buildings go through changes that affect their energy use. For the purpose of this report, changes are grouped into two types: (1) changes identified by the date of EEM installation, as recorded and shared by SCE, and (2) other, unknown "detected" change events that have a sustained effect on energy use. Recorded changes refer to known and documented installations of EEMs; the magnitudes of these changes are reported to the CPUC as ex-ante savings. Detected changes are sharp, sustained changes in energy use (increases or decreases) that FirstFuel's change detection algorithm identifies. Detected change events that do not appear to bear a temporal relationship to the implemented EE measures are likely caused by some other building use or behavior change, such as a change in occupancy.

As part of this project, SCE provided records of the EEMs in all of the subject buildings along with the estimated dates of deployment. Each EEM deployment constitutes a material change in the building's energy use. Often, multiple EEMs are deployed simultaneously; because the algorithm cannot separate the impacts of coincident changes, simultaneous EEMs are treated as a single event. In the ideal scenario, the recorded EEMs are the only significant changes affecting the building's energy use patterns. This scenario gives a clean comparison of the pre-installation and post-installation periods. In many buildings, however, this ideal situation does not hold, and other significant events occur that affect energy use.
Examples may include: (A) interior temperature set-point changes due to tenant heating/cooling requests, (B) occupancy changes, (C) deployment or retirement of office equipment and shop machinery, and (D) changes in the type or intensity of business activity.
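The change-detection algorithm itself is not published; as a hedged illustration only, sustained shifts of the kind listed above can be flagged with a simple cumulative-sum (CUSUM) test on baseline-model residuals, as sketched below. The thresholds and window lengths are assumptions, not values from the study.

```python
# Hedged sketch of sustained-change detection on daily baseline residuals
# using a simple CUSUM statistic. This is NOT FirstFuel's algorithm (which
# is not published); it only illustrates how a sharp, sustained shift in
# use relative to the baseline model might be flagged.
import numpy as np
import pandas as pd

def detect_sustained_change(residuals: pd.Series, threshold_sigmas: float = 5.0):
    """Return the timestamp of the largest sustained shift, or None.

    residuals : daily actual kWh minus baseline prediction, indexed by date.
    """
    r = residuals.to_numpy(dtype=float)
    sigma = r.std()
    if sigma == 0:
        return None
    # Cumulative sum of centered residuals; a sustained shift shows up as a
    # pronounced kink (maximum excursion) in this curve.
    cusum = np.cumsum(r - r.mean())
    k = int(np.argmax(np.abs(cusum)))
    before, after = r[: k + 1], r[k + 1:]
    if len(before) < 14 or len(after) < 14:   # require ~2 weeks on each side
        return None
    # Compare mean residual before vs. after the candidate change point.
    shift = after.mean() - before.mean()
    se = sigma * np.sqrt(1 / len(before) + 1 / len(after))
    # A conservative threshold (default 5 sigma) guards against false alarms.
    if abs(shift) / se > threshold_sigmas:
        return residuals.index[k]
    return None
```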

These types of large non-recurring changes (a.k.a. non-routine events, regime changes, or anomalies) can occur anywhere in the analysis period, including the baseline and post-baseline periods, and can even coincide with the recorded EEMs. Moreover, these regime changes sometimes have a large impact on a building's energy use patterns and can interfere with the validation of EEM energy savings. FirstFuel's algorithm for non-recurring event detection determines if and when statistically significant changes to a building's energy use pattern have occurred. This capability is important because it recognizes exogenous changes, allowing savings estimates to be adjusted to capture the impact of those changes. The approach is able to isolate and separately estimate the recorded EEMs and the detected (other) changes in time unless, of course, they occur simultaneously.

3.3 Estimating Energy Savings as Deviations from BAU Predictions

Assuming a stable baseline with unbiased predictions, changes in energy use after the baseline period will manifest as a consistent bias between the BAU model predictions and the actual electricity consumption and demand in the post-baseline period. Energy savings are estimated as the difference between the BAU predictions of consumption (and demand) and the actual consumption (and demand) in the post-baseline period. The level of certainty (about magnitudes and percent changes in energy consumption) is sensitive to the length of time before and after a change, for both multiple recorded changes and detected changes. Consequently, when changes are close together, the intervals between them are shorter and the uncertainty of the estimates is relatively larger.

4. Results

4.1 Building Analysis Plots

Energy use plots depict the pre-installation baseline, model prediction, actual usage, detected events, and estimated savings over the assessment period. Each building is shown twice: once at a weekly level and once at a monthly level. Examples and explanations of how the plots should be read are shown below, along with descriptions of the four building categories that were identified.

Actual kWh Consumption: Each customer's actual energy use (kWh) is presented as the blue line.

BAU Consumption: The business-as-usual prediction is shown as the green line.

Confidence Bounds: Confidence bounds are shown as red lines corresponding to the 95% Confidence Interval (CI). The interpretation of these bounds is that the actual (measured) value is statistically expected to lie within the calculated confidence band with 95% probability.

Recorded and Detected Changes: Overlaid on both the monthly and the weekly plots are solid bars indicating the dates of recorded changes and dashed bars indicating detected changes. Multiple coincident or proximate changes are merged into single event bars.

A change is indicated when the blue line deviates from the green line at the weekly or monthly level. When the blue line is below the green line, energy use was reduced compared to BAU; when it is above the green line, usage increased for that time period (week or month). When the blue line deviates far enough from the green line to cross the red bounds, a deviation of that magnitude is very unlikely to have occurred by chance.
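As a hedged illustration of the arithmetic described above (not the study's exact statistics), weekly savings relative to BAU and a rough 95% band check could be computed as follows, assuming approximately independent, normally distributed hourly model errors:

```python
# Hedged sketch of the savings arithmetic: savings are the difference between
# BAU predictions and metered use, and a deviation is flagged when it falls
# outside an approximate 95% band. Assumes roughly independent hourly errors,
# which real meter data only approximates; the study's exact uncertainty
# calculations are not published.
import numpy as np
import pandas as pd

def weekly_savings(actual_kwh: pd.Series, bau_kwh: pd.Series, rmse_hourly: float):
    """actual_kwh, bau_kwh: hourly series on a DatetimeIndex; rmse_hourly from the baseline fit."""
    weekly_actual = actual_kwh.resample("W").sum()
    weekly_bau = bau_kwh.resample("W").sum()
    savings = weekly_bau - weekly_actual              # positive = reduction vs. BAU
    hours_per_week = actual_kwh.resample("W").count()
    # If hourly errors were independent, weekly prediction error would scale with
    # sqrt(hours); autocorrelated errors would make the true band wider.
    band = 1.96 * rmse_hourly * np.sqrt(hours_per_week)
    return pd.DataFrame({"savings_kwh": savings,
                         "band_kwh": band,
                         "significant": savings.abs() > band})
```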

As with all modeling-based approaches, there is some level of statistical uncertainty resulting from, for example, short-term random and unpredictable variations in energy use. To quantify this uncertainty, the Coefficient of Variation of the Root Mean Squared Error (CVRMSE) and the 95%, 90%, 80% (and 67%) Confidence Intervals on the predicted values are provided.

4.2 Examples of Representative Analyzed Buildings

Buildings are diverse and unique, encompass complex energy use patterns, and require a detailed building-by-building analysis. For this study, buildings were grouped into four categories: 1-A, 1-B, 1-C, and 2. For Categories 1-A, 1-B, and 1-C, the total or grid-level impact is estimated, net of all detected and recorded events. Some buildings (notably Categories 1-B and 1-C) require more inquiry to understand the EE component of that overall impact because of the detected changes. Category 2 comprises the small portion of buildings that were not suitable for the approach because their use patterns were so irregular that it was not possible to create a BAU model with high statistical confidence. The table below outlines all four categories in greater detail.

| Building Category | Explanation | Number | % of Total |
|---|---|---|---|
| Category 1-A | No detected changes, low (<30%) CVRMSE ("well run" buildings with consistent use patterns), and recorded changes show decreases. These are the ideal case. | 24 | 87% (Categories 1-A, 1-B, and 1-C combined) |
| Category 1-B | Some detected changes, but recorded changes show a decrease and the model is otherwise reasonable. | 18 | |
| Category 1-C | BAU meets fitness metrics, but recorded changes result in increases. | 12 | |
| Category 2 | Everything that does not fit 1-A, 1-B, or 1-C, usually due to highly erratic, non-repeating patterns of consumption. | 8 | 13% |

Building Category 1-A: Ideal Buildings

Building 1, below, represents Category 1-A. It has a stable and well-fit baseline (CVRMSE = 19%).[1] The only recorded change corresponds to an implemented LED lighting measure. Coinciding with the recorded measure implementation date is an apparent reduction in energy use of approximately 42% below the business-as-usual prediction, which is sustained throughout the remainder of the evaluation period. This building also does well with respect to kW demand savings during peak periods (defined in this analysis as the hours of 12-8 PM in the months of June to October). During peak demand periods, the estimated reduction averaged 19.6 kW, compared to the ex-ante estimate of 16.6 kW. Figure 1 provides a visual representation of the building's peak demand history.

[1] A CVRMSE of 0% corresponds to a perfect (errorless) model. ASHRAE Guideline 14 specifies a maximum acceptable CVRMSE of 30% for whole-building hourly calibration models.
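The CVRMSE definition in footnote [1] and the peak-window demand reduction quoted above can be reproduced from hourly series with calculations like the following sketch; the series names are illustrative assumptions:

```python
# Sketch of two quantities used in this section: CVRMSE of the baseline fit
# (footnote [1]) and the average peak-period kW reduction (12-8 PM, Jun-Oct).
# Series names are illustrative assumptions; series are hourly on a DatetimeIndex.
import numpy as np
import pandas as pd

def cvrmse(actual: pd.Series, predicted: pd.Series) -> float:
    """RMSE of the model divided by mean actual use (0% = perfect model)."""
    err = actual - predicted
    return float(np.sqrt((err ** 2).mean()) / actual.mean())

def avg_peak_kw_reduction(actual_kw: pd.Series, bau_kw: pd.Series) -> float:
    """Mean (BAU - actual) kW over the peak window defined in this analysis."""
    idx = actual_kw.index
    peak = (idx.month >= 6) & (idx.month <= 10) & (idx.hour >= 12) & (idx.hour < 20)
    return float((bau_kw[peak] - actual_kw[peak]).mean())
```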

Figure 1: Building 1, weekly (top) and monthly (bottom) actual use (blue), along with BAU prediction (green), confidence intervals, and change events.

Building 1 is a very straightforward case, with a well-fit baseline and only one recorded change whose magnitude is large compared to the model error and comprises a large fraction of total energy use.

Building Category 1-B: Buildings with Detected and Accounted-For Changes

Building 2 was involved in a change event not related to program participation, which would traditionally disqualify it from a whole-building savings assessment. However, FirstFuel's algorithms detect these non-routine events, or regime changes, and adjust the BAU reference point to try to account for their effects on building electricity use. As shown in Figure 2 below, Building 2 has a stable and well-fit baseline before the installation of the EEM (CVRMSE = 15%). However, a few months before the EEM installation, there is a major change event that results in increased consumption. The detected change results in an 83 MWh increase (projected to a full year), or a 26% increase in the building's annual consumption over business as usual.
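The report does not detail how the BAU reference point is adjusted; one minimal illustration, assuming the detected event can be treated as a simple step change, is to shift the BAU prediction by the average pre-EEM residual from the event date onward:

```python
# Hedged sketch of adjusting the BAU reference for a detected non-routine
# event: shift the prediction by the estimated step change from the event
# date onward, so subsequent EEM savings are measured against the adjusted
# reference. This is an illustration only; the report does not publish the
# exact adjustment used.
import pandas as pd

def adjust_bau_for_event(bau_kwh: pd.Series, actual_kwh: pd.Series,
                         event_time: pd.Timestamp, eem_time: pd.Timestamp) -> pd.Series:
    """Add the mean residual observed between the detected event and the EEM
    installation to the BAU prediction from the event date onward."""
    window = (bau_kwh.index >= event_time) & (bau_kwh.index < eem_time)
    step = (actual_kwh[window] - bau_kwh[window]).mean()
    adjusted = bau_kwh.copy()
    adjusted[bau_kwh.index >= event_time] += step
    return adjusted
```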

Figure 2: Building 2, weekly (top) and monthly (bottom) actual use (blue), along with BAU prediction (green), confidence intervals, and change events.

The ability to detect these unexpected departures from BAU use patterns and estimate their effects provides insight into the behavior and use changes of the building that affect the meter-level savings from the EEM.

Building Category 1-C: Buildings that Raise Questions for the Operator

Category 1-C is typified by Building 3, in which the implemented measures have not resulted in a decrease in energy use. Building 3 has two recorded change dates, with no additional significant changes detected by the FirstFuel approach. The model is very well fit (CVRMSE = 3.3% during the baseline period). Immediately following the measure implementation, we find an increase in consumption. The data show that after the first set of EEMs, annual consumption increases by more than 1% (140 MWh per year), and after the second set of EEMs, consumption increases by 3% (344 MWh per year). In this circumstance, FirstFuel recommends contacting the building manager, owner, or tenant to troubleshoot the situation. Possible explanations are that the implemented measures coincided with other changes to building assets, or that usage patterns changed at the same time the energy efficiency measures were installed (and perhaps the two are related).

Figure 3: Building 3, weekly (top) and monthly (bottom) actual use (blue), along with BAU prediction (green), confidence intervals, and change events.

Building Category 2: Intractable Buildings that May Present Opportunities

Building 4 belongs to the group of buildings with erratic, non-repeatable energy consumption patterns and frequent changes, to the point where the baseline model fails to attain a reasonable level of prediction accuracy (baseline CVRMSE = 60%). While the change-detection algorithm identifies some changes, the results cannot be interpreted without additional information. However, buildings with such erratic patterns may present enormous opportunities for energy savings through the implementation of operational measures designed to bring the building under control. There are a total of 8 Category 2 buildings in this 62-building sample.

Figure 4: Building 4, weekly (top) and monthly (bottom) actual use (blue), along with BAU prediction (green), confidence intervals, and change events.

4.3 Summary of Results

Category 2 buildings are not included in the summary results below, so these results represent the 54 commercial customers in the PRP region for which a confident model of energy use could be built.

Across the 54 customers for which a confident baseline could be modeled, there is a net decrease in total consumption and demand after implementing EE measures. The total savings (which adjusts for detected changes in addition to the recorded changes) was 525 kW, while the recorded savings (which does not adjust for detected changes and represents the net load drop at the customer meter) was 413 kW across the 54 customers. The ex-ante savings estimate for this group of customers was 1,789 kW, so the total savings amounted to 29% of the ex-ante estimate and the recorded savings to 23% of the ex-ante estimate.

A total of 261 individual changes (EEMs) were recorded. There were an additional 46 detected events that affected the net kWh and kW savings estimates; 30 of the customers appear to have behavior and building use changes that affect their net energy savings as calculated from the customer's meter. In some buildings the impact of detected events was so large that there appears to be a net increase in energy consumption, whether measured from recorded changes alone or from recorded plus detected changes. Twelve of the 30 customers mentioned above exhibited an overall increase in energy use after the EEM was installed. The buildings with the largest variances are mostly found in process-centric industries, where loads are less predictable and often depend on external factors such as demand and production schedules. The analytics-enabled methodology supports the conclusion that customer-driven events substantially affect the actual grid-level savings.

4.4 Customer Analysis and Trends

The detailed building-level analysis presented above can be combined with existing EE program data to yield new insights that help SCE with its broad energy management goals. Figure 5 depicts the kW savings of recorded changes across the entire sample population, by building category. This chart indicates that:

- While most buildings exhibit net kW reductions, for the reasons mentioned previously several show net increases.
- Building Category 2 dominates both extreme ends of savings and losses.
- For two-thirds of the customers analyzed here, the implemented EE programs appear to result in some net load decrease.

Figure 5: Effect of recorded changes on building-level kW demand impacts.

Figure 6: Grid-level kW demand and kWh consumption impacts per building, by sector.

Figure 6 depicts the average grid-level impacts per building for each sector across the 54-building sample. Note the differences among building use types.

5. Conclusions and Next Steps

Verifying the actual impact of projected ex-ante savings estimates for a population of buildings and a heterogeneous set of measures is a large challenge, due to dynamic and ever-changing occupancy, usage patterns, equipment efficiency, operations, and weather. The major conclusions of this study are as follows:

1. FirstFuel's methodology estimated 531 kW of power savings from the total changes (261 recorded changes and 46 detected events) across the 54 customers for which a high-confidence baseline model could be fit. A sensitivity analysis for the Category 1-A through 1-C buildings (N=54) yielded the following results:

   a. At a 95% Confidence Interval,[2] the methodology would be able to confirm
      i. a 5% or smaller change in consumption in 73% of the buildings, and
      ii. a 10% or smaller change in consumption in 93% of the buildings.
   b. At an 80% Confidence Interval,[2] the methodology would be able to confirm
      i. a 5% or smaller change in consumption in 78% of the buildings, and
      ii. a 10% or smaller change in consumption in 95% of the buildings.

The sensitivity (width of the Confidence Interval) depends on the period of time (e.g., 30, 60, 90, or 180 days) used during the monitoring period to assess whether an event did indeed occur. The shorter the period after an event, the more statistical uncertainty there is about the change in consumption. Conversely, the longer the monitoring period after an event, the more statistically certain the change is, provided the effects of that change persist during that period. Essentially, one way to increase the certainty of analytics-enabled measurement is to provide a longer period of continuous observation (see the illustrative sketch at the end of this section).

[2] Using a 180-day monitoring period.

2. From a grid impact perspective, the approach found considerable variance between the traditional ex-ante savings projections and those generated by the analytics-enabled methodology when looking at building-level impacts of EEMs. Even after adjusting the BAU for detected changes, the measured savings were approximately a third of the ex-ante savings, indicating that there may be many additional building use changes that could not be accounted for with this adjustment methodology. Furthermore, traditional ex-ante estimations have limited application from a location-specific grid impact perspective. For such granular analyses, calculated or deemed ex-ante savings estimates are not the best proxy; those numbers carry higher confidence when used in an aggregated manner.

3. An analytics-enabled approach, with rigorous detection of non-recurring events, can inform whether and by how much the model should be adjusted to account for customer changes unrelated to EE measures. Individual buildings experience many unrecorded, large changes that often offset the load reduction of recorded EE measures at the grid level. Several of the analyses presented in this report illustrate this phenomenon (see the descriptions of Category 1-B and 1-C buildings). Traditional regulatory, attribution-focused, ex-ante estimation approaches do not account for large non-recurring events. The analytics-enabled methodology provides additional information by identifying the presence of customer-driven events that significantly affect the actual grid-level savings from EEMs. Having this knowledge empowers system planners to ascribe discrete value, including identifying EEMs that contribute to better management of overall grid impact goals. These types of EEMs could support a pay-for-performance model.

4. The analysis indicates that some commercial sectors may correlate with greater grid-level savings than others. Further analysis is needed to determine whether the grid-level impacts of EEMs and EE incentives are more discernible in certain sectors. The small-sample analysis (N=62) does suggest clear correlations between EE program incentives and kWh savings. In particular, sectors such as Schools, Offices, and Food Stores show a positive correlation between the provision of EE incentives and kWh savings. However, other sectors, such as Misc. Commercial and Warehouses, appear to show a negative correlation, i.e., a net energy increase against incentive payout. In these latter sectors, EE savings may be overwhelmed by external factors such as a large ramp-up in production schedules. Further study (outside the scope of this study), with more detailed knowledge of production schedules and a larger sample size, is needed to illuminate the effectiveness of EE incentives in these sectors.

The conclusions and sensitivity analysis results above demonstrate an analytics-based approach to estimating customer-specific EE savings at the grid level. The ability to draw broader, general conclusions is limited by the small sample size. This approach tested a methodology for estimating the impact of behavior changes (those that can be statistically detected) on overall energy savings. The methodology is unable to separate the behavior impact from the EEM impact when they occur simultaneously, however, and this limitation may account for the total savings estimate being about a third of the ex-ante estimate. Data quality issues may also play a part in this difference. Calculating the impact from only recorded changes (that is, measuring the net load drop without adjustments for detected changes) is a conservative approach to measuring energy savings, but may be an appropriate option for calculating savings at the customer level at high volume.
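As a purely illustrative approximation (not the study's uncertainty method), the dependence of detectable change on monitoring-period length noted in conclusion 1 can be sketched by letting the standard error of the average daily deviation shrink with the square root of the number of monitored days:

```python
# Illustrative approximation of how confidence-interval width narrows with a
# longer monitoring period (30, 60, 90, 180 days). Assumes roughly independent
# daily model errors, which real meter data only approximates; this is not the
# study's actual uncertainty calculation.
import numpy as np

def detectable_fractional_change(cvrmse_daily: float, n_days: int, z: float = 1.96) -> float:
    """Smallest fractional change in average daily use distinguishable from
    model noise at the given z value (1.96 ~ 95% CI, 1.28 ~ 80% CI)."""
    return z * cvrmse_daily / np.sqrt(n_days)

for days in (30, 60, 90, 180):
    print(days, round(detectable_fractional_change(0.20, days), 3))
```

Under these assumptions, with a daily CVRMSE of 20% the toy calculation suggests that roughly a 7% change is distinguishable after 30 days versus about a 3% change after 180 days, which illustrates why the longer monitoring periods in the sensitivity analysis support tighter confidence intervals.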