PROGRAM YEAR 1 (2011-2012) EM&V REPORT FOR THE RESIDENTIAL ENERGY EFFICIENCY BENCHMARKING PROGRAM


PROGRAM YEAR 1 (2011-2012) EM&V REPORT FOR THE RESIDENTIAL ENERGY EFFICIENCY BENCHMARKING PROGRAM

Presented to: Progress Energy Carolinas

Prepared by: Navigant Consulting, Inc.

Prepared for:
Progress Energy Carolinas
Raleigh, North Carolina

Presented by:
Stuart Schare, Director
Navigant Consulting, Inc.
Walnut Street, Suite 200
Boulder, CO

Primary contributing authors: Bethany Glinsmann, Jennifer Hampton, Robert Russell

Table of Contents

Executive Summary
1. Introduction
   1.1 Program Description
   1.2 Evaluation Objectives
2. Evaluation Methods
   2.1 Impact Evaluation Methodology
   2.2 Process Evaluation Methodology
3. Program Impacts
   3.1 Program-Wide Energy Savings
   3.2 Energy Savings by Season
   3.3 Energy Savings by Subgroup
   3.4 Actions Driving Program Savings
   3.5 Participation in Other Programs
4. Program Processes and Customer Satisfaction
   4.1 Web Portal
   4.2 Participant Recall
   4.3 Satisfaction
5. Conclusions and Recommendations
   5.1 Conclusions
   5.2 Recommendations
Attachment I. Regression Model Parameter Estimates
Attachment II. Under Separate Attachment
   Appendix A. Program Staff In-Depth Interview Guide
   Appendix B. Contractor Staff In-Depth Interview Guide
   Appendix C. Customer Survey Guide

Executive Summary

The Residential Energy Efficiency Benchmarking (REEB) Program is part of the portfolio of energy efficiency programs offered by Progress Energy Carolinas (PEC). The program has been providing about 50,000 residential customers bi-monthly Home Energy Reports (HERs) containing information about the customer's energy usage and tips to reduce their electric bill. This report covers Navigant Consulting, Inc.'s (Navigant) evaluation, measurement, and verification (EM&V) activities for the first twelve months of the REEB program (August 2011 through July 2012). The primary purposes of this evaluation are to 1) estimate gross and net annual energy impacts associated with the REEB program, and 2) identify, if possible, the specific program-induced behaviors and equipment installations that contribute to reductions in consumption.

Average savings were 224 kWh, or 1.23% of energy consumption, during the first twelve months of the program. Total program savings were 10.6 GWh during the twelve month period of August 2011 to July 2012. Total program savings were calculated by multiplying average energy savings (pro-rated for participants that moved during the program year) by the number of participants and subtracting savings attributable to other programs. Table 1-1 shows the key impact findings. Figure 1-1 shows savings from other Navigant-evaluated HER programs across the country. Program savings are within the typical range for the first year of a behavioral program.

Table 1-1. Primary Impact Findings, August 2011 - July 2012

Average Consumption of Customers Receiving HERs: 18,179 kWh
Average Savings: 224 kWh
Average Savings as Share of Consumption: 1.23%
Number of Participants: 50,129
Total Savings: 10.6 GWh

Source: Navigant analysis. Total savings are the product of average energy savings and the number of program participants, less savings attributable to other programs. Participants that moved during the program year accrue pro-rated savings.

Figure 1-1. Savings from Navigant-Evaluated HER Programs
Source: Navigant Consulting
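To make the total-savings arithmetic concrete, here is a minimal sketch in Python. The report's published values are used where available; the 0.5 pro-rating fraction for movers is a hypothetical placeholder (the evaluation pro-rated each mover by actual months of participation), as is rounding the other-program deduction to 0.1 GWh.

```python
# Sketch of the total-savings arithmetic; MOVER_FRACTION is hypothetical.
AVG_SAVINGS_KWH = 224     # average annual savings per participant (report value)
N_PARTICIPANTS = 50_129   # total participants (report value)
N_MOVERS = 3_564          # participants who closed accounts mid-year (report value)
MOVER_FRACTION = 0.5      # assumed average share of the year movers were active
OTHER_PROGRAM_GWH = 0.1   # savings attributed to other programs (report: <0.1 GWh)

full_year_kwh = (N_PARTICIPANTS - N_MOVERS) * AVG_SAVINGS_KWH
mover_kwh = N_MOVERS * AVG_SAVINGS_KWH * MOVER_FRACTION  # pro-rated savings
gross_gwh = (full_year_kwh + mover_kwh) / 1e6            # 1 GWh = 1e6 kWh
net_gwh = gross_gwh - OTHER_PROGRAM_GWH                  # net of other programs
print(f"gross ~ {gross_gwh:.1f} GWh, net ~ {net_gwh:.1f} GWh")
```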

Program Summary

The REEB program began in July 2011 and provides bi-monthly HERs to approximately 50,000 residential customers. The HERs provide information about participants' energy usage, how their energy usage compares to that of their neighbors, and action steps to reduce energy consumption. Evaluations of similar programs have found that, on average, customers respond to this information set by reducing their energy use by 1% to 3%. The program implementer, Opower, randomly assigned customers to the treatment (HER recipient) and control groups at a rate of 2:1.

Participants may enroll in a web portal for more detailed information and additional energy saving tips. The portal, supported by Opower, helps enrolled customers identify actions they can take to save energy, allows them to compare their energy use and costs with that of their neighbors, and provides users with an opportunity to create a personal energy saving plan. A total of 947 customers (1.9% of HER recipients) had created a web portal account as of the end of the evaluation period.

Evaluation Methodology

The EM&V assessment of REEB program activity during the twelve month period ending July 2012 included impact and process evaluations. The impact evaluation included analysis of monthly electricity bills for all participants and a group of control households, using bills from the pre- and post-program periods. Using a fixed effects regression model, Navigant estimated the average savings per participant. Navigant's process evaluation of the REEB program addressed research objectives related to customer behavior and satisfaction with the reports. Navigant collected this data through a combination of staff interviews, a customer survey, and a web portal review.

Key Program Findings

In addition to quantifying average savings of 1.23%, Navigant reached the following conclusions:

Program Impacts

1. Seasonal savings. Average savings appear to vary by season, but the differences are not statistically significant. Average seasonal savings were highest in the spring, both in kWh and percentage terms. The variation in seasonal savings likely reflects the program ramp-up as well as seasonal fluctuations in savings; Navigant recommends caution in interpreting differences in seasonal savings as strict seasonal differences.

2. High-saving subgroups. REEB participants that also participated in the Balanced Bill¹ program reduced their annual energy usage by 2.44%. A higher proportion of these participant respondents reported taking some type of action after receiving the reports, compared to non-Balanced Bill REEB participant respondents. The top third of energy users in the REEB program realized higher energy savings, in both percentage and kWh terms, compared to the middle and bottom thirds of energy users.

1 The PEC Balanced Bill program bills participating customers for a fixed amount each month, rather than an amount based on metered use. The Balanced Bill program is currently being phased out.

Program Processes

3. Actions driving program savings. Of the 16% of participants taking additional action after receiving the HERs, 7% are taking equipment-based actions, 7% are taking behavioral actions, and 2% are taking both behavioral and equipment-based actions. Of the participants reporting behavioral actions, various forms of curtailment were the most common, with lighting and HVAC as the predominant end uses. Of participants reporting equipment-based actions, lighting and building envelope measures were the most common. Respondents' reported motivations primarily relate to a desire to save, both in terms of reducing their electric bill and using less energy. Participant respondents who took additional action after receiving the reports cited the HERs as their top source of information.

4. Web portal. The vast majority of REEB participant survey respondents are not aware that the REEB program offers a web portal. Fewer than 2% of participants enrolled in the web portal during the first twelve months of the program. REEB participants enrolled in the web portal reduced their annual energy usage by 2.76%.² Overall, web portal visitors spend more time on administrative pages compared to pages that provide information about reducing their energy usage. Most web portal users do not visit the site more than once per month.

5. Recall. Most participants recalled receiving the HERs and stated that they always read the reports. Web portal participants had a lower recall rate (72%) compared to participants receiving only the printed reports (90%).

6. Participant satisfaction. Both participants and non-participants reported high levels of satisfaction with Progress Energy. Eighty percent of participants and 75% of non-participants are satisfied or extremely satisfied with Progress Energy. The majority of participants who receive printed HERs in the mail (58%) are satisfied with the reports. These satisfaction levels are similar to those found in previous evaluations of HER programs. Just over half of participants who receive printed HERs in the mail (56%) consider the reports valuable and the tips in the reports relevant to them.

Recommendations

Navigant recommends eight discrete actions for improving the REEB Program (see Table 1-2), based on insights gained through the evaluation effort for the first twelve months of the program. These recommendations provide PEC with a roadmap to fine-tune the REEB Program for continued success, and are organized around the following four broad objectives:

1. Attracting new and engaging current web portal participants
2. Increasing use of web and email analytic data
3. Increasing REEB participant channeling into other PEC programs
4. Improving program documentation

2 Note that participants that enrolled in the web portal are likely to be highly engaged in the REEB program. These participants likely would have realized greater savings than the average REEB participant, even in the absence of the web portal.

Table 1-2. Summary of Recommendations

Attracting New and Engaging Current Web Portal Participants
1. Proactively promote the web portal via the HER module.
2. Move forward with plans to email the HER to all participants with email addresses.
3. Incentivize return visits through ongoing campaigns and tool promotion.
4. Target web portal users with specific tips and engagement tactics.

Increasing Use of Web and Email Analytic Data
5. Use web analytics for goal setting and program design.
6. Use email analytics for goal setting and program design.

Increasing REEB Participant Channeling into Other PEC Programs
7. Capitalize on program channeling effects by using REEB to market other PEC programs.

Improving Program Documentation
8. Develop a well-documented theory of the behavior change that is expected as a result of participation in the REEB program.

Source: Navigant analysis.

These recommendations are discussed more fully in Section 5.

1. Introduction

The Residential Energy Efficiency Benchmarking (REEB) program provides bi-monthly Home Energy Reports (HERs) to approximately 50,000 residential customers. The HERs provide information about participants' energy usage, how their energy usage compares to that of their neighbors, and action steps to reduce energy usage. Navigant Consulting, Inc. (Navigant) performed evaluation, measurement, and verification (EM&V) of the first twelve months of the REEB program (August 2011 through July 2012). EM&V is a term adopted by Progress Energy Carolinas (PEC) and refers generally to the assessment and quantification of the energy and peak demand impacts of an energy efficiency or demand response program. EM&V also encompasses an evaluation of program processes and customer feedback, typically conducted through participant surveys.

1.1 Program Description

The REEB Program is part of the portfolio of energy efficiency programs offered by PEC. The REEB program began in July 2011 and has been providing about 50,000 residential customers bi-monthly HERs. The HERs contain four sections: 1) Last Month Neighbor Comparison, in which the customer's usage is compared to the usage of all neighbors and efficient neighbors; 2) Last 12 Months Neighbor Comparison, in which the customer's monthly usage over the past year is compared to the monthly usage of all neighbors and efficient neighbors; 3) Neighbor Efficiency Rank, in which the customer is ranked from 1 to 100 by monthly usage in comparison to all neighbors; and 4) Action Steps, which provides tips for saving energy via behavior change or equipment purchase. Evaluations of similar programs have found that customers respond to this information set by reducing their energy use by 1% to 3%.

The REEB program participants and controls were selected from the top 60% of customers based on usage.³ The program implementer, Opower, randomly assigned customers at a rate of 2:1 to the treatment (HER recipient) and control groups. There are approximately 50,000 customers in the treatment group and 25,000 customers in the control group. Approximately 6% of REEB participant and control accounts became inactive during the twelve month period ending June 2012. In order to maintain program participation, PEC plans to refresh the participant and control lists in January 2013. A refill wave of participants and controls will replace the accounts that have become inactive.

Participants may enroll in a web portal for more detailed information and additional energy saving tips. The portal, supported by Opower, helps enrolled customers identify actions they can take to save energy, allows them to compare their energy use and costs with that of their neighbors, and provides users with an opportunity to create a personal energy saving plan. Enrolled customers can also adjust several account settings through the portal. For example, users can choose how to receive the HERs; choices include by mail, by email, or both. Users can also opt out of receiving the reports altogether. PEC promotes the web portal by including a link on both the printed and emailed HERs. A total of 947 customers had created a web portal account as of the end of the evaluation period.

3 Customers on the do not mail list were excluded from the participant and control groups, which were designed to comprise 85% North Carolina customers and 15% South Carolina customers.

Opower and PEC each designate one point of contact to conduct and oversee day-to-day program operations; Opower's Engagement Director and PEC's Senior Program Specialist work together directly. These individuals coordinate two main types of ongoing communications regarding program progress and issues. Opower's Engagement Director sends the PEC Senior Program Specialist monthly reports that include the amount of energy saved by program participants, the participant opt-out rate, and web portal traffic statistics. Opower and PEC also conduct quarterly meetings to discuss aggregate program results for the quarter, review program feedback collected via PEC's customer service call center, and consider recommendations for adjustments to the program. Additional PEC staff members attend these quarterly meetings as needed, including representatives from the EM&V, marketing, and regulatory departments.

In addition to coordinating with Opower, PEC's Senior Program Specialist works internally to ensure that the REEB program promotes other relevant PEC programs when possible. For example, PEC's Appliance Recycling Program implementer was given the opportunity to provide content for REEB energy tips.

1.2 Evaluation Objectives

The primary purposes of this evaluation are to 1) estimate gross and net annual energy impacts associated with the REEB program, and 2) identify, if possible, the specific program-induced behaviors and equipment installations that contribute to reductions in consumption. Secondary objectives include answering the following questions:

1. Do savings vary by energy usage?
2. Do savings vary by income?
3. Do savings vary by participation in the Balanced Bill program?
4. What are current levels of customer satisfaction regarding the program?
5. What actions are customers taking to achieve electricity savings? Specifically, are customer actions behavioral or equipment-related? How does the program influence those actions?
6. Does participation in the REEB program influence participation in other PEC programs?

Ultimately, PEC can use these results for reporting impacts to the North Carolina Utilities Commission (NCUC) and the Public Service Commission of South Carolina.

2. Evaluation Methods

To evaluate the first year of the REEB program, Navigant conducted both impact and process evaluations. The impact evaluation used monthly billing data for participants and controls to quantify energy savings. The process evaluation consisted of staff and contractor interviews, participant and non-participant phone surveys, and an analysis of the web portal analytic data. This section of the report describes each of these evaluation activities in greater detail.

2.1 Impact Evaluation Methodology

Navigant's impact evaluation of the REEB program addressed research objectives related to the quantification of program energy savings. Navigant collected billing data and program tracking data to complete the impact analysis.

2.1.1 Description of the Data

PEC provided Navigant with monthly billing data for all program participants and controls spanning the period of April 2009 to July 2012. Opower provided Navigant with additional information on program participants and controls, including the date of the first report and a list of participants enrolled in the web portal. In preparation for the impact analysis, Navigant combined and cleaned the data provided by PEC and Opower. The dataset included 50,129 participants and 25,087 controls. Navigant removed the following customers and data points from the analysis:

- 715 customers with negative kWh values
- Customers with no first report date
- Outliers, defined as observations that differed from the mean by more than one order of magnitude⁴

Approximately 7% of participant and control accounts became inactive over the 15 month period of April 2011 through June 2012. The impact analysis includes all accounts up until the point of inactivation. Figure 2-1 shows the percentage of newly inactive participants and controls each month.

4 Observations with average daily usage less than kWh or greater than kWh were removed from the dataset.
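A minimal sketch of these cleaning steps follows, assuming a pandas DataFrame of billing records with columns customer_id, kwh, days, and first_report_date; the actual field names in the PEC and Opower extracts, and the exact outlier thresholds, are assumptions.

```python
import pandas as pd

def clean_billing(df: pd.DataFrame) -> pd.DataFrame:
    """Apply the cleaning rules described above (column names assumed)."""
    df = df.copy()
    df["adc"] = df["kwh"] / df["days"]  # average daily consumption per bill

    # Drop all records for customers with any negative kWh value
    negative_ids = df.loc[df["kwh"] < 0, "customer_id"].unique()
    df = df[~df["customer_id"].isin(negative_ids)]

    # Drop customers with no first report date
    df = df[df["first_report_date"].notna()]

    # Drop outliers: observations more than an order of magnitude from the mean
    mean_adc = df["adc"].mean()
    df = df[df["adc"].between(mean_adc / 10, mean_adc * 10)]
    return df
```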

Figure 2-1. Inactive Participants and Controls, by Month of Inactivation (April 2011 - June 2012)
Source: Navigant analysis

As of July 2012, 0.7% of participants (350 customers) had opted out of the HER program. This rate is typical of other HER programs that Navigant has evaluated. Figure 2-2 shows the number of participants that opted out in each month of the analysis period.

Figure 2-2. Participants that Opt Out of the HER Program, by Month (new opt-outs per month and cumulative percentage of participant households that have opted out)
Source: Navigant analysis

2.1.2 Validation of Control Group

The program implementer, Opower, created a Randomized Controlled Trial (RCT) by randomly assigning customers⁵ to the participant and control groups at a rate of 2:1. Use of an RCT eliminates the possibility of selection bias. Prior to the impact analysis, Navigant verified that customers were randomly assigned to the participant and control groups by comparing the average monthly energy usage of the participants and controls during the 18 month period prior to the start of the program (December 2009 to May 2011). Navigant found the two groups to be statistically equivalent, consistent with random assignment.

Figure 2-3 depicts the average energy usage for participants and controls during the 18 months prior to the start of the HER program. The blue line indicates the average energy usage for controls and the red dashed line indicates the average energy usage for participants. The two lines are essentially identical, indicating no difference in average usage patterns for participants and controls prior to the start of the HER program. Navigant conducted a statistical test on the difference in the mean energy usage for the two groups in each of the 18 months and found all differences to be statistically insignificant at the 90% confidence level.

Figure 2-3. Mean Energy Usage for Participants and Controls, by Month (average daily usage in kWh, December 2009 - May 2011)
Source: Navigant analysis

5 The HER program comprises the top 60% of customers based on usage. Customers on the do not mail list were excluded from the treatment and control groups. The program was designed to comprise 85% North Carolina customers and 15% South Carolina customers.
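The month-by-month equivalence test can be sketched as follows, assuming a DataFrame pre of pre-period bills with columns month, group, and adc (all assumed names); Welch's t-test is used here as a stand-in for the specific test Navigant applied.

```python
import pandas as pd
from scipy import stats

def equivalence_check(pre: pd.DataFrame, alpha: float = 0.10) -> pd.DataFrame:
    """Test participant vs. control mean usage in each pre-program month."""
    rows = []
    for month, grp in pre.groupby("month"):
        t_stat, p_val = stats.ttest_ind(
            grp.loc[grp["group"] == "participant", "adc"],
            grp.loc[grp["group"] == "control", "adc"],
            equal_var=False,  # Welch's t-test
        )
        rows.append({"month": month, "t": t_stat, "p": p_val,
                     "significant": p_val < alpha})
    return pd.DataFrame(rows)
```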

2.1.3 Description of Model

Navigant estimated program impacts via linear fixed effects regression (LFER) analysis. The LFER model combines both cross-sectional and time series data in a panel dataset. The regression equation is given by:

Equation 1

ADC_kt = α_k + β₁·Post_t + β₂·(Post_t × Participant_k) + ε_kt

where:

ADC_kt = The average daily usage in kWh for customer k during billing cycle t. This is the dependent variable in the model.

Post_t = A binary variable indicating whether bill cycle t is in the post-program period (taking a value of 1) or in the pre-program period (taking a value of 0).

Participant_k = A binary variable indicating whether customer k is in the participant group (taking a value of 1) or in the control group (taking a value of 0).

α_k = The customer-specific fixed effect (constant term) for customer k. The fixed effect controls for all customer-specific effects on energy usage that do not change over time, such as the number of household members, the size of the dwelling, or a thermostat that is always set at 62°F.

β₁, β₂ = Regression parameters corresponding to the independent variables.

ε_kt = The cluster-robust error term for customer k during billing cycle t. Cluster-robust errors account for heteroscedasticity and autocorrelation⁶ at the customer level.

The customer-specific fixed effect, α_k, captures all customer-specific effects on electricity use that do not change over time, including those that are unobservable. The parameter β₁ captures the average effect among control customers of being in the post-program period. In other words, it captures the effects of exogenous factors, such as an economic recession, that affect all customers in the post-program period but not in the pre-program period. The sum β₁ + β₂ captures the average effect among participants of being in the post-program period. The direct effect of the REEB program on energy usage is captured by the parameter β₂. In other words, this coefficient captures the difference-in-differences in average daily energy consumption between the participants and the control group across the pre- and post-program periods. Figure 2-4 provides a visual representation of the fixed effects model parameters.

6 Ordinary Least Squares (OLS) regression models assume the data are homoscedastic and not autocorrelated. If either of these assumptions is broken, the resulting standard errors of the parameter estimates are likely underestimated. A random variable is heteroscedastic when the variance is not constant. A random variable is autocorrelated when the error term in this period is correlated with the error term in previous periods.
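The report does not state which software was used to fit Equation 1; as an illustration only, the sketch below estimates it with the Python linearmodels package. The panel is indexed by (customer_id, bill_cycle); entity_effects absorbs the fixed effect α_k, and because Participant_k is time-invariant it is likewise absorbed, so only Post and Post × Participant enter as regressors. Standard errors are clustered by customer.

```python
import pandas as pd
from linearmodels.panel import PanelOLS

def estimate_equation_1(panel: pd.DataFrame):
    """Fit Equation 1 on a panel indexed by (customer_id, bill_cycle).

    Expects columns 'adc', 'post', and 'participant' (names assumed).
    """
    panel = panel.copy()
    panel["post_x_participant"] = panel["post"] * panel["participant"]
    model = PanelOLS(
        panel["adc"],                           # dependent variable ADC_kt
        panel[["post", "post_x_participant"]],  # beta_1 and beta_2 terms
        entity_effects=True,                    # customer fixed effects alpha_k
    )
    result = model.fit(cov_type="clustered", cluster_entity=True)
    savings = result.params["post_x_participant"]  # beta_2: daily kWh impact
    return savings, result
```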

Figure 2-4. Fixed Effects Regression Analysis Provides a Difference-in-Differences Estimator of Program Savings
Source: Navigant

The basic regression equation may be expanded to test for differences in program impacts for subgroups of participants. Navigant tested for differences for low income customers, Balanced Bill participants, low energy usage customers, and high energy usage customers. The expanded regression equation is given by:

Equation 2

ADC_kt = α_k + β₁·Post_t + β₂·(Post_t × Participant_k) + β₃·(Post_t × Subgroup_k) + β₄·(Post_t × Participant_k × Subgroup_k) + ε_kt

where all variables are described as above and Subgroup_k is a binary variable indicating whether customer k is in the subgroup of interest. The sum β₁ + β₃ captures the average effect among control customers in the subgroup of being in the post-program period. The sum β₁ + β₂ + β₃ + β₄ captures the average effect among participants in the subgroup of being in the post-program period. The direct effect of the REEB program on energy usage among participants in the subgroup is captured by the sum β₂ + β₄.
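Continuing the Equation 1 sketch above, Equation 2 simply adds the two subgroup interaction terms; the 0/1 column subgroup (e.g., Balanced Bill enrollment) is an assumed name.

```python
from linearmodels.panel import PanelOLS

# Extension of the Equation 1 sketch; `panel` is the same DataFrame as above,
# with 'post_x_participant' already constructed.
panel["post_x_sub"] = panel["post"] * panel["subgroup"]
panel["post_x_part_x_sub"] = panel["post_x_participant"] * panel["subgroup"]

result = PanelOLS(
    panel["adc"],
    panel[["post", "post_x_participant", "post_x_sub", "post_x_part_x_sub"]],
    entity_effects=True,
).fit(cov_type="clustered", cluster_entity=True)

# Program effect within the subgroup = beta_2 + beta_4
subgroup_effect = (result.params["post_x_participant"]
                   + result.params["post_x_part_x_sub"])
```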

2.1.4 Selection of Matched Controls for Web Portal Participants

REEB participants may enroll in the web portal to receive more detailed information about their energy usage and additional energy saving tips. The opt-in nature of the web portal creates the possibility of selection bias. Navigant compared the average monthly energy usage of the web portal participants, non-web portal participants, and controls during the 18 month period prior to the start of the program (December 2009 to May 2011). Navigant found that participants that enrolled in the web portal had higher average usage prior to the REEB program. This difference is largest during the summer months.

Figure 2-5 depicts the average energy usage for web portal participants, non-web portal participants, and controls during the 18 months prior to the start of the REEB program. The green line indicates the average energy usage for web portal participants. The blue line indicates the average energy usage for controls and the red dashed line indicates the average energy usage for non-web portal participants. The blue and red lines are essentially identical, indicating no difference in average usage patterns for non-web portal participants and controls. However, the green line is above the red and blue lines in each of the 18 months, indicating that participants who enroll in the web portal tend to use more energy. Therefore, Navigant selected a subset of control customers to serve as the controls for web portal participants.

Figure 2-5. Mean Energy Usage for Web Portal Participants, Non-Web Portal Participants, and Controls, by Month (average daily usage in kWh)
Source: Navigant analysis

Navigant selected a matched control for each web portal participant. Navigant compared the participants' and controls' average monthly usage in each of the twelve months prior to the REEB program. The control customer with the minimum sum of squared differences (SSD) was selected as the matched control for each participant. After selecting the matched controls, Navigant estimated the following regression equation to determine REEB program impacts for web portal participants:

Equation 3

ADC_kt = α_k + β₁·Post_t + β₂·(Post_t × Participant_k) + β₃·(Post_t × Web_k) + β₄·(Post_t × Participant_k × Web_k) + ε_kt

where all variables are described as above and Web_k is a binary variable indicating that participant k (or their matched control) has enrolled in the web portal. The direct effect of the REEB program on energy usage among participants enrolled in the web portal is captured by the sum β₂ + β₄.
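The minimum-SSD matching step can be sketched with NumPy; web and controls are assumed to be arrays with one row per customer and twelve columns of pre-program average monthly usage, and matching here is done with replacement. Names and shapes are assumptions.

```python
import numpy as np

def match_controls(web: np.ndarray, controls: np.ndarray) -> np.ndarray:
    """Return, for each web portal participant, the index of the control
    customer with the minimum sum of squared differences (SSD) across the
    twelve pre-program months. Shapes: web (n_web, 12), controls (n_ctrl, 12).
    """
    # Pairwise SSD via broadcasting; loop in chunks if the control pool is large
    diffs = web[:, None, :] - controls[None, :, :]   # (n_web, n_ctrl, 12)
    ssd = (diffs ** 2).sum(axis=2)                   # (n_web, n_ctrl)
    return ssd.argmin(axis=1)                        # best match per participant
```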

2.2 Process Evaluation Methodology

Navigant's process evaluation of the REEB program addressed research objectives related to customer behavior and satisfaction with the reports. Navigant collected this data through a combination of staff interviews, a customer survey, and a web portal review.

2.2.1 In-Depth Interviews

Navigant conducted in-depth interviews with PEC and Opower staff to gather information about roles and responsibilities; program goals, objectives, and structure; customer feedback; and data tracking. The evaluation team conducted one-on-one interviews with two PEC staff members and one Opower staff member. Table 2-1 summarizes the interview topic areas and objectives.

Table 2-1. In-Depth Interview Topics and Objectives

Roles and Responsibilities:
- Determine duties of PEC and contractor staff
- Identify key individuals for future conversations
- Determine how PEC staff interacts with the program contractor

Program Goals, Objectives, and Structure:
- Understand PEC's motivation for creating the program
- Identify program goals and objectives
- Understand future plans for the program
- Confirm evaluation objectives
- Understand processes and operations of the program
- Request program materials and other data

Customer Feedback:
- Assess customer response to the program
- Understand how customer feedback is captured and used
- Identify available data on customer feedback

Data Tracking:
- Understand how program data is captured and used

Source: Navigant Consulting

Appendix A presents the full PEC program staff interview guide and Appendix B presents the contractor interview guide.

2.2.2 Customer Survey

At Navigant's direction, a subcontractor conducted a telephone survey of 316 participant and 322 non-participant PEC customers. Navigant designed the survey to compare and contrast participant and non-participant perceptions and to develop an understanding of how the REEB program affects customer awareness and actions related to electricity. The evaluation team also included a battery of questions about customer behavior in the PEC residential lighting General Population Survey (GPS).⁷

7 Apex Analytics conducted the GPS survey as a separate effort outside of this program evaluation. Navigant may use the responses received in future evaluations.

Customer Survey Sample

Navigant stratified the REEB participants and controls on several dimensions, defined as follows:

- Participant: PEC customer included in the Opower treatment group.
- Non-Participant: PEC customer included in the Opower control group.
- Low Income: PEC customer with income at or below 150% of the Federal poverty line.⁸
- Balanced Bill: PEC customer enrolled in the PEC Balanced Bill program. The PEC Balanced Bill program bills participating customers for a fixed amount each month, rather than an amount based on metered use.
- Online: PEC customer that has registered on the PEC website, has opted in to receive special offers, and has created a REEB web portal account (if a REEB participant).

Table 2-2 defines each stratum and lists the quota and number of completes. Navigant stratified the population of REEB participants and controls by income, participation in the Balanced Bill program, and enrollment on PEC's web site, per the definitions given above. Navigant provided the subcontractor with a randomized customer sample for each stratum.

Table 2-2. Customer Survey Sample Strata, Quotas and Actual Completes

Strata | Participant | Low Income | Balanced Bill | Online
1      | Y           | Y          | Y             | N
2      | Y           | Y          | N             | N
3      | Y           | N          | Y             | N
4      | Y           | N          | N             | N
5      | Y           | n/a        | n/a           | Y
6      | N           | Y          | Y             | N
7      | N           | Y          | N             | N
8      | N           | N          | Y             | N
9      | N           | N          | N             | N
10     | N           | n/a        | n/a           | Y

Total completes: 638 (316 participant, 322 non-participant)

Customer Survey Instrument

The customer survey included ten sections, each with a different purpose. Several batteries of questions were only asked of individuals within certain strata. Table 2-3 notes each section, its purpose, and its relevant strata.

8 According to the US Department of Health and Human Services 2012 Poverty Guidelines.

Table 2-3. Customer Survey Question Categories, Purpose, and Relevant Strata

- Introduction (all strata): Explain purpose of call; confirm caller identity and PEC customer status.
- Energy Awareness (all strata): Assess general awareness of energy bills; assess general response to energy bills.
- Home Energy Reports (Printed) (strata 1-4 only): Assess awareness of printed reports; assess participant satisfaction with printed reports; understand how recipients use printed reports; identify participant actions taken to lower home energy use.
- General Online Behaviors (all strata): Understand how customers use the internet; assess how customers receive information about energy.
- Home Energy Reports (Email) (stratum 5 only): Assess awareness of email reports; assess participant satisfaction with email reports; understand how recipients use email reports; identify participant actions taken to lower home energy use.
- Web Portal (strata 1-5 only): Assess awareness of web portal; assess participant satisfaction with web portal; identify participant actions taken to lower home energy use.
- Behavior Changes (strata 6-10 only): Identify non-participant actions taken to lower home energy use.
- Equipment and Program Participation (all strata): Identify how customers use energy equipment in their home; understand whether customers recall participating in other PEC programs.
- Overall Satisfaction (all strata): Assess satisfaction with PEC.
- General Information (all strata): Understand general respondent characteristics.
- Home Characteristics (all strata): Understand respondent home characteristics.

Source: Navigant Consulting.

Appendix C presents the full customer survey guide.

Customer Survey Analysis

Navigant used SPSS software to create survey response tabulations and to identify statistical correlations across various data points. The evaluation team reviewed overall response frequencies for survey questions related to participant recall, participant engagement, participant/non-participant satisfaction, and participant/non-participant actions taken. Navigant also tested for statistically significant differences between strata combinations. Table 2-4 summarizes the SPSS cross tabulations that yielded significant results covered in this report; a sketch of this type of test appears below.
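Navigant ran these tests in SPSS; for illustration only, an equivalent cross tabulation with a chi-square test can be sketched in Python. The DataFrame columns stratum and took_action are assumed names, and the chi-square statistic stands in for whatever specific test the SPSS runs reported.

```python
import pandas as pd
from scipy.stats import chi2_contingency

def crosstab_test(survey: pd.DataFrame, strata_a: list, strata_b: list):
    """Cross-tabulate a yes/no response across two strata combinations."""
    survey = survey.copy()
    survey["combo"] = survey["stratum"].map(
        lambda s: "A" if s in strata_a else ("B" if s in strata_b else None)
    )
    subset = survey.dropna(subset=["combo"])
    table = pd.crosstab(subset["combo"], subset["took_action"])
    chi2, p_value, dof, _ = chi2_contingency(table)
    return table, p_value

# e.g., Balanced Bill (strata 1 & 3) vs. non-Balanced Bill (strata 2 & 4):
# table, p = crosstab_test(survey, strata_a=[1, 3], strata_b=[2, 4])
```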

Table 2-4. SPSS Cross Tabulation Outcomes Relevant to Findings

- Common Questions (1 vs. 2 vs. 3 vs. 10): Compare responses to questions asked of all strata across all strata.
- Non-Web Portal vs. Web Portal* (1-4 vs. 5): Compare responses between participants who have created an online account and those who have not.
- All Participants (1 vs. 2 vs. 3 vs. 5): Compare responses between each participant stratum.
- Balanced Bill vs. Non-Balanced Bill* (1&3 vs. 2&4): Compare responses from Balanced Bill REEB participants to non-Balanced Bill REEB participants.
- Low Income vs. Non-Low Income* (1&2 vs. 3&4): Compare responses from low income REEB participants to non-low income REEB participants.

Source: Navigant Consulting. *Outcome results weighted by proportion of the population represented by each stratum.

Navigant created additional SPSS cross tabulations but did not include them here, as they did not yield statistically significant findings.

Equipment-Based vs. Behavioral Actions

Navigant asked participant respondents what types of actions they took to save energy before and after receiving the HERs. Responses to these questions provide insight on what actions customers take to achieve energy savings, and whether those actions are equipment-based or behavioral. Survey questions were open ended to allow respondents to come up with answers on their own. The evaluation team then categorized the responses as either behavioral or equipment-based. For this purpose, Navigant defines equipment as any widget responsible for generating the savings without human action after the customer installs it. Navigant defines behavior as any non-equipment related action taken by a person that results in energy savings, or savings that would not otherwise happen without a person adjusting the equipment's settings or on/off status. Table 2-5 lists examples of responses and their relevant categories.

Table 2-5. Action Categories and Example Actions

Equipment-Based:
- Install ceiling or attic fans
- Install storm windows
- Insulate doors
- Install heat pump
- Install efficient light bulbs
- Replace appliances with energy efficient option
- Insulate attic

Behavioral:
- Turn off or turn down air conditioning
- Adjust programmable thermostat settings
- Shut down or use sleep option for electronics (computer, game consoles) when not in use
- Line dry washed clothing
- Lower water heater temperature
- Use only warm (or cold) water for washing clothes
- Turn off lights

Source: Navigant Consulting. Note: Survey questions were open ended to allow respondents to come up with answers on their own. This is not an exhaustive list of actions.

Web Portal Review

Opower presented Navigant with an overview of the web portal and provided access to the demonstration site so the evaluation team was able to review the site's tools and features in depth. The presentation and site access helped our team understand how the portal is structured, how the tools work, and what REEB participants experience when visiting the site. As described in Section 2.2.2, the customer survey included a battery of questions about the web portal to assess participant awareness and perceptions of the portal.

Opower also provided Navigant with a year's worth of web portal analytics data to allow our team to assess visitor usage patterns. The data covered traffic from July 2011 to June 2012. Table 2-6 describes each data point provided by Opower. The evaluation team reviewed this data with several intentions: assess monthly usage patterns to identify traffic patterns over time; analyze which individual pages users visit, and for how long; and review whether visitors returned to the site within each month. Navigant also categorized each individual page into one of several categories to better review user patterns on individual page types. Categories included administrative pages, educational tools, and the home page.

Table 2-6. Web Portal Analytic Data Provided by Opower

- Total number of site visits by month: The aggregate number of times users visited the site, including new and repeat visits.
- Total number of unique visitors by month: The aggregate number of individual people that visited the site.
- Number of unique visitors by individual page: The number of individual people that viewed each specific page.
- Number of page views by individual page: The number of times users viewed each specific page.
- Average time spent on site by month: The overall average time users spent on the site each month.
- Average time spent on each individual page: The average time users spent on each specific page.

Source: Opower web analytic data.

3. Program Impacts

In this section of the report, Navigant provides the program-wide energy savings as well as energy savings by season and for various subgroups of customers. The evaluation team discusses the behaviors and equipment-based actions that underlie the energy savings. The section concludes with a discussion of the program channeling analysis.

3.1 Program-Wide Energy Savings

REEB participants reduced their annual energy usage by 224 kWh (1.23%) on average in response to the REEB program. Savings are statistically significant at the 90% confidence level, with relative precision of 16%. Savings of 1.23% are in the typical range for the first year of a behavioral program. Figure 3-1 shows savings from other Navigant-evaluated HER programs across the country.

Figure 3-1. Savings from Navigant-Evaluated Home Energy Report Programs
Source: Navigant Consulting.

The REEB program had 50,129 participants, including 3,564 participants that closed their PEC accounts during the first twelve months of the program. Total program savings were calculated by multiplying average energy savings (pro-rated for participants that moved during the program year) by the number of participants and subtracting savings attributable to other programs, as discussed in Section 3.5. Total REEB program savings from August 2011 through July 2012 are 10.6 GWh, net of savings from other programs.⁹

A key feature of the RCT design (randomized controlled trial, as noted in Section 2.1) of the REEB program is that the analysis inherently estimates net savings, in that there are no participants who might otherwise have received the individualized reports in the absence of the program. While some customers receiving reports may have taken energy conserving actions or purchased high efficiency equipment anyway, the random selection of program customers (as opposed to voluntary participation) implies that the control group of customers not receiving reports is expected to exhibit the same degree of energy conserving behavior and purchases. Thus, there is no free ridership, and no net-to-gross adjustment is necessary. The exception is when the REEB program drives higher participation rates in other energy efficiency programs; these additional savings must be tracked in order to avoid double counting, as described in Section 3.5.

9 Total savings are 10.7 GWh and savings attributable to other programs are less than 0.1 GWh.

3.2 Energy Savings by Season

Average savings appear to vary by season, but the differences are not statistically significant. Average seasonal savings were highest in the spring, both in kWh and percentage terms. Navigant's previous evaluations of HER programs have shown that savings typically ramp up during the first 12 months of the program. Therefore, Navigant recommends caution in interpreting differences in savings across seasons as strict seasonal differences. Instead, the slight variation in seasonal savings likely reflects the program ramp-up as well as seasonal fluctuations in savings. Navigant's evaluation of the second year of the REEB program will investigate seasonal savings trends absent the program ramp-up trend. Figure 3-2 shows program savings by season, both in terms of percent savings and kWh savings (shown with 90% confidence interval). Section 3.4 discusses participant-reported actions that could result in peak savings.

Figure 3-2. REEB Program Savings by Season (percent savings and average seasonal kWh savings per household, with 90% confidence intervals)
Source: Navigant analysis. Note: Variation in seasonal savings likely reflects the program ramp-up as well as seasonal fluctuations in savings.

3.3 Energy Savings by Subgroup

Navigant tested for differences in program impacts for low income customers, Balanced Bill participants, low energy usage customers, high energy usage customers, and customers enrolled on the REEB web portal.

Low income¹⁰ REEB participants reduced their annual energy usage by 238 kWh (1.35%) on average in response to the REEB program, compared to 225 kWh (1.22%) for non-low income REEB participants. This difference is not statistically significant at the 90% confidence level. Figure 3-3 shows program savings by income level, both in terms of percent savings and kWh savings (shown with 90% confidence interval).

10 PEC defines low income customers as having incomes less than 150% of the federal poverty line.

Figure 3-3. REEB Program Savings by Income Level
Source: Navigant analysis.

REEB participants who also participated in the Balanced Bill¹¹ program reduced their annual energy usage by 455 kWh (2.44%) on average in response to the REEB program,¹² compared to 216 kWh (1.19%) for REEB participants that did not participate in the Balanced Bill program. This difference is statistically significant at the 90% confidence level. Figure 3-4 shows program savings by participation in the Balanced Bill program, both in terms of percent savings and kWh savings (shown with 90% confidence interval).

Note that although Navigant found that REEB participants that also participate in the Balanced Bill program have higher savings than REEB participants that do not participate in the Balanced Bill program, the evaluation team cannot conclude that the higher savings are a direct result of the Balanced Bill program. It could be the case that customers that enroll in the Balanced Bill program are systematically different from other customers and that they would have realized higher savings from the REEB program even if they had not also participated in the Balanced Bill program.

Figure 3-4. REEB Program Savings by Participation in the Balanced Bill Program
Source: Navigant analysis.

11 Approximately 5% of REEB participants and controls were enrolled in the Balanced Bill program.
12 The regression model compares REEB participants and controls that also participate in the Balanced Bill program to estimate the savings for REEB participants in the Balanced Bill program.

The evaluation team divided the REEB participants into three equally-sized groups based on their annual energy usage. The top third of energy users reduced their annual energy usage by 398 kWh (1.64%) on average. The middle and bottom thirds of energy users reduced their annual energy usage by 176 kWh and 114 kWh (1.03% and 0.86%) on average, respectively. Savings for high energy users are statistically significantly greater than savings for middle and low energy users at the 90% confidence level. Figure 3-5 shows program savings by usage level, both in terms of percent savings and kWh savings (shown with 90% confidence interval).

Figure 3-5. REEB Program Savings by Usage Level
Source: Navigant analysis.

In the first 12 months of the HER program, 1.9% of participants (947 customers) enrolled in the web portal. REEB participants enrolled in the web portal reduced their annual energy usage by 543 kWh (2.76%) on average in response to the REEB program and web portal. Savings are statistically significant at the 90% confidence level, with relative precision of 48%. The savings estimate has a wide confidence interval due to the small number of participants enrolled in the web portal, and because some of these participants had only been enrolled in the web portal for a few months at the time of this analysis.

This finding is consistent with Navigant's evaluation of a similar behavioral program featuring an opt-in web portal component. Navigant and Opinion Dynamics Corporation conducted an evaluation of an opt-in behavior-based web portal program implemented by C3 for Western Massachusetts Electric Company (WMECo). The web portal included a reward points system linked to participants' achieved energy savings. While there was a high degree of uncertainty due to small sample size and the opt-in structure, the evaluation established that participants who opted to create a web portal account saved more energy than participants who only received Energy Savings Reports (ESRs). The report's estimate of annual savings by households with a web portal account was 5.7%, compared to 0.36% among participants who only received ESRs.¹³

Note that although Navigant found that REEB participants enrolled in the web portal have higher savings than other REEB participants, the evaluation team cannot conclude that the higher savings are a result of the web portal itself. Participants that enrolled in the web portal are likely to be highly engaged in the REEB program.

13 Opinion Dynamics Corporation with Navigant Consulting. Massachusetts Three Year Cross-Cutting Behavioral Program Evaluation Integrated Report. July. Cutting_Behavioral_Program_Evaluation_Integrated_Report_ pdf

These participants likely would have realized greater savings than the average REEB participant, even in the absence of the web portal.

3.4 Actions Driving Program Savings

Navigant asked respondents if they took additional actions after receiving the HERs, and if so, what actions they took. As described in Section 2.2, the evaluation team then categorized these responses as behavioral or equipment-based actions. Participants' reported actions after receiving the HERs were roughly equally split between behavioral actions and equipment-based actions.¹⁴ Of the 16% of participants taking additional action after receiving the HERs, 7% are taking equipment-based actions, 7% are taking behavioral actions, and 2% are taking both behavioral and equipment-based actions.¹⁵ ¹⁶ Navigant noted several significant differences when comparing these responses between individual strata, as discussed later in this section. Figure 3-6 summarizes the breakdown of these responses among all participants.

14 79 respondents (25%) reported taking additional actions after receiving the HERs. Results for each stratum must be weighted to extrapolate to the participant population. After this weighting, Navigant concludes that 16% of participants are taking additional actions after receiving the HERs.
15 Navigant will attempt to quantify savings for participants reporting additional actions in the evaluation of the second program year. Findings are not likely to be statistically significant given the small sample size.
16 All participant respondents were asked a series of questions about whether they took energy efficiency actions: 1) "Prior to receiving the home energy reports [printed or email] did you ever take actions to reduce your home's energy consumption?" and, if yes, "What actions did you take?" followed by 2) "After you started receiving the home energy reports [printed or email] from Progress Energy, did you do anything to reduce your home's energy consumption that you weren't doing before receiving the reports?" and, if yes, "What actions did you take?" This data is based on respondent memory; attribution bias is possible. It is difficult for respondents to accurately recall actions they have taken in the past. Further, it is possible that respondents inaccurately recalled the timing of their actions before or after the home energy reports.
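The stratum weighting described in the footnote above is simple arithmetic. In the sketch below, the per-stratum population shares and action rates are hypothetical placeholders; the report publishes only the unweighted (25%) and weighted (16%) results.

```python
# Hypothetical per-stratum values: (share of participant population,
# share of stratum respondents reporting additional actions).
strata = {
    1: (0.02, 0.30),
    2: (0.08, 0.20),
    3: (0.05, 0.25),
    4: (0.83, 0.14),
    5: (0.02, 0.40),
}
weighted_rate = sum(pop_share * action_rate
                    for pop_share, action_rate in strata.values())
print(f"weighted share taking additional action ~ {weighted_rate:.0%}")
```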

Figure 3-6. Summary of Behavioral vs. Equipment-Based Breakdown of Additional Actions Taken by Participants after Receiving the Home Energy Reports
Source: Navigant analysis of customer survey responses. Includes all participant respondents (strata 1-5).

When asked what actions they took, the 79 participants reporting actions indicated taking a total of 105 new energy efficient actions after receiving the HERs. Figure 3-7 shows the number of respondents that reported having taken each action.

Figure 3-7. Behavioral vs. Equipment-Based Breakdown of Additional Actions Taken by Participants after Receiving the Home Energy Reports (number of respondents per action)
Source: Navigant analysis. n = 79 respondents. An asterisk (*) indicates equipment categories with PEC program incentives or rebates; a pound sign (#) indicates actions that affect peak demand.

The above figure shows the diversity of participant actions. For new behavioral actions, participants focused on various forms of curtailment, with lighting and HVAC as the predominant end uses. None of the behavioral actions were associated with specific Progress Energy programs. For equipment-based actions, participants focused on lighting and building envelope measures. However, four respondents reported purchasing energy efficient appliances. Of these measures, lighting, insulation, and AC measures were eligible for Progress Energy program incentives or rebates (Twist and Save and Home Energy Improvement). While these responses do not substantiate causation between participation in the program and specific energy efficient actions, they do show that a quarter of all sampled participants are taking new actions after receiving the HERs. As noted earlier, after applying the appropriate weights by survey strata, Navigant estimates that 16% of REEB participants are taking new actions after receiving the HERs. This finding is only an indicator of influence and does not suggest that the reported actions led directly to the estimated savings.

Use of monthly billing data does not allow for quantification of peak load impacts. To provide an indication of the potential peak load impacts of the REEB program, the evaluation team categorized participant self-reported actions as affecting peak demand, off-peak demand, or both. Participant self-reported actions taken after receiving the HERs that could impact peak demand include: turning off or turning down the air conditioner, other air conditioner measures, installing a ceiling fan, installing building envelope measures (window, door, and insulation measures), and using blinds, drapes, and shutters.

Participants also reported unplugging appliances and lowering the temperature setting on their water heater; these actions reduce consumption throughout the day, during both peak and off-peak hours. Additional self-reported actions include shutting off or putting to sleep electronics, turning off lights, installing energy efficient lighting and appliances, setting programmable thermostats to 69 degrees or lower, using warm or cold water to wash clothes, and line drying clothes; these actions have the largest impacts during off-peak hours. Although only 16% of participants are taking additional actions after receiving the HERs, many of these actions have the potential to reduce peak load.

Among the 16% of participants taking additional action after receiving the HERs, approximately one fourth are taking at least one action related to another PEC program, such as the Home Energy Improvement Program. Figure 3-8 shows the proportion of participant responses related to other PEC programs.

Figure 3-8. PEC Program-Related Additional Actions Taken by Participants after Receiving the Home Energy Reports
Source: Navigant analysis of customer survey responses. Includes all participant respondents (strata 1-5).

Navigant asked these same respondents what motivated them to take action. Respondents' reported motivations primarily relate to a desire to save. A large percentage (45%) of respondents listed lower electric bills as their motivation, and 37% cited a desire to use less energy. Figure 3-9 shows how respondents described their motivation for taking additional action after receiving the HERs.¹⁷

17 "Other" includes home energy reports, with 1% of respondents mentioning them as a motivation.