A Comparison of Energy Efficiency Metrics for School Buildings

Napong Tanatammatorn, Timothy R. Anderson
Dept. of Engineering and Technology Management, Portland State University, Portland, OR - USA

Abstract--A set of 49 elementary, middle, and high schools from a single school district are examined using five existing metrics and compared with the data envelopment analysis (DEA) benchmarking tool. The DEA results are then decomposed to examine the impact of scale efficiency, and the results are compared to an earlier study of buildings in Taiwan.

I. INTRODUCTION

Peer efficiency evaluation using data envelopment analysis (DEA) was first introduced by Charnes, Cooper, and Rhodes in the late 1970s [1,2]. Since its introduction, DEA has been applied in thousands of applications [3], but building energy efficiency has been examined only infrequently. A pair of papers stand out, using the performance of buildings in Taiwan to find patterns of efficiency and inefficiency [4,5]. This paper uses the same framework as [4] for measuring the effectiveness of facility energy management, with a dataset of schools from within a single school district to ensure homogeneity of the buildings. Given the current budget challenges of public school districts nationwide, opportunities for efficiency improvements could be very useful.

The school district examined was the West Contra Costa Unified School District near San Francisco, California. The data was tabulated in a report by MIT researchers Rogelio Palomera-Arias and L. K. Norford for the California Energy Commission's Public Interest Energy Research (PIER) program [6].

Lee and Lee's paper [4] proposed the use of DEA to separate scale factors from management factors to obtain the energy management effectiveness of 47 government buildings in Taiwan. It was stated that the focus on management factors may serve as an alternative to the traditional single benchmarking indicator, which measures only overall building energy performance as energy consumption per unit output, i.e., floor area. Lee and Lee presented the relationships among DEA's overall efficiency, pure technical efficiency, and scale efficiency. It was concluded that a high scale efficiency rating indicates a direct relationship between poor energy efficiency and ineffective energy management, and that buildings with a poorer scale usually have better energy management than appears at first glance.

Palomera-Arias et al.'s paper developed a benchmarking system, called the Ranking Index, for school energy consumption using school statistics and energy consumption data. The report analyzed 49 schools (39 elementary schools, 5 middle schools, and 5 high schools) in the West Contra Costa Unified School District (WCCUSD), located in the northeastern San Francisco Bay area. The report highlighted the best and worst energy users, revealing schools in need of improved energy conservation measures. The authors suggested that this benchmarking system could be copied by districts across the country to present energy-use data that helps facility managers determine how well each building is performing [6].

Our paper compares the distribution of the pure technical efficiency results and the correlations among DEA's efficiency results with the original methodology paper (by Lee and Lee). In addition, this paper contrasts the DEA efficiency results with the ranking results from the original data paper (by Palomera-Arias et al.).
II. LEE AND LEE'S METHODOLOGY

Lee and Lee's benchmarking process consisted of three steps: a regression analysis, a DEA efficiency analysis, and an examination of scale and management factors.

First, a regression analysis was used to determine climate-adjusted energy consumption, since the evaluated buildings were located in different regions of Taiwan. The regression technique is recommended by the U.S. Environmental Protection Agency (EPA) and considered an accepted method in the building energy performance field [4].

Second, two DEA models, the constant returns to scale (CRS) model and the variable returns to scale (VRS) model, following the work by Charnes, Cooper, and Rhodes [1] and Banker, Charnes, and Cooper [7], were used to determine overall building energy performance and energy management effectiveness. The models were input-oriented, having the climate-normalized energy consumption as a single input, and building area and number of occupants as outputs. The CRS model was used to determine DEA overall efficiency (OE) and the VRS model was used to determine DEA pure technical efficiency (PTE).

Third, the DEA scale efficiency (SE) was calculated by dividing OE by PTE, representing the scale factor that separates management factors from the overall efficiency. The distribution of the PTE results was presented in the paper to show the management performance of the evaluated buildings. In addition, the relationships among the three DEA efficiencies (OE, PTE, and SE) were presented graphically, but the correlation results were not provided. The results were discussed in the paper as follows:

- The DEA PTE indicates the management performance of the evaluated buildings. Most of the buildings with an OE lower than 60% have a PTE below 80%. The PTE shows a trend of declining with the OE.

- The SE of these buildings falls mostly in a high range. Thus, it can be concluded that poor energy efficiency is mainly a result of ineffective energy management.
- The PTE improves as SE declines. This may mean that buildings with a poorer scale usually have better energy management.

III. PALOMERA-ARIAS'S RANKING INDEX AND EPA'S ENERGY STAR

Palomera-Arias et al. proposed a benchmarking system, called the Ranking Index, that analyzes energy consumption for 49 schools in WCCUSD (39 elementary, 5 middle, and 5 high schools). The system uses multiple energy consumption indicators to produce multiple school rankings, and a Ranking Index was defined to summarize the different sets of rankings. The Ranking Index number of each school was calculated by taking the average of the school's rank positions under each indicator. For example, if School A is ranked #4, #2, and #3 under Indicators a, b, and c respectively, then the Ranking Index number of School A is 3, the average of 4, 2, and 3. (A minimal sketch of this rank-averaging computation appears after Table 2 below.) The eight indicators utilized included both absolute and relative values: total energy (BTU), total energy cost, energy (BTU) per student, energy cost per student, energy intensity (BTU/sq.ft.), cost per unit floor area, energy (BTU) per student-hour of operation, and energy intensity per hour of operation (BTU/sq.ft.-hr).

In addition to the Ranking Index, the Energy STAR for Schools (K-12) benchmarking tools, by the U.S. Environmental Protection Agency (EPA) and the U.S. Department of Energy (DOE), were used for comparison with the Ranking Index system. The Energy STAR system is based on a scale of 1 to 100. The score is calculated by comparison to similar buildings (peers) from a national survey conducted by the DOE using statistically representative models. Buildings with an Energy STAR score of 75 or greater are considered to be in the nationwide top 25 percent in terms of energy performance and may qualify for the Energy STAR label. The information used in the computation included building area, location, hours of occupation, months used, student population, percent of the floor area on cooling and heating, presence of cooking, and heating degree days [8].

The basic characteristics of the schools being evaluated are shown in Table 1. The annual energy consumption from both sources of energy, natural gas and electricity, was used in the analysis. The schools' energy consumption characteristics are shown in Table 2. The results from Palomera-Arias's paper are presented in the top part of Table 3 in section V. The authors indicated that relative figures, consumption or cost per student, were found to be better indicators of the schools' performances. They also suggested that the results would help identify the schools that would benefit the most from implementing energy conservation measures. However, it was also noted that the results were based on annual energy consumption data, which did not take weather effects into account; the schools might be subject to microclimates even though they are all located within the same school district.

IV. METHODOLOGY FRAMEWORK

In this paper, the DEA methodology from Lee and Lee's paper was used to analyze the school data from Palomera-Arias et al.'s paper. However, the first step from Lee and Lee, using regression analysis to determine climate-adjusted energy consumption, was omitted here. The reason was to follow the original data paper by Palomera-Arias et al., which did not include microclimate adjustments.
We assume that all schools experience similar climate conditions since they are located within the same school district, which covers an area of approximately 65 square miles in Contra Costa County, in the northeastern San Francisco Bay area. Finally, correlation analysis was performed to find relationships among the DEA efficiency results, the energy efficiency indicators, and the other school data available in the original data paper.

TABLE 1: SCHOOL CHARACTERISTICS

School Type   Count   Floor Area (sq.ft.), Avg. (Std. Dev.)   Student Population, Avg. (Std. Dev.)
Elementary    39      43,690 (15,724)                         499 (149)
Middle        5       122,530 (28,673)                        1,092 (121)
High          5       185,657 (25,073)                        1,537 (417)

TABLE 2: SCHOOL ENERGY CONSUMPTION CHARACTERISTICS

School Type   Electr. Contr.   Electr. Cost Contr.   kBTU/student   Cost $/student   kBTU/sq.ft.   Cost $/sq.ft.   kBTU/sq.ft.-hr   kBTU/person-hr
Elementary    44%              76%                   2,682          n/a              n/a           n/a             n/a              n/a
Middle        36%              72%                   3,470          n/a              n/a           n/a             n/a              n/a
High          52%              82%                   5,106          n/a              n/a           n/a             n/a              n/a
All           44%              77%                   3,010          n/a              n/a           n/a             n/a              n/a

Note: All numbers represent average values.
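To make section III's rank-averaging concrete, the following is a minimal sketch in Python/pandas. The school names, the three indicator columns, and all values are hypothetical stand-ins invented for illustration; the actual system averaged ranks over all eight indicators.

import pandas as pd

# Hypothetical per-school indicator values; lower is better for each
# indicator, as with the consumption and cost indicators in section III.
df = pd.DataFrame(
    {
        "kbtu_per_student": [2500, 3100, 2800],
        "cost_per_student": [310, 280, 350],
        "kbtu_per_sqft": [45, 60, 52],
    },
    index=["School A", "School B", "School C"],
)

# Rank the schools under each indicator (1 = best), then average the
# rank positions to obtain each school's Ranking Index number.
ranks = df.rank(axis=0, method="min", ascending=True)
df["ranking_index"] = ranks.mean(axis=1)
print(df.sort_values("ranking_index"))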

A. The DEA models

The input-oriented DEA model is used to minimize the input (energy consumption) while still providing at least the same level of service. In this paper, we strictly maintained the same input (BTU) and outputs (floor area and number of students) used in Lee and Lee's paper, even though the school data from the original data paper by Palomera-Arias et al. offered more data fields that could have served as additional inputs and outputs. A schematic of the DEA model, with energy consumption as the input and floor area and number of students as the outputs, is shown in Figure 1.

[Figure 1: DEA Model Schematic Showing Input/Output]

The DEA models used to determine the overall efficiency (OE), pure technical efficiency (PTE), and scale efficiency (SE) are described below.

1. Constant returns to scale (CRS) model to find overall efficiency (OE)

The CRS model is the standard DEA model introduced by Charnes, Cooper, and Rhodes in 1978 [1]. The model is also widely called the CCR model, after the names of its authors. The main concept of this model is that the inputs and the outputs are assumed to scale proportionally: doubling the inputs of a decision making unit (DMU) implies that the expected outputs should also double. The efficiency results of the CRS model yield the overall efficiency (OE) defined in Lee and Lee's paper. For each DMU o, the input-oriented CRS model is expressed as the linear program:

\begin{aligned}
\min_{\theta,\,\lambda}\quad & \theta \\
\text{s.t.}\quad & \sum_{j=1}^{n} \lambda_j \, y_{r,j} \ge y_{r,0}, \qquad r = 1,\dots,s, \\
& \sum_{j=1}^{n} \lambda_j \, x_{i,j} \le \theta \, x_{i,0}, \qquad i = 1,\dots,m, \\
& \lambda_j \ge 0, \qquad j = 1,\dots,n,
\end{aligned}

where \theta is the DEA efficiency of the DMU under consideration, x_{i,j} is input i of DMU j (from m inputs), x_{i,0} is input i of the DMU under consideration, y_{r,j} is output r of DMU j (from s outputs), y_{r,0} is output r of the DMU under consideration, and \lambda is the vector of non-negative weights or intensity variables.

2. Variable returns to scale (VRS) model to find pure technical efficiency (PTE)

The VRS model was introduced by Banker, Charnes, and Cooper in 1984 [7] and is widely called the BCC model. It adds to the CRS model the constraint that the intensity weights sum to one, \sum_{j=1}^{n} \lambda_j = 1. This additional constraint distinguishes the VRS model from the CRS model in that doubling the inputs of a DMU does not necessarily imply that the expected outputs should also double. The efficiency results yield the pure technical efficiency (PTE) used in Lee and Lee's paper.

3. Scale efficiency (SE)

The PTE is obtained by separating the scale efficiency (SE) from the OE. The relationship among OE, PTE, and SE is:

Overall Efficiency = [Pure Technical Efficiency] x [Scale Efficiency]

B. Determining efficiency results

This paper utilized OpenSolver Version 0.982, an Excel add-in that extends Excel's built-in Solver with a more powerful linear programming engine [9], to solve the linear programming problems of the CRS and VRS models described in the previous sections. The OE and PTE were obtained directly from the linear programming outputs, and SE was then calculated from their ratio. A small worked sketch of these computations appears at the end of this section.

C. Correlation analysis

Correlation analysis was performed using the SPSS software package to find relationships between the DEA efficiency results, various efficiency indicators, and other school data fields.
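The computations above were carried out in OpenSolver; purely as an illustration of the CRS and VRS envelopment programs and of the SE = OE/PTE decomposition, the following is a minimal Python sketch using scipy.optimize.linprog. The three-school dataset is invented for the example and is not the WCCUSD data.

import numpy as np
from scipy.optimize import linprog

def dea_input_oriented(X, Y, o, vrs=False):
    """Input-oriented DEA efficiency of DMU o.

    X: (n, m) inputs; Y: (n, s) outputs. vrs=False gives the CRS (CCR)
    model for OE; vrs=True adds sum(lambda) = 1 for the VRS (BCC) model,
    giving PTE. Decision vector: [theta, lambda_1, ..., lambda_n].
    """
    n, m = X.shape
    s = Y.shape[1]
    c = np.zeros(n + 1)
    c[0] = 1.0                                   # minimize theta
    A_ub, b_ub = [], []
    for r in range(s):                           # sum_j lam_j y_rj >= y_ro
        A_ub.append(np.concatenate(([0.0], -Y[:, r])))
        b_ub.append(-Y[o, r])
    for i in range(m):                           # sum_j lam_j x_ij <= theta x_io
        A_ub.append(np.concatenate(([-X[o, i]], X[:, i])))
        b_ub.append(0.0)
    A_eq = [np.concatenate(([0.0], np.ones(n)))] if vrs else None
    b_eq = [1.0] if vrs else None
    bounds = [(0.0, None)] * (n + 1)             # theta, lambda >= 0
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  A_eq=None if A_eq is None else np.array(A_eq),
                  b_eq=b_eq, bounds=bounds, method="highs")
    return res.fun

# Hypothetical schools: input = annual energy (MBTU);
# outputs = floor area (sq.ft.) and number of students.
X = np.array([[1200.0], [2600.0], [5400.0]])
Y = np.array([[43000.0, 500.0], [120000.0, 1100.0], [186000.0, 1500.0]])

for o in range(X.shape[0]):
    oe = dea_input_oriented(X, Y, o, vrs=False)   # overall efficiency
    pte = dea_input_oriented(X, Y, o, vrs=True)   # pure technical efficiency
    print(f"DMU {o}: OE={oe:.3f}  PTE={pte:.3f}  SE={oe / pte:.3f}")

For the actual analysis, each of the 49 schools would be solved twice (once under CRS and once under VRS), exactly as OpenSolver was used in section IV.B.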
V. RESULTS

A. Comparison to Lee and Lee's paper

The distribution of the schools' PTE results is consistent with the distribution presented by Lee and Lee, in that buildings with effective energy management (PTE = 1.0) accounted for about 10 to 15 percent of the sample and the majority fell between 40 and 80 percent efficiency. See Figure 2 for the comparison of the PTE distributions.

[Figure 2: Comparison of Pure Technical Efficiency Distributions]

Considering the relationships among OE, PTE, and SE, the trend lines from the schools' DEA efficiency results are consistent with the trends identified by Lee and Lee, as shown in Figures 3 to 5.

Note that only trend lines are compared, since regression coefficients were not reported in the original paper by Lee and Lee. In addition, the school results are consistent with the original paper by Lee and Lee in that the majority of schools had a high scale efficiency of over 0.70, meaning that inefficiency would be a result of poor energy management.

[Figure 3: Relation between pure technical efficiency and overall efficiency]

[Figure 4: Relation between scale efficiency and overall efficiency]

[Figure 5: Relation between pure technical efficiency and scale efficiency]

B. Comparison to Palomera-Arias et al.'s paper

This section compares our DEA analysis results to the results presented in the original paper by Palomera-Arias et al. Table 3 shows the six indicators used by the original data paper to report the best and the worst performers, together with our comparable DEA results.

Prior to discussing the comparison results, it is worth noting the relationships found among the different DEA efficiency scores: OE, PTE, and SE. OE and PTE generally identified similar sets of best and worst elementary schools, with PTE identifying six more efficient schools and giving middle and high schools higher scores. The main difference between the OE and PTE results was that high and middle schools received poor OE results while getting higher PTE scores. This showed that middle and high schools benefited from the scale constraint in the variable returns to scale (VRS) DEA model. The sets of schools getting the best and the worst SE results varied from those of OE and PTE. As expected, high schools had low SE scores, which explained their low OE performance but much improved PTE scores.

TABLE 3: DEA RESULTS COMPARED AGAINST PALOMERA-ARIAS ET AL.'S PAPER

Indicator                        Best                                                        Worst
1. Energy STAR                   More than 10 schools with 95+ points                        All high schools & Downer E.
2. Ranking Index                 Highland E., Ford E., Montalvin E., Stege E., Fairmont E.   De Anza H., Kennedy H., El Cerrito H., Verde E., Downer E.
3. Energy (BTU) per student      Highland E., Cesar Chavez E., Ohlone E.                     Downer E., Kennedy H., El Cerrito H.
4. Energy cost ($) per student   Highland E., Ford E., Fairmont E.                           Kennedy H., Richmond H., Downer E.
5. Energy (BTU) per sq.ft.       Highland E., Stege E., Montalvin E.                         Downer E., Murphy E., Verde E.
6. Energy ($) per sq.ft.         Montalvin E., Helms E., Fairmont E.                         Hercules E., Richmond H., Pinole Valley H.
7. Overall efficiency            Highland E. (1.0), Stege E., Montalvin E.                   Downer E., Murphy E., Verde E., all HS < 0.5, all MS
8. Pure technical efficiency     Grant E., Highland E., Montalvin E., Stege E.,              Downer E., Murphy E., Verde E.
                                 Pinole Valley H., Richmond H., Helms M. (all 1.0)
9. Scale efficiency              Highland E., Hanna Ranch E., Riverside E.,                  Pinole Valley H., Richmond H., Kennedy H., Hercules E.
                                 Kensington E. (14 schools over 0.95)

Note: Results for indicators 1-6 were presented in the original paper by Palomera-Arias et al.

1. Ranking Index: The sets of best/worst performers identified by OE and PTE were consistent with the Ranking Index results. The one exception was that high schools were rated better by PTE and therefore were not ranked in the bottom three. Some differences in the named schools were influenced by the much wider range of indicators included in the Ranking Index system; the DEA methodology only included three factors: energy consumption, floor area, and student population. Correlation analysis shows that OE has a very strong negative relationship with the Ranking Index numbers, while PTE and SE have moderately strong negative correlations with the Ranking Index. See Table 4 for the correlation results.

2. Energy STAR Score: The Energy STAR results were largely consistent with those of the Ranking Index system, so the same observations about the best/worst performers apply, and the same scale penalty for high schools was also observed. Note that the Energy STAR system identified more than 10 schools with a score above 95 (out of 100), meaning these schools had essentially the same efficiency relative to the nationwide peer group. The Ranking Index system, by contrast, was based on a much smaller sample, and its results do not translate as well to relative efficiency among the schools being evaluated. Correlation analysis shows that OE has a strong relationship with the Energy STAR scores, while PTE has a weak correlation with them; SE has a moderate correlation with the Energy STAR score.

3. Other Relative Indicators: The best/worst performers from the perspective of DEA's OE are consistent with the results from the energy intensity perspective (Indicator #5, in BTU/sq.ft.). As mentioned earlier, PTE added six efficient schools to the OE results; four of them (Grant Elementary, Pinole Valley High, Richmond High, and Helms Middle) were not included in any group of best performers in the original paper, while the others already appeared among the OE best performers and in some of the original paper's results. Correlation analysis shows that OE has strong negative correlations with Indicators #3 and #5. OE's strongest correlation is with Indicator #5 (BTU/sq.ft.), which confirms the above comparisons of the best/worst performers. As expected, PTE's correlations with these indicators are less significant; PTE has its strongest correlations with Indicator #5 and OE.

C. Extended correlation results & discussion

In addition to the direct comparisons to both original papers, we conducted a correlation analysis for the energy performance indicators and the other school data fields available from the original data paper. The correlation results are shown in Table 4 and Table 5.

As shown in Table 4, correlations were determined between the DEA efficiency results and the indicators used in the original data paper to find trends and agreements among them. Among all the indicators (#1 to #6) having moderate to strong correlations with OE, the Ranking Index (#2) and energy per unit area (#5) were the strongest, with coefficients above 0.9 in magnitude. In addition, OE has a moderate correlation with PTE. From the PTE perspective, moderately high correlations were found with energy per unit area (#5) and OE. Overall, PTE has weaker correlations with the other performance indicators than OE does, and PTE has a very weak correlation with the Energy STAR score (#1).
From the SE perspective, a moderate correlation was found with the Energy STAR score. The other indicators have weak correlations with SE, with correlation coefficients of less than 0.5.

Since we followed the DEA methodology used by Lee and Lee, only three data fields (energy consumption in BTU, floor area, and student population) were used, and the others were omitted from our analysis. Most of these omitted data fields were, to some degree, incorporated into the Ranking Index and Energy STAR systems. These fields included school hours and months of operation, percent area with air conditioning, the contribution of electricity to all energy consumption, and the cost of energy.

[Table 4: Correlations between the DEA results (OE, PTE, SE) and the relative indicators used by Palomera-Arias et al.: 1. Energy STAR; 2. Ranking Index; 3. Energy (BTU) per student; 4. Energy cost ($) per student; 5. Energy (BTU) per sq.ft.; 6. Energy ($) per sq.ft.; 7. Overall efficiency (OE); 8. Pure technical efficiency (PTE); 9. Scale efficiency (SE)]
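The correlation matrices in Tables 4 and 5 were computed in SPSS. As an illustration only, an equivalent computation in Python/pandas is sketched below; the column names and values are hypothetical stand-ins for a few of the fields correlated in those tables, not the WCCUSD data.

import pandas as pd

# Hypothetical per-school records mirroring a few of the fields that
# are correlated in Tables 4 and 5 (values invented for illustration).
schools = pd.DataFrame({
    "oe":            [1.00, 0.62, 0.48, 0.75, 0.55],
    "pte":           [1.00, 0.70, 0.88, 0.80, 0.65],
    "se":            [1.00, 0.89, 0.55, 0.94, 0.85],
    "ranking_index": [3.0, 21.5, 30.0, 15.0, 25.0],
    "energy_star":   [98, 60, 72, 80, 65],
    "kbtu_per_sqft": [40, 75, 66, 55, 70],
})

# Pearson correlation matrix across all fields, analogous to Tables 4 and 5.
print(schools.corr(method="pearson").round(2))

# A single pairwise coefficient, e.g., OE vs. Ranking Index
# (expected to be strongly negative, as reported in section V.B).
print(schools["oe"].corr(schools["ranking_index"]).round(2))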

[Table 5: Correlations between various efficiency results (Energy STAR, Ranking Index, OE, PTE, SE) and school characteristics and absolute indicators: area (sq.ft.), students, hours/week, months/year, % AC area, electricity BTU contribution, electricity cost contribution, total BTU, and total energy cost]

Thus, it is interesting to see the correlations between the energy efficiency indicators (shown in the leftmost column of Table 5) and those data fields, including the data fields used in our analysis. If a strong relationship is found for a data field, it might indicate that the field should be accounted for in the benchmarking analysis, either in the DEA analysis itself or in pre-DEA data adjustments. For example, like the weather adjustments Lee and Lee used, regression analysis could be used to adjust the school consumption data for school operation hours/months and percent air conditioning. The correlation results shown in Table 5 are discussed below.

Strong correlations are found between the Energy STAR score and the data fields used as the input and outputs in the DEA models. There is also a moderate correlation with school months of operation. The other factors have weak correlations with the Energy STAR score. This suggests that the Energy STAR system does not separate scale factors from the comparisons, which results in bigger schools, such as high schools, being heavily penalized. For example, high schools by nature use more energy per student than elementary schools due to differences in school activities, such as night sporting events. Even though there are strong correlations among the Energy STAR scores, the Ranking Index numbers, and DEA's OE, the correlations between OE and the other indicators are weaker than those of the Energy STAR and Ranking Index systems.

Moderate negative correlations between SE and the absolute indicators also support the trend that bigger schools tend to be scale inefficient; in other words, efficiency could be improved if the schools changed in size. This finding fits the Energy STAR and Ranking Index results in that high schools and middle schools are generally rated less efficient than elementary schools. As expected, the correlations between PTE and the absolute indicators are weak, because the variable returns to scale assumption removes the unfair comparison between different school sizes.

Note that there are moderate correlations between PTE and the electricity contributions, both in energy units and in dollars. The same, but weaker, trend is also observed for OE and the Energy STAR system. This could imply that schools with a higher electricity share of the energy mix have more effective energy management, which contrasts with the initial expectation based on the fact that electricity is more expensive per unit of energy (BTU) than gas.

No significant correlation is found between the various school energy efficiency measurement systems and the school operational characteristics, such as hours per week, months per year, and percent air conditioned area. This could be because the data were highly repetitive and most of the data points for elementary schools were assumed rather than measured [6].

VI. FUTURE WORK

Weather has a very important effect on the energy consumption of buildings, and while this study attempted to control for this effect, further work could be done.
Specifically, this study attempted to control for these effects by selecting buildings from a single school district. There may still be micro-climates, such as hills and coastal areas, that could produce important differences. Weather-adjusted consumption data could be fed into the DEA models used in this paper, and the results could then be compared against the results in this paper to see how sensitive they are to micro-climate effects. In addition to the micro-climate adjustment, other school characteristics, such as year built, operation hours, and percentage of air conditioned area, could be used to adjust the energy consumption data.

Also, the effect of the contribution ratio of electricity and gas should be further investigated to reach meaningful conclusions. Since electricity and gas have different costs per unit of energy, the cost difference could be incorporated into the benchmarking practice. The DEA allocative efficiency technique could be explored to determine the efficiency of the choice of energy resource mix [10].
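As a sketch of what the weather adjustment proposed above might look like, the following fits a simple degree-day regression in the spirit of the EPA-recommended technique Lee and Lee used, then re-predicts consumption at a common reference climate before DEA. The monthly series, the single-variable model, and the choice of reference are assumptions for illustration, not part of either original paper.

import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical monthly series for one school: heating degree days (HDD)
# and metered energy use. Both series are invented for illustration.
hdd = np.array([420, 380, 300, 180, 90, 30, 10, 15,
                60, 150, 280, 400], dtype=float).reshape(-1, 1)
kbtu = np.array([310, 290, 245, 170, 120, 95, 88, 90,
                 110, 160, 230, 300], dtype=float)

# Fit kBTU ~ a + b * HDD: the intercept approximates the weather-independent
# base load; the slope captures the weather-sensitive component.
model = LinearRegression().fit(hdd, kbtu)

# Re-predict every month at a reference climate so schools in different
# micro-climates become comparable before DEA. A district-wide reference
# HDD would be used in practice; the school's own mean is used here only
# to keep the sketch self-contained.
hdd_ref = np.full((12, 1), hdd.mean())
adjusted_annual = model.predict(hdd_ref).sum()
print(f"raw: {kbtu.sum():.0f} kBTU/yr, adjusted: {adjusted_annual:.0f} kBTU/yr")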

VII. CONCLUSION

Comparing our results to the original studies gives a valuable understanding of using the DEA methodology to measure the effectiveness of building energy management. The results from this paper confirm the distribution and relationships presented in the original paper by Lee and Lee. The scale efficiency (SE) of most schools is high, meaning that the inefficiency, when it occurs, could be a result of poor energy management.

The Ranking Index benchmarking system suggested by Palomera-Arias et al. and the Energy STAR score created by the U.S. EPA could not effectively account for differences in the scale of schools, which directly affected the efficiency results of middle and high schools. With this effect, high schools are less likely to qualify for the Energy STAR label. In contrast, DEA pure technical efficiency (PTE), enabled by the concept of variable returns to scale (VRS), addresses the scaling issue and enables fair comparisons among comparable schools. However, the Ranking Index system and the Energy STAR score are still useful if users are aware of the scaling issue. In fact, the Energy STAR scoring system is very useful when a single school would like to know how effective its energy management is and the availability of other schools' data is limited. When school data is available, the Ranking Index system and DEA's overall efficiency (OE) could be used interchangeably, given that all the schools are comparable in size. OE, however, has an advantage over the Ranking Index system in that it gives a relative efficiency value that conveys the magnitude of inefficiency, which the Ranking Index cannot. For example, a Ranking Index of 10 does not imply that a school is twice as efficient as one with a Ranking Index of 20, whereas a DEA efficiency of 0.9 means the school is 10% inefficient compared to the efficient units. The choice of benchmarking technique therefore depends on the availability of data and the level of effort the users are willing to invest. If the effort is at the school district level or larger, the DEA framework used in this paper is a good candidate for benchmarking the effectiveness of school energy management.

REFERENCES

[1] A. Charnes, W.W. Cooper, and E. Rhodes, "Measuring the efficiency of decision making units," European Journal of Operational Research, vol. 2, 1978, pp. 429-444.
[2] E.L. Rhodes, "Data Envelopment Analysis and Approaches For Measuring the Efficiency of Decision-Making Units With an Application to Program Follow-Through in U.S. Education," Ph.D. dissertation, Carnegie Mellon University.
[3] W.W. Cooper, L.M. Seiford, and J. Zhu, Handbook on Data Envelopment Analysis, Dordrecht: Springer.
[4] W.-S. Lee and K.-P. Lee, "Benchmarking the performance of building energy management using data envelopment analysis," Applied Thermal Engineering, vol. 29, Nov. 2009. [Accessed June 14, 2011].
[5] W.-S. Lee, "Benchmarking the energy efficiency of government buildings with data envelopment analysis," Energy and Buildings, vol. 40, Jan. 2008. [Accessed June 14, 2011].
[6] R. Palomera-Arias and L.K. Norford, "School energy use benchmarking and monitoring in the West Contra Costa Unified School District," report prepared for the California Energy Commission PIER program, 2002.
[7] R.D. Banker, A. Charnes, and W.W. Cooper, "Some models for estimating technical and scale inefficiencies in data envelopment analysis," Management Science, vol. 30, 1984, pp. 1078-1092.
[8] U.S. Environmental Protection Agency, "Portfolio Manager Overview," ENERGY STAR. Available at: rtfoliomanager#rate [Accessed June 14, 2011].
[9] OpenSolver.org. [Accessed June 14, 2011].
[10] J. Sengupta, "Testing allocative efficiency by data envelopment analysis," Applied Economics Letters, vol. 5, 1998.