MSU Scorecard Analysis - Executive Summary #2


MSU Center for Construction Project Performance Assessment and Improvement (C2P2ai)
Funding provided by the MSU Office of the Vice President for Finance and Operations
January 17, 2011

Kim Trong Nguyen, Graduate Research Assistant
Tariq Abdelhamid, PhD, Associate Professor
Tim Mrozowski, A.I.A., Professor

Construction Management, School of Planning, Design and Construction, Michigan State University

Section I: Executive Summary

This report follows the previous report, dated March 2009, completed by the Center for Construction Project Performance Assessment and Improvement (C2P2ai), School of Planning, Design and Construction, Michigan State University. The first report analyzed Contractor Scorecard and MSU Scorecard data collected prior to March 2009. By April 2010, approximately 350 scorecards had been collected; details on each type and collection period are shown in Table I.1 below. This report analyzes the Contractor scorecards based on the 2010 data alone and on the combined 2009 and 2010 data. Because of the small number of completed MSU, A/E, and Customer scorecards, the 2009 and 2010 data for those types were combined to allow analysis.

Table I.1: Scorecards collected by type and year

  Year    Contractor   MSU   A/E   Customer
  2009    120          11    -     -
  2010    166          7     31    8
  Total   286          18    31    8

Findings Summary

Contractor Scorecard

Combining all the data for the two years, 2009 and 2010, Scheduling is still the most underperforming aspect and Project Management is the best performing. The overall average score is 2.71 with a standard deviation of ±0.60 (see Figure I.1; four is the best possible score).

[Figure I.1: Frequency distribution for Contractor Scorecard averages (2009 and 2010 scores). Mean 2.708, StDev 0.5983, N = 282.]

A summary of findings for the Contractor Scorecards follows:

- Contractors were rated slightly higher (better) on interruptions than on timely closure of issues, which means that MSU's assessment is that contractors try to avoid negative impacts on MSU operations rather than tending to a timely closure of rework items.
- Scheduling received the lowest scores and the highest standard deviation. The level of compliance with the MSU scheduling specifications is not rated on the scorecards, which prevents analyzing its impact; this could be considered for the 2011 scorecard.
- The 2009 and 2010 data show contractors performing slightly better on estimating and costing change orders than on processing the changes.
- MSU strongly believed contractors were committed and responsive. Coordination and the RFI/Change Order process were close to the overall average of 2.71.
- Warranty performance was rated higher than performance on punch list and documentation.

2009-2010 MSU Scorecard

Contractors believe MSU performed best on management aspects, with an average score of 3.67; the overall average score is 3.48 with a standard deviation of ±0.40 (see Figure I.2; four is the best possible score). The other factors on the owner scorecard also received high scores, considering 4 is the best score. For example, the lowest-performing factor was quality, which still received an average of 3.37. There were no outliers and no score under 2.0. Compared with the 2009 results, the average total score in 2010 improved slightly, from 3.33 to 3.48, while the standard deviation remained almost unchanged.

[Figure I.2: Frequency distribution for Owner Scorecard averages (2009 and 2010 scores). Mean 3.480, StDev 0.3976, N = 18.]

A summary of findings for the Owner Scorecards follows:

- Contractors gave bidding document quality a lower score compared to the other indicators in the quality category.

- Timely decisions received the lowest score compared to the other indicators in the scheduling category.
- With a few exceptions, the majority of contractors think that MSU does a good job of paying approved invoices within 30 days and is reasonable in dealing with changes. On the other hand, MSU got its worst score for timely processing of change orders.
- MSU was scored high in the management category, with the smallest average score being 3.39 (out of 4).

2009-2010 A/E Scorecard

Notwithstanding the limited pool of data for the A/E scorecard (MSU's assessment of A/E services), professional services received a very high score on every factor on the scorecard. The average score across all the factors is 3.71, with a standard deviation of ±0.35 (see Figure I.3; four is the best possible score). The management process used by the A/E received the highest scores, as did the communication process.

[Figure I.3: Frequency distribution for A/E Scorecard averages (2010 scores). Mean 3.706, StDev 0.3531, N = 31.]

2009-2010 Customer Scorecard

The end-user evaluation of the construction process considers data from the 2009 and 2010 Customer Scorecards. Given that only 8 scorecards were collected, no statistical analysis was conducted. Overall, the customers rated all the indicators highly. Specifically, among the 8 scorecards, each with 11 indicators (88 indicators in total), only one indicator was rated 1; 3 indicators were rated 2; 12 indicators were rated 3; and the remaining 72 indicators were assessed with the maximum score (4 out of 4). It is also interesting to note that 3 scorecards achieved the maximum score for every indicator.

Conclusion and Recommendations

Continuing the scorecard process will allow development of a database that can be used to identify the vendors with the highest performance. Meanwhile, the return rate of the Owner scorecards should be improved to match the number of Contractor scorecards. It is our understanding that CPA and EAS will reconsider the timing and frequency of completing the scorecards. Currently, the scorecards are completed at substantial completion, which misses project closeout performance. In addition, consideration should be given to whether quarterly evaluations would afford vendors, and MSU, the time to address issues reported in the scorecards. We also recommend that safety issues be considered and integrated into the scorecards. Refinement of the customer scorecard is under consideration within the POE project currently underway at CPA.

The analysis in this report reveals that, among the three analyzed scorecards, professional services performance was evaluated as excellent. MSU and Contractor performance, on the other hand, varies across factors and indicators: some achieved very high scores while others were evaluated at or below average. This points toward improvement opportunities in Contractor and MSU performance on construction projects, which was also pointed out in the first report.

Improvement Opportunities:

Contractors:
o Schedule reports
o Processing change orders
o Closeout documentation

MSU:
o Quality of bidding documents
o Timely decisions
o Timely processing of change orders

For next steps, the C2P2ai team will:

- Conduct interviews with contractors and designers to evaluate the process and whether a different form of feedback is warranted.
- Work with CPA and EAS:
  o To implement modified scorecards for 2011.
  o To develop a follow-up process for low-scoring indicators, specifically those on the Contractor scorecards. Some of this follow-up was conducted informally by Brad Bull; with Brad's reassignment to FRIB and Leisa William-Swedberg's promotion to MSU Construction Superintendent, she will now be handling these scorecards.
  o To consider the frequency and timing of requesting scorecards.
  o To find an appropriate way to aggregate the data so that vendors can be given personalized scoring reports.


Section II: Detailed Scorecard Analysis and Findings

The remainder of the report presents detailed results and is only meant to show the source of the findings in the Executive Summary. Content-wise, the 2010 scorecards are the same as those used in 2009. Each type of scorecard consists of a number of critical success factors that MSU wished to track. Each factor on a scorecard carries a relative weight reflecting its contribution to the overall score. Under each factor, multiple indicators further define the success factor, and these indicators were assigned a second set of weights. Tables II.1 to II.4 summarize all the indicators on each scorecard under this double-weight system. The performance indicators are measured by the reviewer either qualitatively or quantitatively and assigned a score between 1 and 4, with 1 representing poor performance and 4 representing excellent performance. For each performance indicator, the scorecard provides brief guidelines that direct the reviewer to any relevant data before making the decision.

Scorecard Analysis

In the first step of the analysis, the data are presented in a descriptive statistics table, which includes the mean, the standard deviation, the minimum and maximum, the median, and the first and third quartiles. These values provide a general picture of how the data are distributed. This is followed by box plots, produced as a graphical aggregation of all the results.

Following the statistical analysis, an in-depth analysis provides insight into the indicators within each factor. The average score of each indicator is compared and contrasted. This score is the mean score of the indicator for one type of scorecard over the period of analysis. For example, the average score of QL1, an indicator of the quality factor, on the 2010 contractor scorecard is based on 166 QL1 data points.

Finally, a histogram of total average scores is introduced to reflect the spread for each type of scorecard. The total average score incorporates both the factor and the indicator weights. For instance, the total average score for the MSU scorecard is calculated by multiplying each indicator's score by its weight, summing the results within each factor and dividing by that factor's total indicator weight, and then combining the factor scores according to the factor weights.
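To make the double-weight calculation concrete, the sketch below implements one plausible reading of the description above: each factor score is the indicator-weighted mean of its indicator scores, and the total is the factor-weighted mean of the factor scores. The indicator scores are hypothetical; only the weights (from the Quality and Schedule rows of Table II.1) come from the report.

    # Minimal sketch of the double-weight scoring, assuming both averaging
    # steps are ordinary weighted means (an interpretation, not the report's
    # published algorithm).

    def weighted_mean(scores, weights):
        # sum(score * weight) / sum(weights)
        return sum(s * w for s, w in zip(scores, weights)) / sum(weights)

    def total_average(factors):
        # factors: list of (factor_weight, indicator_scores, indicator_weights)
        factor_scores = [(fw, weighted_mean(s, w)) for fw, s, w in factors]
        return (sum(fw * fs for fw, fs in factor_scores)
                / sum(fw for fw, _ in factor_scores))

    # Hypothetical indicator scores (1-4); weights taken from Table II.1.
    quality  = (0.25, [3, 2, 3],    [6, 7, 11])     # QL1-QL3
    schedule = (0.20, [2, 3, 2, 3], [4, 4, 4, 11])  # SH1-SH4

    print(round(total_average([quality, schedule]), 2))  # 2.68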

Table II.1: Scorecard structure (Contractor Scorecard: MSU evaluates Contractor)

  Factor (weight)           Indicator (weight)   Short explanation
  Quality (25%)             QL1 (6)              Timely closure of rework items
                            QL2 (7)              Interruptions to operations
                            QL3 (11)             Qualitative rating
  Schedule (20%)            SH1 (4)              Performance against milestones
                            SH2 (4)              Schedule
                            SH3 (4)              Schedule reports
                            SH4 (11)             Qualitative rating
  Cost (20%)                CS1 (11)             Project changes
                            CS2 (8)              Processing of changes
  Management System (20%)   MG1 (4)              Coordination
                            MG2 (4)              RFI/Change order process
                            MG3 (4)              Responsiveness
                            MG4 (4)              Commitment to MSU
                            MG5 (4)              Qualitative rating
  Close Out (15%)           CL1 (4)              Punch list
                            CL2 (7)              Records drawing & documentation
                            CL3 (3)              Warranty performance

Table II.2: Scorecard structure (Owner Scorecard: Contractor evaluates MSU)

  Factor (weight)           Indicator (weight)   Short explanation
  Quality (25%)             QL1 (5)              Design management
                            QL2 (5)              Quality bidding documents
                            QL3 (5)              Understanding work
                            QL4 (2)              Responsiveness
                            QL5 (5)              Relative quality rating
  Schedule (15%)            SH1 (5)              Timely decisions
                            SH2 (3)              Work by owner coordination
                            SH3 (2)              Schedule reasonable
                            SH4 (5)              Relative schedule rating
  Cost (25%)                CS1 (8)              Prompt payment
                            CS2 (5)              Reasonableness with changes
                            CS3 (5)              Timely processing of changes
                            CS4 (6)              Relative cost rating
  Management System (25%)   MG1 (5)              Fair and reasonable
                            MG2 (5)              Contract terms & conditions
                            MG3 (5)              Internal communications
                            MG4 (5)              Fair procurement process
                            MG5 (5)              Relative management rating
  Close Out (10%)           CL1 (3)              Punch list
                            CL2 (4)              Contract close out
                            CL3 (3)              Relative closeout rating

Table II.3: Scorecard structure (A/E Scorecard: MSU evaluates A/E professional services)

  Factor (weight)       Indicator (weight)   Short explanation
  Bidding (25%)         BDD (1)              Biddable documents
  Coordination (15%)    CDN (1)              Information incorporate
  Communication (25%)   CMM (1)              Responsiveness of A/E firm
  Management (10%)      MGT (1)              Professional & fair manner
  Schedule (25%)        SCH (1)              Meet design milestones

Table II.4: Scorecard structure (Customer Scorecard)

  Factor (weight)       Indicator (weight)   Short explanation
  Estimating (10%)      ES1 (5)              Estimator contact
                        ES2 (5)              Estimator interaction
  Design (50%)          DS1 (10)             Project requirements
                        DS2 (10)             Design & budget review
                        DS3 (10)             Design expectations
                        DS4 (10)             Contribution during design
                        DS5 (10)             Design schedule expectations
  Construction (40%)    CN1 (10)             Quality of work
                        CN2 (10)             Professional contractor
                        CN3 (10)             Safe and clean work site
                        CN4 (10)             Contractor interaction
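As a quick consistency check on the structures above, the factor weights on each scorecard should sum to 100%. The short sketch below encodes just the factor weights from Tables II.1 to II.4; the dictionary layout and names are ours, not the report's.

    # Factor weights (percent) transcribed from Tables II.1 to II.4.
    factor_weights = {
        "Contractor": {"Quality": 25, "Schedule": 20, "Cost": 20,
                       "Management System": 20, "Close Out": 15},
        "Owner (MSU)": {"Quality": 25, "Schedule": 15, "Cost": 25,
                        "Management System": 25, "Close Out": 10},
        "A/E": {"Bidding": 25, "Coordination": 15, "Communication": 25,
                "Management": 10, "Schedule": 25},
        "Customer": {"Estimating": 10, "Design": 50, "Construction": 40},
    }

    for card, weights in factor_weights.items():
        total = sum(weights.values())
        assert total == 100, f"{card}: weights sum to {total}, not 100"
        print(f"{card}: factor weights sum to {total}%")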

2010 Contractor Scorecard

For the 2010 scorecards, descriptive statistics for Average Quality, Average Schedule, Average Cost, Average Management, Average Close-out, and the Total Average are listed in Table II.5. Figure II.1 shows a box plot of each of the factors that make up the contractor scorecard.

Table II.5: Descriptive Statistics (2010 Contractor Scorecard)

  Variable    Mean   StDev   Minimum   Q1    Median   Q3    Maximum
  Avg QL      2.9    0.7     1.0       2.4
  Avg SH      2.3    0.7     0.4       1.9
  Avg CS      2.9    0.7     0.0       2.6
  Avg MG      2.9    0.6     1.0       2.7
  Avg CL      2.7    0.6     0.0       2.3
  Total Avg   2.7    0.5     1.2       2.5

[Figure II.1: Box plot of the average scores of the Contractor Scorecard factors (Avg QL, SH, CS, MG, CL, and Total Avg); N = 166.]

The line shown in the box plot connects the mean scores of the categories. On average, the contractors received the highest score on the Management factor and the lowest on the Scheduling factor. The average total score was 2.7 (out of four) with a standard deviation of ±0.5. It is worth noting that the data cluster under CS and MG: the first quartile (Q1) and third quartile (Q3) for these factors are close together, meaning most of the data fall within a narrow Q1-Q3 range. In other words, with a few exceptions, most contractors were evaluated near the median score on the Cost and Management factors.
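Descriptive statistics like those in Table II.5 are straightforward to reproduce from per-scorecard factor averages. Below is a minimal sketch using pandas; the five-row sample is hypothetical, standing in for the 166 scorecards analyzed in the report.

    import pandas as pd

    # Hypothetical per-scorecard factor averages (the raw scorecard data are
    # not published in this report); column names follow Table II.5.
    data = pd.DataFrame({
        "Avg QL": [2.4, 3.1, 2.9, 3.3, 2.7],
        "Avg SH": [1.9, 2.5, 2.3, 2.8, 2.0],
        "Avg CS": [2.6, 3.0, 2.9, 3.2, 2.8],
    })

    # describe() reports count, mean, std, min, 25% (Q1), 50% (median),
    # 75% (Q3), and max, i.e. the same columns as the report's tables.
    print(data.describe().round(2))

    # A per-factor box plot analogous to Figure II.1 (requires matplotlib):
    # data.boxplot()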

The in-depth analysis of each of the performance categories in the contractor scorecard shows the following trends:

- Quality:

  Indicator             QL1    QL2    QL3
  Avg. each indicator   2.82   2.86   2.91

On average, contractors received a slightly higher score on Interruptions to Operations than on Timely Closure of Rework Items, which suggests the contractors did better at avoiding negative impacts on MSU operations than at closing rework items in a timely manner.

- Schedule:

  Indicator             SH1    SH2    SH3    SH4
  Avg. each indicator   2.77   2.61   2.57   2.62

This factor received the worst performance score on average. In particular, Schedule Reports received a lower average score, while Performance against Milestones was slightly better. As in the 2009 results, the scores in the Schedule category remain the lowest.

- Cost:

  Indicator             CS1    CS2
  Avg. each indicator   2.91   2.79

The contractors performed slightly better on estimating and costing change orders than on processing the changes. This bodes well for the contractors.

- Management:

  Indicator             MG1    MG2    MG3    MG4    MG5
  Avg. each indicator   2.82   2.83   -      3.13   2.78

Collectively, this is the factor with the best contractor performance. Contractors were assigned the highest score on the Commitment to MSU indicator, specifically in apprising MSU of impending issues affecting budget and schedule. The other indicators for the Management factor also received high scores. Overall, MSU seemed most satisfied with the management factor.

- Closeout:

  Indicator             CL1    CL2    CL3
  Avg. each indicator   2.76   2.59   2.92

The closeout scores indicate that contractors focus more on the punch list than on record drawings and documentation. However, the contractors were still assigned a good score on warranty performance.

Taking the average of all five factors on the contractor scorecard and graphing the results produces the histogram in Figure II.2. The graph illustrates that contractor scores vary considerably, with the distribution skewed to the right and a single peak. While we report the average, it is more accurate to consider the median values because the distribution does not follow a normal distribution.

[Figure II.2: Frequency distribution for the average of all five factor scores on the contractor scorecards (2010 scores). Mean 2.736, StDev 0.5469, N = 166.]
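The preference for the median over the mean when the distribution departs from normal can be checked directly by comparing the two on skewed data. A minimal sketch with hypothetical scores:

    import statistics

    # Hypothetical total-average scores with a low tail; not the report's data.
    scores = [1.4, 1.8, 2.0, 2.6, 2.8, 2.9, 3.0, 3.0, 3.1, 3.2]

    mean = statistics.mean(scores)      # 2.58
    median = statistics.median(scores)  # 2.85
    print(f"mean = {mean:.2f}, median = {median:.2f}")

    # A noticeable gap between mean and median signals skew; for such data the
    # median is the more robust summary of a typical score.
    if abs(mean - median) > 0.1:
        print("skewed: report the median alongside the mean")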

2009-2010 Contractor Scorecard

Combining the data from the 2009 and 2010 Contractor scorecards yields the results shown in Table II.6; Figure II.3 shows a box plot similar to the one in Figure II.1. With all the data for the two years combined, the shape of the line connecting the mean scores remains practically the same as when 2009 and 2010 are drawn separately. Again, the combined 2009 and 2010 data show that Scheduling is still the most underperforming aspect and Project Management the best performing. The overall average score is 2.71 with a standard deviation of ±0.598.

Table II.6: Descriptive Statistics (Contractor Scorecard, 2009/2010)

  Variable    Mean   StDev   Minimum   Q1    Median   Q3    Maximum
  Avg QL      2.8    0.7     1.0       2.3            3.1
  Avg SH      2.6    0.8     0.5       2.0            2.8
  Avg CS      2.7    0.7     0.0       2.0
  Avg MG      2.8    0.6     1.0       2.4            3.2
  Avg CL      2.6    0.6     0.0       2.0            2.8
  Total Avg   2.7    0.6     1.0       2.2    2.8     3.1   3.9

[Figure II.3: Box plot of the average scores of the 2009/2010 Contractor Scorecard factors; N = 286.]

The in-depth analysis below compares the indicators under each aspect based on the data from the two years.

- Quality:

  Indicator             QL1    QL2    QL3
  Avg. each indicator   2.71   2.77   2.77

The aggregate data for 2009 and 2010 indicate that contractors were rated slightly higher (better) on interruptions than on timely closure of issues, which means that MSU's assessment is that contractors try to avoid negative impacts on MSU operations rather than tending to a timely closure of rework items. Overall, quality as a factor received an average score of 2.76 out of 4 (a short verification sketch follows this list).

- Scheduling:

  Indicator             SH1    SH2    SH3    SH4
  Avg. each indicator   2.65   2.51   2.49   2.54

Once again, this factor received the lowest score. The highest indicator is Performance against Milestones and the lowest is Schedule Reports. It is important to note that this factor had the highest standard deviation, at ±0.782.

- Cost:

  Indicator             CS1    CS2
  Avg. each indicator   2.80   2.68

The 2009 and 2010 data show contractors performing slightly better on estimating and costing change orders than on processing the changes.

- Management:

  Indicator             MG1    MG2    MG3    MG4    MG5
  Avg. each indicator   2.70   2.73   2.92   -      2.70

Over the 2009 and 2010 period, Management was still the best factor, with the highest average scores. Among the indicators, MSU strongly believed contractors were committed to MSU. Responsiveness also received a rather good score, nearly as high as commitment. Coordination and the RFI/Change Order Process are fairly good indicators.

- Project Closeout:

  Indicator             CL1    CL2    CL3
  Avg. each indicator   2.62   2.53   2.81

The contractors were rated better on Warranty Performance than on Punch List and Documentation.

Figure II.4 shows a histogram of the average of all five factor scores on the contractor scorecards. The histogram illustrates that contractor scores vary considerably, with a bimodal distribution characterized by two peaks: one around a score of 2.0 and a second at a higher score. This shape implies that the total average score dropped in 2010.

[Figure II.4: Frequency distribution for the average of all five factor scores on the Contractor Scorecards (2009 and 2010 scores). Mean 2.708, StDev 0.5983, N = 282.]
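As a sanity check, the 2.76 quality-factor average quoted in the quality item above can be recovered from the indicator averages and the indicator weights in Table II.1, assuming the indicator-weighted mean described in the introduction to this section:

    # Recomputing the 2009-2010 quality-factor average from its indicators.
    scores  = [2.71, 2.77, 2.77]  # QL1-QL3 averages (2009-2010 data)
    weights = [6, 7, 11]          # indicator weights from Table II.1

    avg = sum(s * w for s, w in zip(scores, weights)) / sum(weights)
    print(f"quality factor average = {avg:.3f}")  # 2.755, the reported 2.76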

2009-2010 MSU Scorecard

The contractors' evaluation of MSU, considering data from the 2009 and 2010 Owner Scorecards, is summarized in Table II.7; Figure II.5 shows a box plot similar to the one in Figure II.1.

Table II.7: Descriptive Statistics (MSU Owner Scorecards, 2009/2010)

  Variable    Mean   StDev   Minimum   Q1    Median   Q3    Maximum
  Avg QL      3.4    0.6     2.1       2.8            3.6   3.8
  Avg SH      3.4    0.6     2.4       2.8
  Avg CS      3.4    0.5     2.2             3.4      3.8
  Avg MG      3.7    0.3               3.4   3.6
  Avg CL      3.6    0.4               3.3   3.6
  Total Avg   3.5    0.4     2.6       3.2                  3.9

The average total score was 3.48 with a standard deviation of ±0.397. As shown in Figure II.5, the contractors believe MSU performed best on the management aspect, with an average score of 3.67. The other factors also received high scores. The lowest-performing factor was quality, which still received an average of 3.37. It is worth noting that there were no outliers and no score under 2.0. Compared with the 2009 results, the average total score in 2010 improved slightly, from 3.33 to 3.48, while the standard deviation remained almost unchanged.

[Figure II.5: Box plot of the average scores of the MSU Scorecard factors; N = 18.]
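The "no outliers" observation corresponds to the standard box-plot rule: a value is flagged as an outlier if it lies more than 1.5 times the interquartile range (IQR) below Q1 or above Q3. A minimal sketch with hypothetical scores:

    import statistics

    # Hypothetical total-average scores; not the report's raw data.
    scores = [2.6, 2.9, 3.1, 3.3, 3.4, 3.5, 3.6, 3.7, 3.8, 3.9]

    q1, _, q3 = statistics.quantiles(scores, n=4)  # quartiles
    iqr = q3 - q1
    lo, hi = q1 - 1.5 * iqr, q3 + 1.5 * iqr
    outliers = [s for s in scores if s < lo or s > hi]
    print(f"Q1 = {q1:.2f}, Q3 = {q3:.2f}, fences = ({lo:.2f}, {hi:.2f})")
    print("outliers:", outliers or "none")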

The following in-depth analysis examines each indicator within each aspect.

- Quality:

  Indicator             QL1    QL2    QL3    QL4    QL5
  Avg. each indicator   -      2.89   3.78   3.44   3.72

On average, MSU was rated highest on understanding the work, which means contractors believe MSU was very knowledgeable about the work being performed. On the other hand, MSU received the lowest score on the quality of bidding documents indicator, which may indicate ambiguities or questions that contractors had about the documents.

- Schedule:

  Indicator             SH1    SH2    SH3    SH4
  Avg. each indicator   3.11   3.44   3.17   3.67

MSU received the same average score on the scheduling factor as on quality. Both were the lowest scores on the MSU scorecards, but again, a 3 out of 4 is really not low in absolute terms. Contractors rated coordination of work done by the owner as one of the highest indicators in the scheduling factor. Timely decisions and the reasonableness of the schedule were rated almost the same.

- Cost:

  Indicator             CS1    CS2    CS3    CS4
  Avg. each indicator   -      3.61   2.78   3.67

Similar to last year's results, MSU's ratings show high variation in the cost category. With a few exceptions, the majority of contractors think that MSU does a good job of paying approved invoices within 30 days and is reasonable in dealing with changes. On the other hand, MSU got its worst score for timely processing of change orders.

- Management:

  Indicator             MG1    MG2    MG3    MG4    MG5
  Avg. each indicator   3.67   3.61   3.39   3.83   3.83

This factor received the highest score, similar to last year's. On average, all the indicators in this factor were rated very high. The lowest score, 3.39 (out of 4), was for internal communications. Almost all contractors rated MSU the maximum score on the procurement process.

- Close Out:

  Indicator             CL1    CL2    CL3
  Avg. each indicator   3.61   -      3.72

MSU was scored nearly the same as in the 2009 scorecards with respect to handling punch list items and contract closeout.

Figure II.6 is a histogram of the average of all five factor scores on the Owner Scorecard. Overall, MSU was rated high on all scorecard factors. The distribution is skewed to the left, with the center around 3.5.

[Figure II.6: Frequency distribution for the average of all five factor scores on the Owner Scorecards. Mean 3.480, StDev 0.3976, N = 18.]

2009-2010 A/E Scorecard

MSU's evaluation of A/E professional services, considering data from the 2009 and 2010 A/E Scorecards, is summarized in Table II.8; Figure II.7 shows a box plot similar to the one in Figure II.1.

Table II.8: Descriptive Statistics (A/E Scorecards, 2009/2010)

  Variable      Mean   StDev   Min   Q1    Median   Q3    Max
  BDD Score            0.6     2.0
  CDN Score            0.5     2.0
  CMM Score     3.8    0.4     2.5   4     4        4     4
  MGT Score     3.9    0.3           4     4        4     4
  SCH Score     3.8    0.4
  Total Score   3.7    0.4     2.9   3.6   3.9      3.9

[Figure II.7: Box plot of the scores of the A/E Scorecard factors (BDD, CDN, CMM, MGT, SCH, and Total Score); N = 31.]

As shown in the descriptive statistics table and the box plot, professional services received a very high score on every factor. The average score across all the factors is 3.71 out of 4. The management process used by the A/E received the highest scores. It is interesting to note that, statistically, the communication and management factors almost received the absolute maximum score (4 out of 4), with only a few outliers. This can be seen in the clustering of CMM and MGT in the box plot (Figure II.7) and in their first quartiles, medians, and

third quartiles in the descriptive statistics (Table II.8), which were all equal to 4. Moreover, no factor received a below-average score on any of the A/E scorecards analyzed.

Table II.9: Average score of factors on the A/E scorecards

  Avg. score    BDD   CDN   CMM    MGT    SCH    Total (weighted)
                -     -     3.84   3.87   3.79   3.71

Figure II.8 is a histogram of the average of all five factor scores on the A/E Scorecard. Overall, MSU rated professional services as excellent (an average of 3.71 out of 4). The distribution is approximately normal, with the center around 3.7.

[Figure II.8: Frequency distribution for the average of all five factor scores on the A/E Scorecards. Mean 3.706, StDev 0.3531, N = 31.]

2009-2010 Customer Scorecard

The end-user evaluation of the construction process considers data from the 2009 and 2010 Customer Scorecards. Given that only 8 scorecards were collected, no statistical analysis was conducted. Overall, the customers rated all the indicators highly. Specifically, among the 8 scorecards, each with 11 indicators (88 indicators in total), only one indicator was rated 1; 3 indicators were rated 2; 12 indicators were rated 3; and the remaining 72 indicators were assessed with the maximum score (4 out of 4). It is also interesting to note that 3 scorecards achieved the maximum score for every indicator.
