IT Service and Support Benchmark


1 IT Service and Support Benchmark For Higher Education Information Briefing

2 Your Speaker: Jeff Rumburg, Co-Founder and Managing Partner, MetricNet, LLC
Winner of the 2014 Ron Muns Lifetime Achievement Award
Former CEO, The Verity Group
Former Vice President, Gartner
Founder of the Service Desk Benchmarking Consortium
Author of A Hands-On Guide to Competitive Benchmarking
Harvard MBA, Stanford MS

3 IT Service and Support Benchmark For Higher Education Information Briefing

4 About MetricNet: Your Benchmarking Partner

5 Benchmarking is MetricNet's Core Business
Information Technology: Service Desk, Desktop Support, Field Services, Technical Support
Call Centers: Customer Service, Telemarketing/Telesales, Collections
Telecom: Cost Benchmarking
Satisfaction: Customer Satisfaction, Employee Satisfaction

6 27 Years of Service and Support Benchmarking Data
More than 3,700 Service & Support Benchmarks
Global Database
30 Key Performance Indicators
Nearly 80 Industry Best Practices

7 More than Half the FORTUNE 500 Use MetricNet Benchmarks
MetricNet conducts benchmarking for IT service and support organizations worldwide, and across virtually every industry sector.

8 IT Service and Support Benchmark for Higher Education Project Overview

9 Why an IT Service and Support Benchmark for Higher Education?
Industry demand is huge
Unique characteristics of IT service and support in this vertical market:
Student support model
Wide spectrum of technologies to support
Customers are broadly distributed geographically
Service levels are critical

10 Project Timeline (April through August)
Sign-up and Registration Phase
Data Collection (documents due June 30, 2016)
Benchmarking Report Produced by MetricNet
Presentation of Results

11 The Benchmarking Process
Module 1: Baselining/Data Collection
Module 2: Benchmarking and Gap Analysis
Module 3: Balanced Scorecard
Module 4: Best Practices Process Self-Assessment
Module 5: Strategies for Improved Performance
Module 6: Report Development and Presentation of Results

12 Module 1: Baselining/Data Collection
Core Topics: Project Kickoff; Data Collection

13 Individualized Project Kickoff Meeting (XYZ University)
Project Kickoff Meeting Key Objectives:
Introduce the MetricNet and XYZ University project teams
Discuss the project schedule
Distribute the data collection document
Answer questions about the project

14 Data Collection

15 Module 2: Benchmarking and Gap Analysis
Core Topics: Benchmarking Comparison; Gap Analysis

16 The Benchmarking Methodology
XYZ University Service and Support Performance → COMPARE with the Performance of the Benchmarking Peer Group → Determine How Best in Class Achieve Superiority → Adopt Selected Practices of Best in Class → Build a Sustainable Competitive Advantage (the ultimate objective of benchmarking)
Read MetricNet's whitepaper on Service Desk Benchmarking. Go to MetricNet's website to receive your copy!

17 Summary of Included Service Desk Benchmarking Metrics
Cost: Cost per Inbound Contact; Cost per Minute of Inbound Handle Time; Price per Inbound Contact; Price per Minute of Handle Time
Quality: Customer Satisfaction; First Contact Resolution Rate; Call Quality
Productivity: Inbound Contacts per Technician per Month; Technician Utilization; Technicians as a % of Total Headcount; First Level Resolution Rate
Service Level: Average Speed of Answer (ASA); % of Calls Answered in 30 Seconds; Call Abandonment Rate
Technician: Annual Technician Turnover; Daily Technician Absenteeism; Schedule Adherence; New Technician Training Hours; Annual Technician Training Hours; Technician Tenure; Technician Job Satisfaction
Call Handling: Inbound Contact Handle Time; User Self-Service Completion Rate

18 Summary of Included Desktop Support Benchmarking Metrics
Cost: Cost per Ticket; Cost per Incident; Cost per Service Request
Quality: Customer Satisfaction; First Contact Resolution Rate (Incidents); % Resolved Level 1 Capable
Ticket Handling: Average Incident Work Time (min); Average Service Request Work Time (min); Average Travel Time per Ticket (min)
Service Level: Average Incident Response Time (minutes); % of Incidents Resolved in 1 Business Day; Mean Time to Resolve Incidents (business hours); Mean Time to Complete Service Requests (business days)
Technician: Technician Satisfaction; New Technician Training Hours; Annual Technician Training Hours; Annual Technician Turnover; Technician Absenteeism; Technician Tenure (months)
Productivity: Technician Utilization; Tickets per Technician-Month; Incidents per Technician-Month; Service Requests per Technician-Month; Ratio of Technicians to Total Headcount
Workload: Tickets per Seat per Month; Incidents per Seat per Month; Service Requests per Seat per Month; Incidents as a % of Total Ticket Volume

19 Sample Data Only: IT Service and Support Benchmark for Higher Education
Benchmarking KPI Performance Summary

Key Performance Indicator (KPI) | XYZ University | Average | Min | Median | Max
Cost per Inbound Contact | $9.57 | $13.79 | $9.57 | $13.33 | $20.29
Cost per Minute of Inbound Handle Time | $0.33 | $0.44 | $0.32 | $0.43 | $0.62
First Level Resolution Rate | 86.7% | 83.0% | 67.6% | 82.4% | 97.5%
Inbound Contacts per Technician per Month | | | | |
Technician Utilization | 93.0% | 54.3% | 40.1% | 53.0% | 93.0%
Technicians as a % of Total Headcount | 80.4% | 85.3% | 75.4% | 85.7% | 94.5%
Average Speed of Answer (ASA) (seconds) | | | | |
% of Calls Answered in 30 Seconds | 13.5% | 42.9% | 13.5% | 40.8% | 87.9%
Call Abandonment Rate | 23.5% | 8.5% | 2.6% | 7.8% | 23.5%
Call Quality | 90.5% | 85.2% | 73.6% | 85.7% | 96.3%
First Contact Resolution Rate | 83.9% | 73.5% | 62.8% | 72.2% | 92.3%
Customer Satisfaction | 84.5% | 78.4% | 65.2% | 77.6% | 97.9%
Annual Technician Turnover | 28.4% | 45.0% | 12.3% | 44.7% | 86.3%
Daily Technician Absenteeism | 5.0% | 10.7% | 2.2% | 9.6% | 22.3%
Schedule Adherence | N/A | 82.6% | 71.7% | 82.0% | 92.6%
New Technician Training Hours | | | | |
Annual Technician Training Hours | | | | |
Technician Tenure (months) | | | | |
Technician Job Satisfaction | 70.0% | 70.6% | 52.5% | 70.6% | 92.1%
Inbound Contact Handle Time (minutes) | | | | |
User Self-Serve Completion Rate | 2.7% | 8.4% | 0.0% | 6.4% | 22.2%

20 Sample Data Only: IT Service and Support Benchmark for Higher Education
Quartile Rankings: Cost and Productivity Metrics

Cost Metric | Quartile 1 (Top) | Quartile 2 | Quartile 3 | Quartile 4 (Bottom) | Your Service Desk
Cost per Inbound Contact | $9.57–$11.69 | $11.69–$13.33 | $13.33–$15.68 | $15.68–$20.29 | $9.57
Cost per Minute of Inbound Handle Time | $0.32–$0.38 | $0.38–$0.43 | $0.43– | | 
First Level Resolution Rate | –88.7% | 88.7%–82.4% | 82.4%–79.2% | 79.2%–67.6% | 

Productivity Metric | Quartile 1 (Top) | Quartile 2 | Quartile 3 | Quartile 4 (Bottom) | Your Service Desk
Inbound Contacts per Technician per Month | | | | |
Technician Utilization | 93.0%–56.5% | 56.5%–53.0% | 53.0%–49.6% | 49.6%–40.1% | 93.0%
Technicians as a % of Total Headcount | 94.5%–88.6% | 88.6%–85.7% | 85.7%–80.3% | 80.3%–75.4% | 80.4%
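A quartile ranking like the one above comes from the cut points of the peer-group distribution. The following is a minimal sketch, not MetricNet's method, using hypothetical peer values for Cost per Inbound Contact (for cost metrics, lower is better, so the top quartile holds the cheapest service desks):

```python
from statistics import quantiles

# Hypothetical peer-group values for Cost per Inbound Contact (illustrative only).
peer_costs = [9.57, 10.80, 11.69, 12.40, 13.33, 14.10, 15.68, 17.25, 20.29]

# quantiles(..., n=4) returns the three cut points between the four quartiles.
q1, q2, q3 = quantiles(peer_costs, n=4)

def cost_quartile(value: float) -> int:
    """Rank 1 (top) through 4 (bottom); for cost metrics, lower is better."""
    if value <= q1:
        return 1
    if value <= q2:
        return 2
    if value <= q3:
        return 3
    return 4
```

For a "higher is better" metric such as First Level Resolution Rate, the comparisons flip: the top quartile holds the highest values.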

21 Sample Data Only: IT Service and Support Benchmark for Higher Education
Quality Metrics: Customer Satisfaction (chart of customer satisfaction by service desk)
Key Statistics, Customer Satisfaction: High 97.9%; Average 78.4%; Median 77.6%; Low 65.2%; XYZ University Service Desk 84.5%

22 Sample Data Only: IT Service and Support Benchmark for Higher Education
Cost vs. Quality for the XYZ University Service Desk: Quality (Effectiveness) plotted against Cost (Efficiency), with the XYZ University Service Desk shown against the global database
Top quartile (higher quality, lower cost): Efficient and Effective
Middle quartiles (higher quality, higher cost): Effective but not Efficient
Middle quartiles (lower quality, lower cost): Efficient but not Effective
Lower quartile (lower quality, higher cost): neither Efficient nor Effective

23 Module 3: Balanced Scorecard
Core Topics: Metrics Selection; Metric Weightings; Scorecard Construction

24 Sample Data Only: IT Service and Support Benchmark for Higher Education
The Service Desk Scorecard

Performance Metric | Metric Weighting | Worst Case | Best Case | XYZ University | Metric Score | Balanced Score
Cost per Inbound Contact | 25.0% | $20.29 | $9.57 | $9.57 | 100.0% | 25.0%
Customer Satisfaction | 25.0% | 65.2% | 97.9% | 84.5% | 59.0% | 14.8%
Technician Utilization | 15.0% | 40.1% | 93.0% | 93.0% | 100.0% | 15.0%
First Contact Resolution Rate | 15.0% | 62.8% | 92.3% | 83.9% | 71.6% | 10.7%
Technician Job Satisfaction | 10.0% | 52.5% | 92.1% | 70.0% | 44.2% | 4.4%
% of Calls Answered in 30 Seconds | 10.0% | 13.5% | 87.9% | 13.5% | 0.0% | 0.0%
Total | 100.0% | N/A | N/A | N/A | N/A | 68.7%

Step 1: Six critical performance metrics have been selected for the scorecard
Step 2: Each metric has been weighted according to its relative importance
Step 3: For each performance metric, the highest and lowest performance levels in the benchmark are recorded
Step 4: Your actual performance for each metric is recorded in this column
Step 5: Your score for each metric is then calculated: (worst case − actual performance) / (worst case − best case) × 100
Step 6: Your balanced score for each metric is calculated: metric score × weighting
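Steps 5 and 6 can be sketched in a few lines of Python. The inputs below are the sample scorecard figures from this page; because the printed sample values are rounded, the computed total may differ slightly from the slide's printed total:

```python
def metric_score(actual: float, worst: float, best: float) -> float:
    """Step 5: (worst case - actual performance) / (worst case - best case) * 100.
    The same formula works whether 'better' means higher (e.g. customer
    satisfaction) or lower (e.g. cost per contact)."""
    return (worst - actual) / (worst - best) * 100

# metric: (weighting, worst case, best case, XYZ University actual)
scorecard = {
    "Cost per Inbound Contact":          (0.25, 20.29, 9.57, 9.57),
    "Customer Satisfaction":             (0.25, 65.2, 97.9, 84.5),
    "Technician Utilization":            (0.15, 40.1, 93.0, 93.0),
    "First Contact Resolution Rate":     (0.15, 62.8, 92.3, 83.9),
    "Technician Job Satisfaction":       (0.10, 52.5, 92.1, 70.0),
    "% of Calls Answered in 30 Seconds": (0.10, 13.5, 87.9, 13.5),
}

# Step 6: the balanced score is the weighted sum of the metric scores.
balanced = sum(w * metric_score(actual, worst, best)
               for w, worst, best, actual in scorecard.values())
```

Note how the normalization pins the worst observed performance to 0 and the best to 100, so every metric contributes on the same scale before the weights are applied.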

25 Sample Data Only: IT Service and Support Benchmark for Higher Education
Balanced Scorecard Summary* (chart of balanced scores by service desk)
Key Statistics, Service Desk Scores: High 70.5%; Average 42.4%; Median 42.6%; Low 14.1%; XYZ University Service Desk 68.7%
*The scores shown in the chart are based upon the performance metrics, weightings, and data ranges shown on the previous page.

26 Sample Data Only: IT Service and Support Benchmark for Higher Education
Overall Service Desk Scorecard Trend (chart): monthly Service Desk Balanced Score and 12-month average, January through December

27 Module 4: Best Practices Process Self-Assessment
Core Components: XYZ University Self-Assessment; MetricNet Maturity Ranking; Process Assessment Rollup

28 Six-Part Model for Service Desk Best Practices (with Customer Enthusiasm at the center of the model)
Strategy: Defining Your Charter and Mission
Human Resources: Proactive, Life-cycle Management of Personnel
Process: Expeditious Delivery of Customer Service
Technology: Leveraging People and Processes
Performance Measurement: A Holistic Approach to Performance Measurement
Stakeholder Communication: Proactively Managing Stakeholder Expectations

29 Best Practices Evaluation Criteria
Ranking 1: No knowledge of the best practice.
Ranking 2: Aware of the best practice, but not applying it.
Ranking 3: Aware of the best practice, and applying it at a rudimentary level.
Ranking 4: Best practice is being effectively applied.
Ranking 5: Best practice is being applied in a world-class fashion.

30 Sample Data Only: IT Service and Support Benchmark for Higher Education
XYZ University Self-Assessment: Strategy Best Practices (each scored for XYZ University against the peer group average)
1. The Service Desk has a well-defined mission, vision, and strategy. The vision and strategy are well-documented, and communicated to key stakeholders in the organization.
2. The Service Desk has a published Service Catalog, including a Supported Products List, that is distributed and communicated to key stakeholders including end users. The Service Catalog is available on-line.
3. The Service Desk has an action plan for continuous improvement. The plan is documented and distributed to key stakeholders in the organization, and specific individuals are held accountable for implementing the action plan.
4. The Service Desk is well integrated into the information technology function. The Service Desk acts as the "voice of the user" in IT, and is involved in major IT decisions and deliberations that affect end users. The Service Desk is alerted ahead of time so that it can prepare for major rollouts or other changes in the IT environment.
5. The Service Desk has SLAs that define the level of service to be delivered to users. The SLAs are documented, published, and communicated to key stakeholders in the organization.
6. The Service Desk has OLAs (Operating Level Agreements) with other support groups in the organization (e.g., level 2 support, desktop support, field support, etc.). The OLAs clearly define the roles and responsibilities of each support group, and the different support groups abide by the terms of the OLAs.
7. The Service Desk actively seeks to improve Level 1 Resolution Rates, First Contact Resolution Rates, Level 0 (User Self-Help) Resolution Rates, and Level -1 (Problem Prevention) Resolution Rates by implementing processes, technologies, and training that facilitate these objectives.
Summary Statistics: Total Score; Average Score
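The Summary Statistics row simply rolls the 1-to-5 maturity rankings up into a total and an average. A minimal sketch, using hypothetical rankings for the seven Strategy practices above (not actual assessment data):

```python
# Hypothetical 1-5 maturity rankings for the seven Strategy best practices.
rankings = [3, 4, 2, 3, 4, 2, 3]
assert all(1 <= r <= 5 for r in rankings), "rankings use the 1-to-5 scale"

total_score = sum(rankings)                  # Summary Statistics: Total Score
average_score = total_score / len(rankings)  # Summary Statistics: Average Score
```

The same rollup, repeated per model component, produces the component-level averages compared against the peer group on the next page.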

31 Sample Data Only: IT Service and Support Benchmark for Higher Education
Best Practices Process Assessment Summary (chart of average score by component: XYZ University vs. Peer Group)

32 IT Service and Support Benchmark for Higher Education
Overall Process Assessment Scores (chart of total process assessment scores)
Key Statistics, Total Process Assessment Score: High; Average; Median; Low 76.8; XYZ University; World-Class

33 Sample Data Only: IT Service and Support Benchmark for Higher Education
Process Maturity vs. Scorecard Performance (chart plotting Balanced Score against Process Assessment Score for XYZ University and the global database, with World-Class and Average Process Assessment Score reference lines)
XYZ University Balanced Score: 68.7%
Average Balanced Score: 51.0%

34 Module 5: Strategies for Improved Performance
Core Components: Conclusions and Recommendations; Roadmap for World-Class Performance

35 Conclusions and Recommendations
Conclusions and recommendations fall into six categories:
1. Strategy
2. Human Resource Management
3. Call Handling Processes and Procedures
4. Technology
5. Performance Measurement and Management
6. Stakeholder Communication

36 Performance Targets Will Be Established for Each Participant

Performance Metric | XYZ University | Target Performance
First Contact Resolution Rate | 83.9% | 85.0%
First Level Resolution Rate | 86.7% | 88.0%
Annual Technician Turnover | 28.4% | 25.0%
New Technician Training Hours | | 
Service Desk Balanced Score | 68.7% | 70.5%

Achieving the performance targets recommended above will increase the XYZ University Balanced Score from 68.7% to 70.5%, and put XYZ University in the top position on the Balanced Scorecard.

37 Module 6: Report Development and Presentation of Results
Core Topics: Conclusions and Recommendations; Report Development; Presentation of Benchmarking Results

38 Create Custom Benchmarking Reports

39 Individualized Presentation of Results
The results of the benchmark will be presented to XYZ University in a live webcast.

40 IT Service and Support Benchmark for Higher Education: Summary of Deliverables
Deliverables include:
Project Participation Kit: Project Schedule; Data Collection Questionnaires; Project Kickoff Meeting
Comprehensive Assessment and Benchmarking Report: Project Overview and Objectives; Industry Background; Benchmarking Performance Summary; Balanced Scorecard; Best Practices Process Assessment; Conclusions and Recommendations; Detailed Benchmarking Data
Live Webcast Presentation of Results via GoTo Meeting

41 Project Fees
$5,000 for one Service Desk or Desktop Support benchmark
$7,500 for two Service Desk or Desktop Support benchmarks
A fee schedule applies for three or more benchmarks
After May 27th, the participation fees will double to $10,000 for one benchmark and $15,000 for two benchmarks.

42 Next Steps

43 Next Steps
Visit our web page for the Higher Education Benchmark
Review the FAQs and Sample Deliverables
Let us know if you have questions: info@metricnet.com
Sign up for the Benchmark!

44 Project Timeline (April through August)
Sign-up and Registration Phase
Data Collection (documents due June 30, 2016)
Benchmarking Report Produced by MetricNet
Presentation of Results

45 Question and Answer

46 You Can Reach MetricNet

47 Thank You!