Information Technology Analysis Hydro-Quebec Management Presentation. October 30th 2004


Table of Contents: Analysis Methodology; Objectives and Scope of Analysis; Executive Summary. Page 1

Analysis Methodology Page 2

Benchmark Methodology Overview: Kickoff, Data Collection, Data Validation and Normalization, Database Analysis, Peer Comparison, Executive Action Plan, Results. The methodology employed by Gartner analysts enables a composite overview of IT performance to be collected and evaluated against peer organizations. A six-phase process is undertaken, starting with collection of workload and cost data for each functional area chosen. Once collected, this information is verified, corrected and normalized to ensure valid and equitable comparisons to other organizations with similar sizes, business functions, architectures and configurations. A key component of this analysis is the use of one key metric for each functional area, which leads to the formation of the Executive Report. Page 3

Analysis Methodology: Client Workload, Client Costs, Comparison of Key Metrics. Functional areas: Applications Development and Support; Mainframe Systems; Midrange Systems; Distributed Computing; Networks (Voice and Data); IT Help Desk; Packaged Enterprise Applications. The Rapid Assessment for Total IT Expenditure provides a comparison of an enterprise's key metrics with those of enterprises with similar workload drivers drawn from the Rapid Assessment for Total IT Expenditure database. Page 4

Methodology: Quantitative and Qualitative Data; Validation and Normalization Process; Database Models; Productivity Measurements; Efficiency Measurements; Quality Measurements; General Observations; Key Issues; Recommendations. Data Collection, Comparative Analysis, Insights Unique to Your Business. Page 5

Individual Analysis, Consensus Model Concepts. Cost elements: Hardware; Software; Personnel; Transmission; Disaster Recovery. Workload elements: Application Code; MIPS; Extensions; Calls/Minutes; Gigabytes; Devices; Call Volume. Page 6

About the Total IT Expenditure Assessment The Rapid Assessment for Total IT Expenditure provides a health check for H-Q's IT environment. This is a high-level look at the current environment and covers a 12-month time period. A number of functional areas have been evaluated. For each functional area selected in this analysis, a composite peer group is formed for comparison purposes. The enterprises selected have key workload characteristics similar to those of H-Q. Each functional area has a different selected peer group. The selection of the different peer groups enables Gartner Measurement to compare H-Q with other enterprises based on key metrics. These key metrics are used to provide an indication of the cost-efficiency of the H-Q IS organization. Page 7

About the Total IT Expenditure Assessment The metrics listed for the areas selected for inclusion by H-Q represent a portion of the aggregated metrics for each key IT functional area that we believe will provide the health check information that CIOs and senior IT managers require, at a minimum, to develop an IT baseline for those functional areas. These metrics set the stage for a consistent methodology enabling the accurate identification of costs and workload attributes, as well as an internal and external comparative analysis to determine how well each of the selected functional areas is performing. Page 8

Chart of Accounts, Distributed & Help Desk Costs.
Assets
  Hardware: Servers, Clients, Peripherals, Network
  Software: Operating systems, COTS, Applications, Utilities
  Management: IS Equipment, Agent Equipment
Labor
  Operations, Technical services, Planning and process management, Database management, Service desk
  Administration: Finance administration, Management
Page 9
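The account hierarchy above can be captured as a small data structure for tagging costs. This is a hypothetical sketch: the category names come from the slide, but the nesting and the helper function are our own illustration, not part of the Gartner model.

```python
# Hypothetical sketch: the slide's chart of accounts as a nested structure.
# Category names come from the slide; the grouping is our reading of it.
chart_of_accounts = {
    "Assets": {
        "Hardware": ["Servers", "Clients", "Peripherals", "Network"],
        "Software": ["Operating systems", "COTS", "Applications", "Utilities"],
        "Management": ["IS Equipment", "Agent Equipment"],
    },
    "Labor": {
        "Operations": [],
        "Technical services": [],
        "Planning and process management": [],
        "Database management": [],
        "Service desk": [],
        "Administration": ["Finance administration", "Management"],
    },
}

def leaf_categories(tree):
    """Flatten the hierarchy into 'Top > Mid > Leaf' paths for tagging costs."""
    paths = []
    for top, groups in tree.items():
        for mid, leaves in groups.items():
            if leaves:
                paths.extend(f"{top} > {mid} > {leaf}" for leaf in leaves)
            else:
                paths.append(f"{top} > {mid}")
    return paths

print(len(leaf_categories(chart_of_accounts)))  # 17 paths under this grouping
```

Each collected cost line item would be assigned to exactly one such path, which is what makes peer-to-peer cost comparison possible.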

Objectives and Scope Page 10

Objectives of the Analysis
To establish a baseline for IT expenditures and cost-efficiency
To provide a comparative analysis of expenditures against other enterprises with similar workloads, complexity and performance characteristics
To provide a health check of the IT services delivered to the IT end-user community
To identify potential opportunities for increased IT efficiency
To offer recommendations to increase IT efficiency and effectiveness
To create a foundation for a continuous program
Page 11

Scope Functional Areas Studied: IT Help Desk; Distributed Computing. Benchmark analysis covers the period from January 2003 to December 2003. $60.9M of H-Q's expenditures have been included in the consensus model. Page 12

Scope Peer groups: Chosen on the basis of comparable workload that maps to H-Q's scope of support, defined differently for each service being benchmarked. Each peer group is chosen based on its workload profile. IT Help Desk: number of inbound calls, including abandoned calls, and the overall complexity of the environment as defined by the type of calls taken. Distributed Computing: number of end users supported, number of devices in the LAN environment and the complexity of the environment. Page 13

Executive Summary Page 14

Key Summary Observations Overall, costs are 22.9% lower than what the peer groups would spend to support the same workload ($60,892K vs. $79,010K). Costs are lower than the previous benchmark in the 2 areas reviewed. Networking costs have been removed from the analysis. Overall personnel costs compare favorably with the workload peer, but there is considerable variance in staffing levels. Total FTE counts are 21.8% higher than the workload peer (365.8 FTEs vs. 300.3 FTEs). Total FTE costs in the areas reviewed are 3.4% lower ($29,610K vs. $30,650K). Average cost per FTE is 21% lower than the workload peer group ($80,944 vs. $102,053). In addition, Hydro-Quebec uses a 35-hour workweek, which could account for up to a 12% differential in total headcount, since most organizations in the database and the peer groups use a 40-hour workweek. The workload peer group is made up primarily of US companies, since these provided the best workload match. The variance does not take into account the difference between US and Canadian salaries. Page 15
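The staffing arithmetic on this slide is easy to reproduce. A minimal sketch using only figures quoted above; the 12.5% workweek differential is the simple (40 - 35) / 40 upper bound that the slide appears to round to "up to 12%".

```python
# Reproducing the staffing arithmetic from this slide (figures from the slide;
# the calculation itself is our illustration, not Gartner's model).
hq_ftes, peer_ftes = 365.8, 300.3
fte_variance = hq_ftes / peer_ftes - 1            # ~21.8% more FTEs

hq_fte_cost, peer_fte_cost = 29_610_000, 30_650_000
cost_variance = hq_fte_cost / peer_fte_cost - 1   # ~3.4% lower FTE cost

# A 35-hour week delivers 35/40 of the hours of a 40-hour week, so up to
# (40 - 35) / 40 = 12.5% more heads may be needed for the same workload.
workweek_differential = (40 - 35) / 40

print(round(fte_variance * 100, 1),
      round(cost_variance * 100, 1),
      round(workweek_differential * 100, 1))      # 21.8 -3.4 12.5
```

In other words, the workweek difference alone could explain more than half of the FTE-count gap, before any salary normalization.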

Key Summary Observations Complexity is high in the Distributed Computing area and has increased since the last analysis (11 vs. 8.8 in '99). Best Practices scores are higher than the database average in most areas. Operations management is a strength and inventory management is a weakness. End user satisfaction is very good and much improved over the previous analysis. Page 16

Total Cost by Technical Area Observations: The aggregate IT consensus model costs for H-Q for the modules included in the analysis ($60.9M vs. $79M) are $18.1M (or 22.9%) lower than what the composite peer group would spend to perform H-Q's workload. Technical Area Comparison (variance vs. peer): Distributed Computing -24.70% (lower costs); IT Help Desk -2.50% (alignment). Page 17

Total Cost by Technical Area ($K). Overall costs are $18.1M lower than the peer would need to support the same workload ($60.9M vs. $79M): Distributed Computing H-Q $54,753 vs. Peer $72,713; IT Help Desk H-Q $6,139 vs. Peer $6,297. Page 18

Total Cost by IS Cost Category Observations: H-Q's spending, by category, varies from the peers as shown below. Cost Category Comparison (variance vs. peer): Occupancy +169.09%; Transmission +167.91%; Personnel -3.39% (alignment); Software -21.74%; Hardware -33.83%; Outsourcer -78.48% (significantly lower costs by total dollar value). Page 19

Total Cost by IS Component ($K, H-Q vs. Peer): Occupancy $2,598 vs. $965; Outsourcer $1,760 vs. $8,181; Transmission $205 vs. $76; Personnel $29,610 vs. $30,650; Software $5,328 vs. $6,808; Hardware $21,392 vs. $32,330; Total $60.9M vs. $79M. Page 20
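The variance percentages on the previous slide can be recomputed from this cost table. A sketch assuming the $K figures above; Personnel, Software and Hardware reproduce the slide exactly, while Occupancy, Outsourcer and Transmission land within a fraction of a percent, presumably because the printed table is rounded.

```python
# Recomputing the variance column from the cost table above ($K figures
# from the slide; the computation is our illustration).
hq   = {"Occupancy": 2598, "Outsourcer": 1760, "Transmission": 205,
        "Personnel": 29610, "Software": 5328, "Hardware": 21392}
peer = {"Occupancy": 965, "Outsourcer": 8181, "Transmission": 76,
        "Personnel": 30650, "Software": 6808, "Hardware": 32330}

variance = {k: round((hq[k] / peer[k] - 1) * 100, 2) for k in hq}
total_variance = round((sum(hq.values()) / sum(peer.values()) - 1) * 100, 1)

print(variance["Personnel"], variance["Hardware"], total_variance)
# -3.39 -33.83 -22.9, matching the slide's figures
```

The -22.9% total confirms the executive summary's headline number directly from the component table.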

Total Full-Time Equivalents by Area Observations: The total number of full-time equivalent (FTE) personnel within the IT areas measured is 21.8% higher than the peer group FTE staff that would be required to support H-Q's workload (365.8 FTEs vs. 300.3 FTEs). FTE Count Comparison (vs. peer): IT Help Desk +40.8%; Distributed Computing +18.0%. Page 21

Total Full-Time Equivalents by Area. The chart's numbers are in-house FTEs: H-Q 365.8 vs. Peer 300.3 (IT Help Desk 71.0 vs. 50.38; Distributed Computing 294.9 vs. 249.95). The workload peer has outsourcing which includes labor; H-Q also has outsourced services which include labor. The equivalent FTEs outsourced for the workload peers are roughly: Help Desk 7.3 FTEs; Distributed 26.6 FTEs. The final normalized FTE count for the workload peer is 334.2 FTEs (the chart marks the variance as 23.3% higher). Page 22
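The peer-side normalization described above is a straightforward sum of in-house and outsourced-equivalent headcount. A sketch using only the slide's figures:

```python
# The peer-side FTE normalization described on this slide (figures from
# the slide; the arithmetic is our illustration).
peer_inhouse    = {"IT Help Desk": 50.38, "Distributed Computing": 249.95}
peer_outsourced = {"IT Help Desk": 7.3,   "Distributed Computing": 26.6}

peer_normalized = sum(peer_inhouse.values()) + sum(peer_outsourced.values())
print(round(peer_normalized, 1))  # 334.2, as stated on the slide
```

This is why comparing raw in-house headcounts alone would overstate H-Q's staffing gap: the peers carry part of their labor inside outsourcing contracts.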

Total Cost per Call - Cost Overview. Help Desk Cost Comparisons: HQ04 $20; HQ99 $21; Peer $20; Util $24; DBA $21. HQ is in alignment with the 1999 analysis; in alignment with the peer group; 16.7% lower than the utility peer group; and in alignment with the database average. Page 23

Total Cost per User - Cost Overview. Distributed Cost Comparisons: HQ04 $2,381; HQ99 $3,332; Peer $3,161; Util $3,000; DBA $2,343. HQ is 29% lower than the 1999 analysis; 25% lower than the peer group; 21% lower than the utility peer group; and in alignment with the database average. Page 24
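The percentage gaps quoted under these two unit-cost overviews can be verified from the chart values. A sketch; note the slides round to whole percents, so for example 28.5% prints as 29%.

```python
# Verifying the percentage gaps quoted under the two unit-cost charts
# (chart values from the slides; the helper is our illustration).
cost_per_call = {"HQ04": 20, "HQ99": 21, "Peer": 20, "Util": 24, "DBA": 21}
cost_per_user = {"HQ04": 2381, "HQ99": 3332, "Peer": 3161,
                 "Util": 3000, "DBA": 2343}

def pct_lower(hq, other):
    """Percent by which HQ04 is below a comparator (positive = HQ is lower)."""
    return round((other - hq) / other * 100, 1)

hd_vs_util   = pct_lower(cost_per_call["HQ04"], cost_per_call["Util"])  # 16.7
dist_vs_99   = pct_lower(cost_per_user["HQ04"], cost_per_user["HQ99"])  # 28.5 (~29%)
dist_vs_peer = pct_lower(cost_per_user["HQ04"], cost_per_user["Peer"])  # 24.7 (~25%)
dist_vs_util = pct_lower(cost_per_user["HQ04"], cost_per_user["Util"])  # 20.6 (~21%)
print(hd_vs_util, dist_vs_99, dist_vs_peer, dist_vs_util)
```

All four recomputed gaps agree with the slide's claims once rounded to whole percents.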

Best Practice Overview vs. DBA. This chart shows how H-Q compares to the database average in each of the primary Best Practice areas (H-Q score vs. db average, with the delta in parentheses; negative deltas are below the database average): Overall Average 6.1 vs. 5.3 (+0.8); Technology Planning and Process Management 6.9 vs. 5.6 (+1.3); Training 2.7 vs. 3.4 (-0.7); Customer Service 7.2 vs. 5.4 (+1.8); Asset Administration 5.4 vs. 5.5 (-0.1); Operational Management 6.8 vs. 5.9 (+0.9); Change Management 7.4 vs. 5.9 (+1.6). Page 25

Observations What's Good. H-Q has done a good job of taking real action to address the previous benchmark's issues and has made improvements across the board. Costs: Overall cost performance is better than, or in alignment with, all the peers and the database average. Cost has improved over the previous assessment ('99 ITOA) and come into better alignment with the peer groups and database average. Support Quality: User satisfaction has significantly improved over the previous assessment. The current environment seems more stable than in recent years and is contributing to this. The current quality scores show a marked improvement in quality of services and user satisfaction, and exceed the database average in all major areas. Users consider the Help Desk their best source of technical support as well as their best source of "how to" support, which is uncommon among most organizations. Page 26

Observations What's Good. Best Practices: There are a number of areas that can be considered best practices in the Hydro-Quebec IT organization. Customer Service scores are strong at 7.2 vs. 5.4 for the db. This has helped to reduce costs in the help desk area through superior leveraging of technologies and processes. Operational Management scores are strong and considered a best practice as well, at 6.8 vs. 5.9 for the db. Strengths include security and standards compliance. Security efforts are a best practice with a score of 7.5 vs. 3.8 for the db; most organizations are not as thorough where security is concerned. The overall Change Management score is very good at 7.4 vs. 5.9 for the db. Processes and technologies are superior in this area, contributing to overall lower costs while sustaining higher quality scores. Change management technology and change management processes are very strong, with scores of 6.1 vs. 3.9 and 9.0 vs. 6.9 respectively for the db. Combine this with superior deskside support (as identified by end users) and this area can be considered a best practice for H-Q (Change Management includes Processes, Deployment, Retirement & Moves, and Technology). Page 27

Observations What's Good. End Users: The majority of end users prefer the help desk. 60% say that it is the best source of technical support, vs. 52% for the workload peer, 58% for the utility peer and 50% for the db. 77% say that it is the best source of "how to" support, vs. 18% for the workload peer, 21% for the utility peer and 23% for the db. This directly contributes to reduced downtime and lost productivity for the given complexity. Page 28

Observations What to Review. As can be expected, there is always room for improvement. The following pages list the more significant points to review and address going forward. There are several areas where H-Q could gain significant improvement in processes, automation and integration. Asset Administration, with a score of 5.4 vs. 5.5 for the db: Hardware Inventory Management 4.2 vs. 5.5; Software Inventory Management 4.2 vs. 4.0; Lifecycle Management 5.5 vs. 5.7. All training, with a score of 2.7 vs. 3.4 for the db: End User Training 1.7 vs. 2.7; IS Training 3.7 vs. 4.1. Page 29

Observations What to Review. Staffing levels remain higher than all the peers and the database average; this is a driver of direct costs. Overall Help Desk cost per FTE is lower than the peers and database average: $73,829 vs. $87,942 for the workload peer, which is 16% lower; 71.0 FTEs vs. 50.4 FTEs for the workload peer, which is 41% higher; overall costs are 18.2% higher at $5,238K vs. $4,431K. Overall Distributed cost per FTE is lower than the peers and database average: $82,656 vs. $104,897 for the workload peer, which is 21% lower; 294.9 FTEs vs. 249.9 FTEs for the workload peer, which is 18% higher; overall costs are 7% lower at $24,372K vs. $26,219K. Salary levels remain lower than the peer groups and database average for these areas. Hydro-Quebec uses a 35-hour workweek, which could account for up to a 12% differential in total headcount, since most organizations in the database and the peer groups use a 40-hour workweek. Page 30
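The staffing-cost claims above can be cross-checked in a few lines. A sketch using only the figures quoted on this slide (totals in $K):

```python
# Cross-checking the staffing-cost arithmetic on this slide (all figures
# from the slide; the computation is our illustration).
areas = {
    "Help Desk":   {"hq_per_fte": 73_829, "peer_per_fte": 87_942,
                    "hq_ftes": 71.0,  "peer_ftes": 50.4,
                    "hq_total": 5_238,  "peer_total": 4_431},
    "Distributed": {"hq_per_fte": 82_656, "peer_per_fte": 104_897,
                    "hq_ftes": 294.9, "peer_ftes": 249.9,
                    "hq_total": 24_372, "peer_total": 26_219},
}

results = {}
for name, a in areas.items():
    results[name] = (
        round((1 - a["hq_per_fte"] / a["peer_per_fte"]) * 100),  # % lower per FTE
        round((a["hq_ftes"] / a["peer_ftes"] - 1) * 100),        # % more FTEs
        round((a["hq_total"] / a["peer_total"] - 1) * 100, 1),   # total cost gap, %
    )

print(results["Help Desk"])    # (16, 41, 18.2)
print(results["Distributed"])  # (21, 18, -7.0)
```

The recomputed tuples match the slide, and make the underlying point explicit: the lower cost per FTE offsets the headcount overage in Distributed Computing, but not in the Help Desk.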

Strategies for Improved Performance 1. Asset Management. Findings: Best practices as well as analysis show less than optimal performance in most areas of process maturity, automation and integration. The overall score is 5.4 vs. 5.5 for the database average. Hardware inventory management is an issue with a score of 4.2 vs. 5.5 for the db. The software inventory management score is not great, though higher than the db (4.2 vs. 4.0). Lifecycle Management is an issue with a score of 5.5 vs. 5.7 for the db. Non-optimal controls and tracking can have a wide-ranging impact on support, maintenance and costs. Asset Administration Comparisons (H-Q vs. db, with delta): Vendor Management 7.5 vs. 6.4 (+1.1); Procurement 5.8 vs. 5.7 (+0.1); Life-Cycle Management 5.5 vs. 5.7 (-0.2); Software Inventory Management 4.2 vs. 4.0 (+0.2); Hardware Inventory Management 4.2 vs. 5.5 (-1.3). Page 31

Strategies for Improved Performance 1. Asset Management (Cont). Findings: Operations staffing is higher than the peers and database average, in part due to issues around management and tracking of IT assets. Lifecycle management is basic. This contributes to: an aging inventory and cascading of assets, resulting in greater support and maintenance resource requirements (from deskside support to the help desk); a growing inventory of disparate equipment that becomes harder and harder to track; and a negative impact on productivity as equipment fails or becomes unable to perform to ever-increasing requirements. Actions: Fully funding, implementing, integrating and actually leveraging an ITIL system will greatly improve asset management. Build a plan to improve key processes in critical need of improvement, including all aspects of hardware and software inventory management, lifecycle management and procurement. Leverage the Governance Council in building a consolidated and centralized asset management foundation. Page 32

Strategies for Improved Performance 2. Distributed Complexity. Findings: Overall complexity has risen significantly since the previous benchmark; it is currently 11 (high) vs. 8.8 in '99. Staffing levels are higher in all areas reviewed to support the elevated complexity of the environment at the current quality of support. Hardware and software complexity is higher than the database average, and low HW and SW best practice scores, describing processes and technologies, are compounding the negative impact of the higher complexity. H-Q has higher complexity than the average utility organization (8.8) and higher than all but 1 peer in the utility peer group. Actions: Establish enforceable enterprise-wide standards and desktop policies that are uniform, clearly defined, in accordance with best practices and capable of having a cost-cutting impact on the enterprise. Get approval at the executive level to enforce these standards and policies; doing so will go a long way toward reducing asset costs, repair costs, troubleshooting time and support demands. Leverage the Governance Council in these efforts. Investigate other ways of reducing complexity in the Distributed environment. Page 33

Strategies for Improved Performance 3. Training. Findings: Training is lacking enterprise-wide for H-Q. 72% of users receive no training for standard desktop applications; 60% of users receive no training for alternative standard applications; 56% of users receive no training for custom business applications. The end user training best practice score is 1.7, below even the low database average of 2.7, and IS training is 3.7 vs. 4.1 for the database average. Lack of effective training will ultimately impact efficiency and productivity and drive up support costs. (Chart: percent of users receiving no training, H-Q vs. DBA, for formal, custom business, alternative standard and standard applications.) Page 34

Strategies for Improved Performance 3. Training (Cont). Actions: Review the specific training needs of the various users and departments. Make training a priority and allocate the proper funds. Establish training programs to meet the needs of the users. Best-in-class organizations spend 7 percent to 10 percent of IT payroll on training; a spending level of 5 percent can ensure a well-trained staff. Technical IT training programs typically yield a 25 percent to 75 percent return on the initial investment: for every thousand dollars spent on training, the organization adds $250 to $750 to the bottom line. Page 35
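The ROI rule of thumb above translates directly into arithmetic. A sketch; the percentages come from the slide, while the payroll figure is hypothetical and used only to illustrate the budget guidance.

```python
# The slide's training ROI rule of thumb as arithmetic (percentages from
# the slide; the payroll figure below is hypothetical, for illustration).
def training_return(spend_dollars, roi_low=0.25, roi_high=0.75):
    """Net bottom-line contribution range for a given training spend."""
    return spend_dollars * roi_low, spend_dollars * roi_high

low, high = training_return(1_000)
print(low, high)  # 250.0 750.0 -- $250 to $750 per $1,000 spent

it_payroll = 30_000_000  # hypothetical IT payroll
best_in_class_budget = (it_payroll * 0.07, it_payroll * 0.10)  # 7-10% guidance
floor_budget = it_payroll * 0.05                               # 5% floor
print(best_in_class_budget, floor_budget)
```

For a payroll of that size, the guidance implies a $2.1M to $3M training budget, with $1.5M as the minimum to sustain a well-trained staff.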