
A Case Study of Strategic Metrics Use in a CMM-Based Outsourcing Environment

Eugene G. McGuire, Computer Science and Information Systems, American University, mcguire@american.edu
Karen A. McKeown, Keane, Inc., Boston, MA, Karen_A_Mckeown@keane.com

Abstract

IT firms that specialize in outsourcing must provide assurances to their customers that they are adding value to that business relationship. The purpose of this paper is to describe a practical set of metrics that are focused on customer satisfaction and that are easily understood by both customer and developer organizations. The metrics established by Keane, Inc., a large US-based IT services firm, are based upon the goals and concepts of the Software Engineering Institute's (SEI) Capability Maturity Model (CMM) for software.

Introduction

Outsourcing is one of the fastest growing segments of the IT market. For example, IDC estimates that the number of large outsourcing contracts rose 100% between 199 and 1998, and Chris Pickering's 1998 Survey of Advanced Technology reported that 5% of organizations surveyed have significant backlogs of IT work, making outsourcing an increasingly attractive option for many CIOs. Dataquest, an IT industry research firm, estimated this market at approximately $116 billion in the U.S. and $80 billion in Europe for 1999. Industry sources believe these amounts represent approximately 20% of the total expenditures for software development and management.

Most IT-related spending is currently allocated to in-house delivered initiatives. Industry analysts, however, forecast that a greater share of this spending will rapidly shift to external service providers. Outsourcing, whether in the plan, build, or manage phases, can yield faster time to market and hence a competitive advantage in leveraging technology to achieve greater business value.

Keane, Inc. is a $1 billion IT services firm headquartered in Boston, MA, that has positioned itself to focus on the large and rapidly growing outsourcing market by integrating the SEI's CMM for software into its application development and management methodology. Keane believes that organizations will increasingly seek to outsource the management of their application software as a strategic means for achieving process improvements. Keane educates its customers that by improving the software management process, businesses can significantly improve productivity, achieve quicker development cycles, lower application support costs, and improve quality. In short, their software applications will better support their business.

Keane has sought to differentiate itself in the outsourcing marketplace by emphasizing the integration of the CMM into its application development and management methodologies. Considering the volatile nature of the outsourcing market, it is not unusual to find outsourcing arrangements that are focused simply on the lowest common denominator, i.e., providing a specified (and usually limited) set of services at a competitive cost. Keane, in comparison, strategically leverages its rigorous and comprehensive application management methodology (AMM) together with the CMM to provide customers with a strong process-driven environment that often adds significant value beyond the initial outsourcing contract. A central component of Keane's CMM strategy is the collection and strategic use of metrics in its outsourcing engagements.
The systematic collection and analysis of appropriate metrics can be an invaluable component of a rigorous feedback and control process whereby software development and maintenance organizations are able to verify that performance levels are within the bounds of established customer expectations. Metrics programs, however, have been notoriously difficult to implement in many organizations and, in many cases, have not progressed beyond simple measurements of schedule, cost, and level of effort. While these basic measurements provide some project management guidance, they are often insufficient in providing strong evidence of customer satisfaction.

The software Capability Maturity Model (SW-CMM) developed by the Software Engineering Institute (SEI) requires the basic metrics set of schedule, level of effort, size, and critical computer resources just to reach CMM Level 2. Part of the rationale behind this set of metrics is that measurement baselines need to be established for individual projects so improvement goals can be set for each project in these areas. At CMM Level 3, the Software Engineering Process Group (SEPG) systematically analyzes this data, which now resides in an organizational database, to design and implement organization-wide improvement plans that target these specific areas, e.g., increased schedule control and predictability.
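To make this concrete, here is a minimal sketch of the kind of baseline analysis an organizational measurement database enables; the project data, field names, and variance formula are illustrative assumptions, not Keane's actual repository:

```python
from statistics import mean, pstdev

# Per-project schedule baselines: planned vs. actual duration in weeks.
# All data are invented for illustration.
project_db = {
    "P101": {"planned_weeks": 12, "actual_weeks": 15},
    "P102": {"planned_weeks": 20, "actual_weeks": 21},
    "P103": {"planned_weeks": 8,  "actual_weeks": 12},
}

def schedule_variance(p):
    """Relative schedule slip: positive means the project ran late."""
    return (p["actual_weeks"] - p["planned_weeks"]) / p["planned_weeks"]

variances = {pid: schedule_variance(p) for pid, p in project_db.items()}

# An organization-wide baseline an SEPG might use to set improvement
# goals, e.g., tightening mean slip and its spread over time.
baseline = {"mean_slip": mean(variances.values()),
            "spread": pstdev(variances.values())}

print({pid: f"{v:.0%}" for pid, v in variances.items()})
print(f"mean slip {baseline['mean_slip']:.0%}, spread {baseline['spread']:.2f}")
```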

While these measurements and the improvement efforts they drive are certainly translatable back to customer satisfaction, schedule is only one quality area in which customers now have high expectations. The CMM Level 4 Key Process Areas of Quantitative Process Management and Software Quality Management drive software development and maintenance organizations to more fully identify and then meet customer expectations of quality. The data collected and analyzed by higher maturity organizations are frequently used to educate and fully inform the customer on standard control limits, variations away from these control limits, and courses of corrective action for when these variations occur. As a result, these metrics are highly influenced by customer expectations of quality in many areas.

This paper presents a set of metrics that can be gathered while organizations are at Levels 2 and 3 of the CMM but that are also highly useful for Level 4 efforts. These metrics are focused on maintaining control over customer expectations by providing both developer and customer organizations with an ongoing report of contract compliance.

Background

There has been a good amount of recent discussion on the practical implementation and use of metrics as organizations attempt to gain a quantitative understanding of their software projects. Daskalantonakis (1992) provides a multidimensional view of metrics that encompasses usability, categories, users, user needs, and levels of metrics in the context of a widespread and successful organizational metrics program. His conclusion is that metrics can only show problems; it is the actions taken as a result of analyzing the measurement data that produce results. Schneidewind (1992) proposes a comprehensive metrics validation methodology to integrate quality factors, metrics, and quality functions. Criteria such as consistency, predictability, and repeatability are identified as critical to the success of a metrics program.

Metrics programs are currently receiving increased attention as many organizations attempt to achieve Level 4 in the CMM (Chatmon & Holden, 1999; Felschow et al., 1999; Florence, 1999; Harvey, 1999; Natwick, 1999; Purcell, 1999). These authors all describe current efforts at implementing metrics programs within their organizations. Common themes include identifying the business value of the metrics, establishing quality goals, and ensuring that the data provide consistent information.

The following sections of this paper present the practical implementation of a rigorous metrics program in place at Keane, Inc., an international IT solutions firm whose objective is to help clients plan, build, and manage application software. The major components of Keane's metrics program are the Project Control and Reporting Process, the Project Status Display Workbook, Quality of Service Reports, Service Level Agreement Reports, and Software Quality Assurance Audit Reports.

Project Control and Reporting Process

The foundation of Keane's metrics program is the Project Control and Reporting Process (PCRP). The process was developed to provide management with a snapshot of compliance with corporate project management standards and to obtain an early indication of issues that may impact cost, schedule, or quality. PCRP standards identify critical measurement points before, during, and after a project and set the stage for on-time, on-budget delivery of a quality product.
Since Keane's various methodologies are built around a common four-phase framework, the PCRP was similarly configured to facilitate the establishment and execution of quality and measurement checkpoints. Adherence to the standards is quantified on a project report card, using a scale of 1 (poor/unacceptable) to 4 (excellent/fully meets requirements), and is summarized at the branch and corporate levels. This provides a point-in-time view of project progress and compliance at all levels of the organization. Projects rated below a defined minimum score are placed on a corporate watch list and must develop and execute a plan to bring the project back within acceptable limits. Ratings are performed and report cards issued quarterly or at the completion of a project phase.
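A hedged sketch of how such report-card ratings might roll up to branch summaries and trigger the watch list follows; the 1-4 scale matches the text, but the minimum score, data shapes, and rollup rule are assumptions for illustration:

```python
from statistics import mean

# Illustrative only: ratings use the 1-4 PCRP report-card scale described
# above; the threshold and project data are invented.
WATCH_LIST_MINIMUM = 2.5  # assumed corporate minimum score

projects = {
    # project id: (branch, [ratings for audited PCRP attributes])
    "P101": ("Boston", [4, 3, 4, 3]),
    "P102": ("Boston", [2, 2, 3, 1]),
    "P103": ("Chicago", [3, 4, 4, 4]),
}

def report_card(ratings):
    """A project's report-card score: the mean of its attribute ratings."""
    return mean(ratings)

# Projects scoring below the minimum go on the corporate watch list and
# must execute a corrective action plan.
watch_list = [pid for pid, (_, ratings) in projects.items()
              if report_card(ratings) < WATCH_LIST_MINIMUM]

# Branch-level rollup gives the point-in-time view described above.
branches = {}
for pid, (branch, ratings) in projects.items():
    branches.setdefault(branch, []).append(report_card(ratings))
branch_summary = {b: round(mean(scores), 2) for b, scores in branches.items()}

print(watch_list)       # ['P102']
print(branch_summary)   # {'Boston': 2.75, 'Chicago': 3.75}
```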

Phase 1: Proposal Development

As a proposal for services is being developed, the risks associated with the project are assessed, quantified, and graphically represented using the Project Risk Assessment Method (PRAM) Profile. The PRAM provides measurements that may indicate adjustments to a proposed estimate or schedule.

[Figure: Sample PRAM Profile. For each risk variable (e.g., SME availability; review and signoff of deliverables; vendor inventory completeness/accuracy), the profile plots a positive and a negative impact range on a scale from -10% to +30% (e.g., -5% to +10% for SME availability, -5% to +5% for deliverable signoff, -5% to +20% for vendor inventory) and pairs each with a mitigation action (schedule SMEs in advance; weekly review and focus on signoff; client identifies and prioritizes vendors up front). The combined impacts yield a total change-of-scope budget, 20.00% in this sample.]

Phase 2: Project Initiation

During the project initiation phase, the project management environment is established, the defining documents (i.e., statement of work or service level agreement) and project plan are prepared, and the PRAM Profile is re-evaluated. Ratings are applied to each of these deliverables.

Phase 3: Project Execution

Throughout the project execution phase, PCRP monitors and reports on the following attributes:

- team status meetings
- weekly project status report
- weekly status review with client
- maintenance of a project notebook
- project plan updates
- change control procedures
- acceptance procedures
- Project Status Display (PSD)/trend reporting (see below)
- monthly branch project review
- branch support
- client satisfaction

Phase 4: Post Project Summation

At the conclusion of a project, PCRP requires that all deliverables have been formally accepted by the client, that the project notebook and other key assets used to manage the project have been archived, and that a lessons learned document has been prepared by the project manager.

Project Status Display Workbook

The Project Status Display (PSD) Workbook is a tool that enables project managers to track and report project status and financial results at a deliverable level and provides client management with a summarized view of the project on a weekly basis. Based on data from the project plan, the PSD is maintained as an Excel workbook consisting of six worksheets:

- Project & Billing Information: General information for the initiation of the project is recorded, including a project number, client number, and other standard information that will be used as headings for the other sheets in the workbook.
- Planned: Includes planned resources, billing rates, and weekly hours.
- Actual: Records actual resources assigned, billing rates, and actual hours spent on the project. The estimated hours to complete are captured and used to project variances.
- Project Status Summary (PSS) Data Sheet: For each deliverable in the project plan, the estimated effort hours and cost, actual effort hours and cost, client acceptance, and any change control applied are updated weekly.
- Formatted PSS: A tabular report computed from the data sheet.
- Summary Sheet: A graphical and tabular summary of the project's value and actual costs, an analysis of variance, and notes related to change control.

Significant variations between planned and actual performance must be addressed by project management through a formal action plan. The tabular summary below contains the following computed fields:

- Original Contract Value: The total original estimate of effort and value approved at contract award.
- Total Approved Changes: Total effort and value of approved changes to be performed under change control terms.
- Total Current Estimate: Original Contract Value plus approved changes.
- Activity to Date: Actual effort hours and associated value (billing rate x hours), as well as any non-effort value expended to date on all products in the project plan.
- Estimate to Complete: The effort and associated value, as well as any non-effort associated value, remaining to be expended on all products in the project plan.
- Forecast Total: The sum of the Activity to Date and the Estimate to Complete.
- Project Variance: The total variance between all estimated and all actual effort and value expended.
- Earned Value of Approved Products: The value of all delivered products' original estimates plus their approved change estimates. This does not reflect actual costs incurred (as computed in the Activity calculations). Typically used for fixed-price or flat monthly billing, where the value of approval is based on planned rather than actual effort.
- Actual Value of Approved Products: The actual value of all delivered products. This does not reflect actual costs incurred from the activity calculations. Typically used for time-and-materials billing projects, where the value of approval is based on actual rather than planned effort.
- Current Project Billing: For time-and-materials projects, the cumulative billing amount through the As of Date of the project.
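Because these fields are defined arithmetically, a short sketch can make their relationships explicit. The field names follow the definitions above and the example values match the sample summary sheet shown below; the class itself is an illustration, not Keane's workbook logic:

```python
from dataclasses import dataclass

@dataclass
class ProjectSummary:
    """Illustrative PSD summary fields, in dollars."""
    original_contract_value: float  # approved at contract award
    total_approved_changes: float   # net value of approved change items
    activity_to_date: float         # billing rate x hours, plus non-effort costs
    estimate_to_complete: float     # value remaining to be expended

    @property
    def total_current_estimate(self) -> float:
        # Original Contract Value plus approved changes.
        return self.original_contract_value + self.total_approved_changes

    @property
    def forecast_total(self) -> float:
        # Sum of Activity to Date and Estimate to Complete.
        return self.activity_to_date + self.estimate_to_complete

    @property
    def project_variance(self) -> float:
        # Variance between estimate and forecast; positive means under budget.
        return self.total_current_estimate - self.forecast_total

# Example using the figures from the sample summary sheet below.
psd = ProjectSummary(21_600, -10, 6_000, 11_600)
print(psd.total_current_estimate)  # 21590
print(psd.forecast_total)          # 17600
print(psd.project_variance)        # 3990
```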

[Figure: Sample PSD Summary Sheet for an illustrative project (Branch 333, Client 1111, Project 222, Project Manager Joe Cool, as of 10/4/9), including Current Project Value and Deliverable Plan charts plotted by project week. The tabular summary reads:]

Project Summary                     Dollars    Days
Original Contract Value             $21,600    42
Total Approved Changes              ($10)      1
Total Current Estimate              $21,590    43
Activity to Date                    $6,000     10
Estimate to Complete                $11,600    17
Forecast Total                      $17,600    27
Project Variance                    $3,990     16
Earned Value of Approved Products   $10,000
Actual Value of Approved Products   $6,000
Current Project Billing             $10,000

The Current Project Value chart shows the following data plotted as dollars (Y-axis) over time (X-axis):

- Current Contract Value: For all products, shows the Project Budget.
- Expended to Date: For delivered products only, shows the actuals through the As-of-Date.
- Estimate to Complete: Shows Project Forecast estimates from the As-of-Date to the end of the project.

The Deliverable Plan chart shows the following data plotted as dollars (Y-axis) over time (X-axis):

- Plan Value: For all products, shows the Current Estimate value of each product at its planned delivery date.
- Earned Value: For delivered products only, shows the Current Estimate value of products already delivered by the As-of-Date. The value is the sum of each product's original estimate plus any approved change estimates.
- Actual Value: For delivered products only, shows the value of the effort actually expended on delivered and accepted products.

Quality of Service Surveys

Although customer satisfaction is one of the attributes that is regularly monitored and quantified through PCRP audits, its focus is typically at the client sponsor level. Quality of Service surveys are intended to solicit feedback from end users, whose perspective on quality and satisfaction may differ significantly from that of client management. Surveys are distributed to individuals in customer business units at predefined intervals or at completion of a deliverable. The survey consists of a standard set of questions designed to assess what went well and what did not during the specified period, so that best practices and opportunities for improvement can be identified and addressed. End users are asked to rate the quality of service provided on a scale of 1 (poor/unacceptable) to 5 (excellent/exceeds expectations). Typical questions include:

- To what extent were expectations met?
- How well were requirements met?
- What is your satisfaction with the professionalism of the team?
- To what extent were you kept informed of the status of your request?
- Was your request fulfilled properly the first time?
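As an illustration of how such ratings might be aggregated per survey period, consider the following sketch; the question keys, sample responses, and the 3.0 improvement threshold are invented for the example:

```python
from statistics import mean

# Each response maps a survey question to an end-user rating,
# 1 (poor/unacceptable) .. 5 (excellent/exceeds expectations).
responses = [
    {"expectations_met": 4, "requirements_met": 5, "professionalism": 5,
     "kept_informed": 3, "right_first_time": 4},
    {"expectations_met": 3, "requirements_met": 4, "professionalism": 4,
     "kept_informed": 2, "right_first_time": 5},
]

def question_averages(responses):
    """Average rating per question across all returned surveys."""
    questions = responses[0].keys()
    return {q: round(mean(r[q] for r in responses), 2) for q in questions}

averages = question_averages(responses)
# Low-scoring questions point at improvement opportunities;
# high-scoring ones are candidate best practices.
improvement_candidates = [q for q, score in averages.items() if score < 3.0]

print(averages)
print(improvement_candidates)  # ['kept_informed']
```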
Service Level Agreement Metrics

The Service Level Agreement (SLA) is an essential tool for managing service-based projects. It defines the scope and objectives of the project in terms of the services that will be provided and helps to guarantee a mutual commitment between Keane and the customer. The SLA establishes the volume of work products that will be delivered, the priority of the services provided, and acceptance criteria for responsiveness and quality of the deliverables. It becomes the reporting vehicle for performance measurement and provides the opportunity to identify service level improvements throughout the project. Below are suggested minimum metric components of an SLA.

[Table: Suggested minimum SLA metric components, organized by activity (Production Support, User Support, Maintenance Requests, Enhancement Requests, Development Requests, Management Control) across four dimensions: Cost, Cycle Time, Quality, and Volume. Example metrics include % hours, average # calls, response time, hours of operation, average response time (on/off shift), average time to resume business, % complete by due date, # requests completed, and # defects per request.]

The project manager typically reports performance against SLA commitments to the customer and to Keane corporate on a monthly basis. Trends over time are used to track productivity and performance improvements. As shown in the sample chart below, process improvement activities such as root cause analysis resulted in a significant decrease in production support effort hours over the course of three years.

[Chart: Production Support Hours by month (Jan through Dec) across three years, showing a steady year-over-year decline.]

Software Quality Assurance Audits

A Software Quality Assurance (SQA) Plan is developed at the beginning of a project, in conjunction with the project plan, to identify the quality checkpoints. SQA audits focus primarily on compliance with defined processes. To provide maximum business value, the processes to be included in the audit schedule are mutually agreed to by SQA and project management. Standard processes incorporated into all SQA Plans include audits and/or reviews of peer reviews, software configuration management, project plans and/or service level agreements, statements of work and other defining documents, and the preparation and execution of test plans. Other process audits more specifically related to the project are added to the plan as necessary and appropriate.

Non-compliance issues identified during an audit are analyzed to determine whether:

- any steps in the process were skipped
- any steps not defined in the process were performed
- the order of execution was changed

Analysis of these points provides a solid basis for determining whether process improvements may be indicated or additional training for the team may be required. SQA is responsible for making recommendations to the SEPG, which has the authority to act on these recommendations.

Additional SQA responsibilities include tracking, trending, and analysis of defects identified at various stages of the development lifecycle. The major classifications of defects tracked are:

- Defects identified through peer reviews (# of defects, type, severity, SDLC phase), as a means of providing management with insight into areas where process improvements may be indicated or additional training for the team is needed.
- Defects discovered during the course of a process audit (#, type, severity).
- Defects discovered during any phase of testing (# of defects, type, severity).
- Defects identified by the end user during acceptance (#, type, severity).
- Production rework, defined as defects discovered after a deliverable has been placed in production (# of items returned, type, origination).

Analysis of the phase in which defects were discovered should prompt SQA and the SEPG to investigate where earlier defect identification efforts were inadequate, so that those efforts can be improved to incorporate additional quality control checkpoints.
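The phase-of-discovery analysis can be sketched as a simple tally; the counts and the notion of a "late-discovery rate" here are illustrative assumptions rather than Keane's published measures:

```python
# Defects tallied by the phase in which they were discovered, per the
# classifications above (counts are invented for illustration).
discovered = {
    "peer_review": 12,
    "process_audit": 1,
    "test": 4,
    "acceptance": 2,
    "production_rework": 3,
}

total = sum(discovered.values())

# Share of defects escaping past peer reviews and audits into test or later.
late_phases = ("test", "acceptance", "production_rework")
escape_rate = sum(discovered[p] for p in late_phases) / total

print(f"total defects: {total}")
print(f"late-discovery rate: {escape_rate:.0%}")
# A rising late-discovery rate signals that earlier defect identification
# (peer reviews, audits) needs additional quality control checkpoints.
```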

[Figure: Monthly defect counts by detection point, Jun-98 through Oct-98. Peer Review: 10, 12, 10, 5; Audit Discovery: 0, 0, 1, 0, 0; Test: 3, 1, 1, 2, 2; Acceptance: 0, 1, 0, 2, 2; Production Rework: 2, 3, 2, 4, 6.]

Evaluating Metrics

No metric is useful unless the organization can identify the business value it provides. Frequently cited indicators of business value for metrics are (Humphrey, 1989; Paulk, 1999):

- Is the metric a good indicator of how well the process is performing, e.g., an indicator of efficiency or effectiveness?
- Can the values for this metric be predictably changed by changing the process or how the process is implemented?
- Can the metric be consistently reproduced by different people?
- Can data be collected and analyzed such that you can predict and/or control process performance?
- Is the data relatively easy and cost-effective to obtain?
- Is the metric one that the customer thinks is an important indicator of process and/or product quality, e.g., an indicator of reliability?
- Is the metric one that the customer requires be reported?
- Is the metric one that the end user thinks is an important indicator of process and/or product quality, e.g., an indicator of usability?
- Is the metric one that senior management thinks is an important indicator of process and/or product quality?
- Is the metric one the organization requires to be reported, i.e., one of the common, standard measures defined for the organization?
- Is the metric one that the project manager thinks is an important indicator of process and/or product quality, e.g., an indicator of progress?

Conclusion

Metrics have little value unless they are aligned with the business objectives of the organization at large and are useful and consistent at the project level. In addition, customer satisfaction plays an increasingly large role in quality measures. As organizations attempt to progress up the CMM maturity levels, they must ensure that they are capturing useful metrics, analyzing them in a consistent manner, and then taking appropriate actions as a result of the analyzed data. The metrics framework presented in this paper illustrates how one large IT consulting organization is using metrics to provide both internal and customer-focused feedback on core operating procedures. It is also clear that this metrics framework meets many, if not all, of the evaluation criteria specified in the previous section.

References

Chatmon, A.I. & Holden, J.J. Lessons Learned for Level 4 Piloting to Achieve Data and Process Consistency.

Daskalantonakis, M.K. A Practical View of Software Measurement and Implementation Experiences Within Motorola, IEEE Transactions on Software Engineering, Vol. 18, No. 11, November 1992, pp. 998-1010.

Felschow, A.R., et al. Developing a Vision of the Future.

Florence, A. Capability Maturity Model Level 4 Managed & Level 5 Optimizing, Proceedings of 1999 SEPG Conference, Atlanta, GA, 1999.

Harvey, D. Software Metrics in Use, Proceedings of 1999 SEPG Conference, Atlanta, GA, 1999.

Humphrey, W.S. A Discipline for Software Engineering, Addison-Wesley, Reading, MA, 1989.

Natwick, G. Goal-Driven Metric Development.

Paulk, M.C. Practices of High Maturity Organizations.

Purcell, L. A Customer Quality Focused Approach to Achieving SEI Level 4 (The Managed Level), Proceedings of 1999 SEPG Conference, Atlanta, GA, 1999.

Schneidewind, N.F. Methodology for Validating Software Metrics, IEEE Transactions on Software Engineering, Vol. 19, No. 5, May 1992, pp. 410-422.