How Apdex Helps Performance Management Benchmarks


Peter Sevcik and NetForecast, Inc. All rights reserved.

How Apdex Helps Performance Management Benchmarks
Apdex Symposium 2008, Las Vegas, December 9, 2008
CMG Session 308
Peter Sevcik, NetForecast, Inc., 955 Emerson Drive, Charlottesville, VA
Voice, Video and Data Application Performance Experts

Outline
- Performance Management Model
- Benchmarks Verify Model
- How Apdex Improves Benchmarks
- Benchmarks: State of the Art
Slide 2

Performance Management Defined
Performance Management (PM): A holistic approach to managing application performance that integrates all of your organizational resources with one overarching goal: to deliver application performance levels that meet the needs of your business
Slide 3

PM Layers
[Diagram: Performance Management (PM) layers connecting Corporate Management (top) to Corporate Assets (bottom): People, Process, Product, Users, and Infrastructure, with Actions, Information, Requirements, and Data flowing between the layers]
Slide 4

Filling in the Layers
[Diagram: people perform the essential actions; tools support the people; optimal tools support both, drawing on requirements, information, and data]
Slide 5

Define the Strategy First and Architecture Second
- Correct way to build: Strategy, then Architecture, then Tools
- Most tool vendors ignore this or say "Don't worry, it is all automatic"
- Most enterprises are on this path
Slide 6

Performance Management Model
[Diagram: the PM cycle: Understand, Measure, Communicate, Link]
Slide 7

People: Is your staff focused on the correct goals?
- Understand: Know your applications, users, and requirements
- Measure: Properly measure key aspects of application performance
- Communicate: Provide relevant performance reports to management
- Link: Show specific business-performance links
[Diagram: Understand, Measure, Communicate, and Link as a continual service improvement cycle]
Slide 8

Process: Do you have proper procedures in place?
- Incident Management: Turn events into actionable alarms with good diagnostic capability; the goal is to restore normal service operations as quickly as possible
- Availability Management: Fundamental reporting of system health; historic data determines trends
- Capacity Management: Match costs to business needs by adding or removing resources; what-if analysis and planning are key
- Service Assurance Management: Service levels are tailored to meet business needs; continually monitor and coordinate corrective action
Slide 9

Outline
- Performance Management Model
- Benchmarks Verify Model
- How Apdex Improves Benchmarks
- Benchmarks: State of the Art
Slide 10

NetForecast PM Benchmarks
- Second annual survey of enterprises to characterize their current APM practices and assess results
- Surveys completed in February 2007 and July 2008
- Sample size of approximately 300 enterprises both years
- Survey had questions on performance management methodology along with effectiveness metrics: how problems are found, time to solve problems, performance meeting business needs, availability, response time
- Benchmark and effectiveness scores were determined for each enterprise
Slide 11

Results
[Chart: composite performance effectiveness metrics (0=poor, 10=excellent) vs. NetForecast benchmark score (0=poor, 10=excellent) for the Understand, Measure, Communicate, and Link benchmarks]
- Gain is the slope of the curve: the return in performance delivered (vertical axis) for the investment in the benchmark (horizontal axis); see the sketch below
Slide 12
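Since gain is defined as a slope, it reads as effectiveness points returned per benchmark point invested. Below is a minimal sketch of fitting such a slope from survey pairs; the least-squares choice and the sample data are illustrative assumptions, not NetForecast's published method.

```python
def gain(benchmark_scores, effectiveness_scores):
    """Least-squares slope: effectiveness points gained per benchmark point."""
    n = len(benchmark_scores)
    mean_x = sum(benchmark_scores) / n
    mean_y = sum(effectiveness_scores) / n
    num = sum((x - mean_x) * (y - mean_y)
              for x, y in zip(benchmark_scores, effectiveness_scores))
    den = sum((x - mean_x) ** 2 for x in benchmark_scores)
    return num / den

# Illustrative pairs only; a slope near 0.5 would read as a "~50% gain"
print(gain([2, 4, 6, 8], [3.0, 4.0, 5.1, 6.1]))  # 0.52
```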

Results
[Chart: composite performance effectiveness metrics (0=poor, 10=excellent) vs. NetForecast benchmark score (0=poor, 10=excellent), annotated with % gain]
Slide 13

The PM Model Works
[Diagram: Strategy, Architecture, Tools pyramid; the strategy level shows a 51% gain, the tools level a 24% gain]
Slide 14

Outline
- Performance Management Model
- Benchmarks Verify Model
- How Apdex Improves Benchmarks
- Benchmarks: State of the Art
Slide 15

Apdex Defined
- Apdex is a numerical measure of user satisfaction with the performance of enterprise applications
- It defines a method that converts many measurements into one number
- Uniform 0-1 scale: 0 = no users satisfied, 1 = all users satisfied
- Standardized method that is a comparable metric across applications, measurement approaches, and enterprises
Slide 16

How Apdex Works
1. Define T for the application: T = target time (satisfied-tolerating threshold); F = 4T (tolerating-frustrated threshold)
2. Define a Report Group: application, user group, time period
3. Extract the data set from existing task response time measurement samples
4. Count the number of samples in the three zones: Satisfied (0 to T), Tolerating (T to F), Frustrated (above F)
5. Calculate the Apdex formula (see the sketch below): Apdex_T = (Satisfied + Tolerating/2) / Total samples
6. Display the Apdex value showing T, e.g. 0.91 [6]
7. Optionally display the value using quality colors: Excellent (0.94 to 1.00), Good (0.85 to 0.94), Fair (0.70 to 0.85), Poor (0.50 to 0.70), Unacceptable (0.00 to 0.50)
Slide 17

Apdex Example
- Major ecommerce site ($4B annual on-line sales)
- North American broadband users accessing the San Francisco data center
[Chart: probability of experiencing a given load time of a typical business page (sec): 52% Satisfied, 42% Tolerating, 6% Frustrated]
- This site had an average response time of 4 seconds, so it looked like all was well
- But: Apdex = 0.52 + 0.42/2 = 0.73 = Fair
Slide 18
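The counting rules and formula on Slide 17 translate directly into code. Here is a minimal sketch in Python; the function names and sample response times are illustrative, not part of the Apdex standard.

```python
def apdex(samples, t):
    """Apdex score for task response times (seconds) at target time T."""
    f = 4 * t  # tolerating-frustrated threshold, F = 4T
    satisfied = sum(1 for s in samples if s <= t)
    tolerating = sum(1 for s in samples if t < s <= f)
    return (satisfied + tolerating / 2) / len(samples)

def rating(score):
    """Map an Apdex score to the Slide 17 quality colors."""
    if score >= 0.94: return "Excellent"
    if score >= 0.85: return "Good"
    if score >= 0.70: return "Fair"
    if score >= 0.50: return "Poor"
    return "Unacceptable"

# T = 6 seconds; the display convention is "score [T]", e.g. 0.91 [6]
times = [1.2, 2.8, 3.9, 4.4, 5.7, 6.5, 7.1, 9.0, 14.2, 30.0]
score = apdex(times, t=6.0)
print(f"{score:.2f} [6] -> {rating(score)}")  # 0.70 [6] -> Fair
```

Applied to the Slide 18 distribution, the same formula gives 0.52 + 0.42/2 = 0.73, which rating() maps to Fair even though the 4-second average looked healthy.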

Apdex Methodology
[Diagram: Gather, Agree, Measure, Improve as a continual performance & process improvement cycle]
- Gather: Gather baseline data; test Apdex parameters
- Agree: 1) Work with users to set T; 2) Work with business managers to define the service objective A (see the sketch after Slide 20)
- Measure: Apdex reports & trends
- Improve: Improve performance where needed
Slide 19

Keys to PM Success
[Diagram: Collaboration and Dependability spanning the model elements: Understand, Measure, Communicate, Link; Incident, Availability, Capacity, Assurance]
Slide 20
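Returning to the methodology on Slide 19: a small sketch of the Agree/Improve loop, assuming a hypothetical service objective A of 0.85 (the value and the helper function are illustrative, not from the slides).

```python
A = 0.85  # hypothetical service objective agreed with business managers

def needs_improvement(period_scores, objective=A):
    """Flag the application for the Improve step when any reporting
    period misses the agreed service objective."""
    return any(score < objective for score in period_scores)

print(needs_improvement([0.91, 0.88, 0.79]))  # True: the third period missed A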

Leveraging Apdex for Success
- The Apdex methodology forces organizations to achieve better PM
[Diagram: the Apdex benefit zone spanning Collaboration and Dependability across the model elements: Understand, Measure, Communicate, Link; Incident, Availability, Capacity, Assurance]
- Apdex is a numerical measure of application performance relative to business objectives
- The Apdex open standard is managed by the Apdex Alliance (see apdex.org)
- The Apdex Alliance has more than 800 individual and 10 vendor members
Slide 21

How Apdex Helps PM
- Apdex dialog supports collaboration
- Open standard supports dependability
- Selection of T and the Apdex score is the link to business needs
- Excellent foundation for meaningful SLAs
- Facilitates communication between technologists and non-technologists
- Trend analysis helps find capacity problems (see the sketch below)
- Defines a measurement taxonomy
- Forces you to understand your applications and users
- Can be applied to availability analysis
- Discovers hard-to-find incidents
Slide 22
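To make the capacity point concrete: a sketch that rolls response-time samples up into one Apdex score per day, so a sustained downward trend surfaces a developing capacity problem. The (day, seconds) data layout is an assumption, and the apdex() helper from the earlier sketch is restated so this runs on its own.

```python
from collections import defaultdict

def apdex(times, t):
    """Slide 17 formula: (satisfied + tolerating/2) / total samples."""
    f = 4 * t
    satisfied = sum(1 for s in times if s <= t)
    tolerating = sum(1 for s in times if t < s <= f)
    return (satisfied + tolerating / 2) / len(times)

def daily_apdex(samples, t):
    """Roll (day, response-seconds) samples up to one Apdex score per day."""
    by_day = defaultdict(list)
    for day, seconds in samples:
        by_day[day].append(seconds)
    return {day: apdex(v, t) for day, v in sorted(by_day.items())}

data = [("Mon", 2.0), ("Mon", 3.1), ("Tue", 5.5), ("Tue", 8.0),
        ("Wed", 9.0), ("Wed", 30.0)]
print(daily_apdex(data, t=4.0))  # {'Mon': 1.0, 'Tue': 0.5, 'Wed': 0.25}
```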

Outline
- Performance Management Model
- Benchmarks Verify Model
- How Apdex Improves Benchmarks
- Benchmarks: State of the Art
Slide 23

What Are the Benchmarks?
Understand
- Have a good definition of which technical parameters are important
- Good understanding of which applications are mission critical
- Good understanding of which users are more important
- How well the above is documented
- How well these understandings are distributed within the IT organization
Measure
- How well the important technical parameters are measured
- How well the measurements are tracked over time
- Setting critical thresholds for the important parameters
- Formal processes to ensure that these measurements are carried out
- Automation for efficient data gathering and correlation
Slide 24

What Are the Benchmarks? (cont.)
Communicate
- Provide relevant performance reports to management
- Ensure that the reports are meaningful to non-IT staff
- Measurements are communicated throughout the enterprise
- Distribute information on a regular basis
- Don't limit information thinking the recipient won't understand it
Link
- Performance targets are agreed upon with business managers
- Applications monitored are confirmed to be business critical
- Performance targets are confirmed to be relevant to the business
- Application-level SLAs are in place
- Meet with business managers periodically to review the above
Slide 25

Benchmarks: Industry State of the Art
[Chart: percent distribution of enterprises by benchmark score, with overall benchmark medians for 2007 and 2008]
- State of the art improved 10% in a year
Slide 26

Private/Public Traffic in 2008
[Chart: median benchmarks vs. private/public traffic ratio: 100/0, 75/25, 50/50, 25/75, 0/100]
- Private vs. public makes no difference: all within a tight 0.3-point range
Slide 27

Impact of Enterprise Size in 2008
[Chart: median benchmarks vs. number of employees in the enterprise: <1K, 1K-10K, 10K-40K, >40K]
- Enterprise size has a 2-point range from small to big
Slide 28

Benchmarks Compared
[Chart: median benchmark score for Understand, Measure, Communicate, and Link]
Slide 29

Why Measure Is Less Than Understand
[Chart: contribution to the difference between Understand and Measure by measurement type: end-user response time, server errors, server query-response time, availability, server utilization, TCP transaction response time, transactions processed, traffic flows by application, WAN performance, bandwidth utilization]
- Overall shift of 2 benchmark points, from 6 to 4, on the previous slide
Slide 30

Real Company Examples
- Individual enterprise profiles are diverse
[Chart: Understand, Measure, Communicate, and Link benchmark profiles for real enterprises: government, consumer website, computers, telecom equipment, hospitality, pharma, healthcare, bank, airline]
Slide 31

NetForecast Benchmark Service
- Benchmark how well your enterprise is delivering the strategy and essential tools
- Rank your benchmark scores against industry norms (database of more than 600 enterprises)
- Gap analysis shows where you excel or are deficient
- Best Practices Workshop: a meeting that defines your successful performance management strategy
Slide 32

Thank You
More information: free articles and reports on performance measurement, analysis, and management are available from NetForecast.