
Software Quality Assurance Days: International Conference of Software Quality Assurance. sqadays.com

Performance Requirements
Alexander Podelko, Oracle. Stamford, CT, USA
Minsk, May 25-26, 2018

Introduction
- The topic is more complicated than it looks
- Performance requirements are supposed to be tracked through the whole system lifecycle
- Each group of stakeholders has its own view and terminology
- This talk is an overview of existing issues and an attempt to create a holistic view

Disclaimer: The views expressed here are my personal views only and do not necessarily represent those of my current or previous employers. All brands and trademarks mentioned are the property of their owners.

SDLC and the Performance Engineering Life Cycle
- Requirements -> Performance Requirements (the subject of this talk)
- Architecture and Design -> Design for Performance and Performance Modeling
- Construction / Implementation -> Unit Performance Tests and Code Optimization
- Testing -> Performance Testing
- Deployment and Maintenance -> Performance Monitoring and Capacity Management

Agenda
- Metrics
- Elicitation
- Analysis and Specification
- Validation and Verification

High-Level View of a System
- Users
- Software
- Hardware

Business Performance Requirements
For today's distributed business systems:
- Throughput
- Response / processing times
All are important.

Throughput
- The rate at which incoming requests are completed
  - Usually we are interested in a steady mode
- Straightforward for homogeneous workloads
  - Not so easy for mixed workloads: the mix ratio can change over time
- Varies with time
  - Typical hour, peak hour, average, etc.

Number of Users
- The number of users by itself doesn't define throughput
  - Not without defining what each user is doing and how intensely
- 500 users running one short query each minute: a throughput of 30,000 queries per hour
- 500 users running one short query each hour: a throughput of 500 queries per hour
- The same 500 users, a 60x difference between the loads (worked through in the sketch below)
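To make the arithmetic above concrete, here is a minimal sketch using the hypothetical numbers from the slide:

```python
# A minimal sketch of the arithmetic above: throughput implied by a
# user count and a per-user request rate (numbers from the slide).

def throughput_per_hour(users: int, requests_per_user_per_hour: float) -> float:
    """Throughput = number of users x per-user request rate."""
    return users * requests_per_user_per_hour

print(throughput_per_hour(500, 60))  # one query/minute each -> 30000.0 per hour
print(throughput_per_hour(500, 1))   # one query/hour each   -> 500.0 per hour
                                     # same 500 users, a 60x difference
```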

Concurrency
Number of simultaneous users or threads:
- Number of active users
  - They take resources even if they are doing nothing
- Number of named users
  - Rather a data-related metric
- Number of really concurrent users
  - The number of requests in the system
  - Not an end-user performance metric

Response Times
How fast requests are processed:
- Depends on context
  - 30 minutes may be excellent for a large batch job
- Depends on workload
  - Conditions should be defined
- Aggregate metrics are usually used
  - Average, percentiles, etc. (see the sketch below)
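As a small illustration of the aggregate metrics mentioned above, here is a sketch computing them over an invented sample of response times:

```python
# Illustrative only: average, median, and an approximate 95th
# percentile over a hypothetical sample of response times (seconds).
import statistics

times = [0.8, 1.1, 1.3, 1.7, 2.0, 2.4, 3.1, 4.9, 7.5, 12.0]

print(statistics.mean(times))                  # average
print(statistics.median(times))                # median
print(statistics.quantiles(times, n=100)[94])  # ~95th percentile
```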

Context
All performance metrics depend on context, such as:
- Volume of data
- Hardware resources provided
- Functionality included in the system
  - Functionality is added gradually in agile methodologies

Internal (Technological) Requirements
Important for IT; derived from business and usability requirements during design and development:
- Resources
- Scalability

Resources
CPU, I/O, memory, and network:
- Resource utilization
  - Related to a particular configuration
  - Often generic policies like "CPU below 70%"
- Relative values (in percent) are not useful if the configuration is not given
  - Commercial Off-the-Shelf (COTS) software
  - Virtual environments

Resources: Absolute Values
- Absolute values: number of instructions or I/O operations per transaction
  - Seen mainly in modeling
  - MIPS in the mainframe world
- Their importance increases again with the trends of virtualization, cloud computing, and SOA
  - VMware: CPU usage in MHz
  - Microsoft: megacycles
  - Amazon: EC2 Compute Units (ECU)

Scalability
- The ability of the system to meet performance requirements as demand increases
  - Increasing numbers of users, transaction volumes, data sizes, new workloads, etc.
- Performance requirements as a function, for example, of load or of data and configuration
  - There is no free / ideal scalability

Agenda
- Metrics
- Elicitation
- Analysis and Specification
- Validation and Verification

IEEE SWEBOK
The IEEE Software Engineering Body of Knowledge defines four stages for requirements:
- Elicitation: where requirements come from and how to collect them
- Analysis: classify / elaborate / negotiate
- Specification: production of a document
- Validation

Where do performance requirements come from?
- Business
- Usability
- Technology

Business Requirements
- Come from the business and may be captured before design starts
  - For example, the number of orders per hour
- The main trap is to immediately link them to a specific design and technology, thus limiting the number of available choices
  - For example, it may be one page per order or a sequence of two dozen screens
  - Each of the two dozen may be saved separately, or all of them at the end

Requirements Elicitation
- Final requirements should be quantitative and measurable
- Business people know what the system should do and may provide some information
  - But they are not performance experts
- Document real business requirements in whatever form they are available
  - Then elaborate them into quantitative and measurable requirements

Goals vs. Requirements
- Most response time "requirements" are actually goals
  - Missing them won't prevent deploying the system
- For response times, the difference between goals and requirements may be large
  - For many web applications, goals are two to five seconds while requirements are somewhere between eight seconds and one minute

Determining Specific Requirements
- It depends; approach the subject from different points of view
- As an illustration, here are 10 methods suggested by Peter Sevcik for finding T in Apdex
  - T is the threshold between satisfied and tolerating users; it should be strongly correlated with the response time goal

Methods to Find T (by Peter Sevcik)
- Default value (4 sec)
- Empirical data
- User behavior model (number of elements / task repetitiveness)
- Outside references
- Observing users
- Controlled performance experiment
- Best time multiple
- Calculation from the frustration threshold F (F = 4T in Apdex)
- Interviewing stakeholders
- Mathematical inflection point

Suggested Approach
- Peter Sevcik suggests using several of these methods: if they all arrive at approximately the same number, that number is T
- A similar approach can be used for performance requirements: use several methods to get the numbers; you have a goal/requirement if they are close (see the sketch below)
  - Investigate and sort it out if they differ significantly
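A toy sketch of that cross-checking idea, with invented estimates and an assumed closeness criterion:

```python
# Illustrative only: if independent estimates of T cluster tightly,
# take their median as T; if they spread widely, investigate first.
import statistics

estimates_sec = [3.5, 4.0, 4.2, 3.8]                 # hypothetical T estimates
if max(estimates_sec) - min(estimates_sec) <= 1.0:   # assumed criterion
    print("T ~", statistics.median(estimates_sec), "sec")
else:
    print("Estimates disagree significantly; investigate before choosing T")
```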

Usability Requirements
- Many researchers agree that:
  - Users lose focus if response times are more than 8 to 10 seconds
  - Making response times faster than one to two seconds doesn't help productivity much
- Sometimes linked closely to business requirements
  - Make sure that response times are not worse than a competitor's

Response Times: Review of Research
In 1968 Robert Miller defined three threshold levels of human attention:
- Instantaneous: 0.1-0.2 seconds
- Free interaction: 1-5 seconds
- Focus on dialog: 5-10 seconds

Instantaneous Response Time
- Users feel that they directly manipulate the User Interface (UI)
  - For example, between typing a symbol and its appearance on the screen
  - 0.1-0.2 seconds
- Often beyond the reach of application developers
  - Determined by system/UI libraries and the client side

Free Interaction
- Users notice the delay but "feel" that the computer is "working"
- Earlier researchers reported 1-2 seconds
  - For simple terminal interfaces
- For problem-solving tasks, there is no performance degradation up to 5 seconds
  - Depends on the number of elements and the repetitiveness of the task

Does It Change with Time?
- Forrester research from 2009 suggests a 2-second response time; similar research in 2006 suggested 4 seconds
- The approach is often questioned: researchers just ask users, and it is known that user perception of time may be misleading
- And what kind of page are we talking about?

Web Performance Insights
- Tammy Everts' book Time Is Money: The Business Value of Web Performance
- WPO Stats: https://wpostats.com/
- HTTP Archive: https://httparchive.org/

Business Web Metrics
- Bounce rate
- Cart size
- Conversions
- Revenue
- Time on site
- Page views
- User satisfaction
- User retention
- Organic search traffic
- Brand perception
- Productivity
- Bandwidth/CDN savings

Focus on Dialog
- Users are focused on the task: 5-10 seconds
- Half of users abandon web pages after 8.5 seconds
  - Peter Bickford, 1997: a 2-minute delay introduced after 27 quick interactions
  - A watch cursor kept users for 20 seconds, an animated cursor for 1 minute, and a progress bar until the end
- Users have to reorient themselves after a delay above the threshold

Agenda
- Metrics
- Elicitation
- Analysis and Specification
- Validation and Verification

Technological Requirements
- Come from the chosen design and the technology used
  - For example, calling ten web services sequentially to show a page within 3 seconds translates into requirements of 200-250 ms for each web service (see the sketch below)
  - Resource utilization requirements
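The budget arithmetic behind that example, as a hedged sketch (the 0.5-second overhead allowance is an assumption, not from the slide):

```python
# Illustrative budget split: ten sequential web service calls must fit
# into a 3-second page budget; the overhead allowance is assumed.

page_budget_sec = 3.0
service_calls = 10
overhead_sec = 0.5   # assumed allowance for rendering and network

per_call_ms = (page_budget_sec - overhead_sec) / service_calls * 1000
print(f"{per_call_ms:.0f} ms per web service call")  # 250 ms
```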

Analysis and Modeling
- Final requirements are elaborated from business requirements by applying usability and technological requirements
- Requirements traceability
  - Where each requirement came from
- Input for Software Performance Engineering
  - For example, defining service / stored procedure response times by their share of the end-to-end performance budget

Documenting Requirements
- Requirements in the architect's vocabulary: Quality Attributes
  - Part of the nonfunctional requirements
- Approaches:
  - Text
  - Quality Attribute Scenarios (SEI)
  - Planguage

Quality Attribute Scenarios
A QA scenario defines:
- Source [users]
- Stimulus [workload]
- Environment [system state]
- Artifact [part of the system]
- Response [expected actions]
- Response Measure [response times]

Planguage
- Tag: unique identifier
- Gist: brief description
- Scale: unit of measure
- Meter: how to measure
- Minimum / Plan / Stretch / Wish: levels to attain
- Past / Record / Trend
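A hypothetical Planguage entry for a response time requirement, just to show how these keywords fit together (the tag and all numbers are invented):

```
Tag:     OrderSearch.ResponseTime
Gist:    Response time of the order search page.
Scale:   Seconds from request submission to complete page render.
Meter:   95th percentile over one hour of production monitoring.
Minimum: 8 sec.  Plan: 5 sec.  Stretch: 3 sec.  Wish: 1 sec.
Past:    6.5 sec (previous release).
```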

What Metrics to Use?
- Average
- Maximum
- Percentiles (X% below Y seconds)
- Median
- Typical
- etc.

The Issue
- An SLA (Service Level Agreement) may state: "99.5% of all transactions should have a response time of less than five seconds"
- What happens with the remaining 0.5%? (see the sketch below)
  - Are they all at 6-7 seconds?
  - Or did they all fail or time out?
- Now add different types of transactions, different input data, different user locations, etc.
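A hedged sketch of checking such an SLA against measured data, and of looking at what the non-compliant tail actually did (the sample is invented):

```python
# Illustrative only: fraction of transactions within the 5-second SLA
# threshold, plus the non-compliant tail (hypothetical sample, seconds).

threshold_sec = 5.0
times = [1.2, 2.3, 3.1, 4.0, 4.7, 4.9, 6.2, 35.0]

within = sum(t <= threshold_sec for t in times) / len(times)
tail = [t for t in times if t > threshold_sec]

print(f"{within:.1%} within {threshold_sec} sec")  # compare to the 99.5% SLA
print("Tail:", tail)  # 6.2 sec is a near miss; 35 sec hints at a timeout or bug
```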

Observability
- Four different viewpoints:
  - Management
  - Engineering
  - QA / Testing
  - Operations
- The ideal would be different views of the same performance database
- The reality is a mess of disjoint tools

May Need a Few Metrics
- A combination of a percentile and an availability metric works in many cases
  - For example: 97% below 5 seconds, less than 1% failed or timed out
- An example of another metric: Apdex (Application Performance Index)
  - An objective user satisfaction metric
  - A number between 0 and 1: 0 means no users are satisfied, 1 means all users are satisfied (see the sketch below)
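A minimal sketch of the Apdex formula (per the published Apdex specification: "satisfied" means a response time at or below T, "tolerating" means between T and the frustration threshold F = 4T):

```python
# Apdex = (satisfied + tolerating / 2) / total, with F = 4T.

def apdex(times, t):
    f = 4 * t                                   # frustration threshold
    satisfied = sum(x <= t for x in times)
    tolerating = sum(t < x <= f for x in times)
    return (satisfied + tolerating / 2) / len(times)

print(apdex([0.5, 1.2, 3.0, 9.0], t=1.0))       # (1 + 2/2) / 4 = 0.5
```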

Agenda
- Metrics
- Elicitation
- Analysis and Specification
- Validation and Verification

Requirements Validation
- Making sure that the requirements themselves are valid
  - The term is quite often used to mean checking against test results, which is actually verification
- Checking against different sources
- Reviews, modeling, prototyping, etc.
- An iterative process
- Tracing
  - Tracing results back to the original requirement

Requirements Verification
- Checking whether the system performs according to the requirements
- Both the requirements and the results should use the same aggregates to be comparable
- Many tools measure only server time (or server and network time)
  - End-user time may differ significantly, especially for rich web clients or thick clients
  - This applies both in load testing and in production!

Verification Issue
Consider the following example: the response time requirement is 99% below 5 seconds.
- 99% at 3-5 seconds, 1% at 5-8 seconds
  - Looks like a minor performance issue
- 99% at 3-5 seconds, 1% failed or showed strangely high response times (more than 30 seconds)
  - Looks like a bug or a serious performance issue

Requirements Verification: Performance vs. Bug
Two completely different cases:
- Performance issue: a business decision, a cost vs. response time trade-off
- Bug exposed under load: it should be traced down first before making a decision

"The equipment is not operating as expected, and therefore there is a danger that it can operate with even wider deviation in this unexpected and not thoroughly understood way. The fact that this danger did not lead to a catastrophe before is no guarantee that it will not the next time, unless it is completely understood."
- Dr. Richard Feynman, Rogers Commission Report on the Challenger space shuttle accident

Summary
- Specify performance requirements at the beginning of any project
- What to specify depends on the system
  - It should be quantitative and measurable in the end
- Elaborate and verify requirements throughout:
  - Development
  - Testing
  - Production

Questions?
Alexander Podelko
apodelko@yahoo.com
@apodelko