Introduction to the Transaction Processing Performance Council (TPC)


Introduction to the Transaction Processing Performance Council (TPC) Robert M. Lane Performance, Availability & Architecture Engineering Sun Microsystems, Inc.

Agenda Purpose of the TPC Membership Organization Chart History Benchmarks Audits and Full Disclosure Reports Challenges Benchmark Development Process Questions and Answers

Purpose of the TPC: The TPC is a non-profit corporation founded to define vendor-neutral transaction processing benchmarks and to disseminate objective, verifiable performance data to the industry.

Path to Accomplish the TPC's Purpose: Refinement of existing benchmarks; development of new benchmarks; publication of benchmark results; promotion of the TPC model and results; resolution of disputes and challenges.

Membership

Full Members Influence Direction. Meetings: face-to-face five or six times annually, plus committee and subcommittee conference calls. Voting: in the General Council and in subcommittees (Budget; Benchmarks, covering both revisions and new benchmarks); election to standing committees.

Organization Chart

History

Benchmarks: 3 active benchmarks: TPC-C (online transaction processing, OLTP), TPC-H (decision support for ad hoc queries), and TPC-App (application server benchmark). 2 benchmarks in development: TPC-E (OLTP) and TPC-DS (decision support). The benchmarks are specification-based rather than kernel-based, and are costly and time-consuming (months).

Audits and Full Disclosure Reports: All benchmarks are audited by TPC-certified auditors for compliance and correctness. Disclosures: an Executive Summary (3-5 pages) and a Full Disclosure Report (100s of pages). Two types of publication: leading primary metric (throughput) and leading price/performance.

Typical TPC-C Configuration (hardware and software): the emulated user load is generated by a driver system running a Remote Terminal Emulator (RTE); response time is measured there. A terminal LAN connects the RTE to client systems providing presentation services and running the TPC-C application plus a transaction monitor and/or database RPC library (e.g., COM+, Tuxedo, ODBC). A client/server LAN connects the clients to the database server, which handles the database functions: the TPC-C application (stored procedures) plus the database engine (e.g., SQL Server, Oracle, DB2).

TPC-C Workflow: (1) Select a txn from the menu: New-Order 45%, Payment 43%, Order-Status 4%, Delivery 4%, Stock-Level 4% (menu response time is measured). (2) Input screen (keying time). (3) Output screen (txn response time is measured, followed by think time); then go back to (1). Cycle time decomposition (typical values, in seconds, for the weighted-average txn): Menu = 0.11, Keying = 5.5, Txn RT = 0.43, Think = 7.5; avg. cycle time = 13.5.
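The cycle-time decomposition above is simple arithmetic; a minimal sketch using the slide's typical values (variable names are illustrative, not from the specification):

```python
# Weighted-average TPC-C cycle time from the slide's typical
# per-component values (all in seconds).
mix = {"New-Order": 0.45, "Payment": 0.43, "Order-Status": 0.04,
       "Delivery": 0.04, "Stock-Level": 0.04}
components = {"menu": 0.11, "keying": 5.5, "txn_rt": 0.43, "think": 7.5}

# The transaction mix must cover 100% of submitted transactions.
assert abs(sum(mix.values()) - 1.0) < 1e-9

# One full user cycle is the sum of the four components.
cycle_time = sum(components.values())
print(round(cycle_time, 2))  # 13.54, i.e. the slide's ~13.5 s
```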

TPC-C Rules of Thumb: 1.2 tpmC per user/terminal (maximum); 10 terminals per warehouse (fixed); 65-70 MB/tpmC priced disk capacity (minimum); ~0.5 physical IOs/sec/tpmC (typical); 250-700 KB of main memory per tpmC (how much money do you have? It is a price vs. performance trade-off, and power will factor in here).
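These rules of thumb compose into a back-of-envelope sizing exercise; a hedged sketch (the constants are the slide's heuristics, not specification requirements; the function name is mine):

```python
# Back-of-envelope TPC-C sizing from the rules of thumb above.
def size_tpcc(tpmC):
    users = tpmC / 1.2                    # at most 1.2 tpmC per user/terminal
    warehouses = users / 10               # 10 terminals per warehouse (fixed)
    disk_mb = (65 * tpmC, 70 * tpmC)      # priced disk capacity range, MB
    ios_per_sec = 0.5 * tpmC              # typical physical IOs per second
    memory_kb = (250 * tpmC, 700 * tpmC)  # main memory range, KB
    return users, warehouses, disk_mb, ios_per_sec, memory_kb

# e.g., a 120,000 tpmC target implies ~100,000 users, 10,000 warehouses
users, warehouses, disk_mb, ios, memory_kb = size_tpcc(120_000)
```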

FDR: Example TPC-C Equipment

TPC-H Execution Rules: high-level overview of the timed and un-timed portions of the benchmark. The sequence is: Preparation, Load, Power 1, Throughput 1, Power 2, Throughput 2. The Load is timed and produces the Load metric. The slower of the first and second Power runs produces the Power metric, and the slower of the two Throughput runs produces the Throughput metric.
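A sketch of how the run structure maps to reported metrics. The geometric-mean composite (QphH) comes from the TPC-H specification rather than this slide, and the function name is mine:

```python
import math

# Reported Power and Throughput each come from the slower
# (lower-scoring) of the two measured runs, per the execution rules.
def reported_metrics(power_runs, throughput_runs):
    power = min(power_runs)
    throughput = min(throughput_runs)
    # The specification combines the two into the composite
    # QphH@Size metric as a geometric mean (not shown on the slide).
    qphh = math.sqrt(power * throughput)
    return power, throughput, qphh
```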

Power and Throughput Test Details

FDR: Example TPC-H Equipment. Processors: 72 UltraSPARC IV+ 1500 MHz processors on 72 chips, 144 cores, 144 threads. Memory: 288 GB. Disks: 108 StorEdge 3510 FC arrays (12 x 73.4 GB), 1 SE6120 (14 x 73.4 GB), 4 S1 (3 x 73.4 GB). Total storage: 97,034.8 GB (in this calculation one GB is defined as 1024*1024*1024 bytes).
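The total-storage figure is reproducible from the listed disk counts; a quick check (the tuple layout is mine):

```python
# Reproduce the FDR's total-storage figure from the listed equipment:
# (number of units, disks per unit, GB per disk).
equipment = [
    (108, 12, 73.4),  # StorEdge 3510 FC arrays
    (1,   14, 73.4),  # SE6120
    (4,    3, 73.4),  # S1
]
total_gb = sum(units * disks * gb for units, disks, gb in equipment)
print(round(total_gb, 1))  # 97034.8, matching the slide
```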

FDR: Lesson. Large, complex configurations tend to be benchmarked, and benchmark equipment tends to be dominated by the disk subsystems. Memory and disk subsystems play important roles in TPC benchmarks; they also consume significant power. Discounts in the FDRs reflect pricing that any customer should be able to get. The purpose of TPC benchmarks is to provide relevant, objective information to industry users. To achieve that purpose, publication of a TPC benchmark requires pricing that: (1) is no lower than what would be quoted to any customer from the date of publication of the FDR; (2) is actively used by the vendor in the market segment that the pricing models or represents (e.g., small business customers, or large corporations, depending on the type of system being priced); and (3) is what a significant number of customers in the market segment that the pricing models or represents would plausibly receive in a purchase agreement. Failure to obtain pricing shown in the FDR should be reported to pricing@tpc.org.

Challenges: Compliance. Per 3.3.5.1, if the TAB finds that a Result failed to satisfy one or more specification requirements, the TAB will recommend to the Council that either (1) the Result has an insignificant deviation from the specification or (2) the Result is non-compliant. Per 3.3.5.2, non-compliance is recommended to the Council if and only if the TAB finds that at least one of the following conditions applies: failure to satisfy one or more requirements of the specification that results in incorrect operation of the functions in the business environment the benchmark represents (e.g., transparency, ACID), regardless of the impact on the primary metrics; failure to meet any of the following: Availability, Orderability, Clause 0.2, and requirements applied to any Numerical Quantities listed in the Executive Summary; the aggregate effect of one or more violations results in more than a 2% difference in price/performance or performance metrics; an excessive number of clauses is violated even though the aggregate difference in price/performance or performance primary metrics is less than or equal to 2%; or a violation against the same clause language has been voted twice before for the same Test Sponsor within the two-year period prior to the Result's submission date. Compliance challenges are heard by the TAB and then affirmed or rejected by the General Council. Interpretation of some rules may seem counter-intuitive or be inconsistently applied. Slippery-slope phenomenon: can my company gain competitive advantage, or is the behavior so egregious the door should be shut? Is the genie already out of the bottle?
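The 2% threshold in these conditions is a simple aggregate-impact test; an illustrative sketch (function name and example values are mine):

```python
# Sketch of clause 3.3.5.2's aggregate test: one or more violations
# trigger non-compliance when their combined effect shifts
# price/performance or a performance metric by more than 2%.
def exceeds_threshold(reported, corrected, threshold=0.02):
    return abs(corrected - reported) / reported > threshold

print(exceeds_threshold(100_000, 101_500))  # 1.5% shift -> False
print(exceeds_threshold(100_000, 103_000))  # 3.0% shift -> True
```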

Challenges: Fair Use. When Results are used in publicity, the use is expected to adhere to basic standards of fidelity, candor, and due diligence, the qualities that together add up to, and define, fair use of Results. Fidelity: adherence to facts; accuracy. Candor: above-boardness; needful completeness. Due diligence: care for the integrity of results. Legibility: readability and clarity. Generally required metrics (primary metrics): throughput, price/performance, and availability date. A set of esoteric rules, some with novel interpretations, governs fair use. Fair-use challenges are heard by the Steering Committee and then affirmed or rejected by the General Council.

Benchmark Development Process: A lengthy, political process. TPC-DS has been in development for 6 years; TPC-E for 3 years. It is a balancing act: software vs. hardware, manufacturers vs. consumers, vendor vs. vendor, etc. The Beta Benchmark proposal is an effort to shortcut the benchmark development process. Proposals with the greatest chance of success are well-formed, based on the template of an existing benchmark, and have industry momentum. Membership in the TPC gives voting rights and the ability to influence direction.

Who Can Benchmark? Anyone who adheres to the requirements of the specification. Anyone who hires a certified TPC auditor. Anyone who publishes an Executive Summary and Full Disclosure Report to the TPC web site (http://www.tpc.org). Anyone who survives the review period and challenge process. Generally, the system manufacturer and database software vendor sponsor the benchmark. They are usually, but not always, members of the TPC.

http://www.tpc.org Benchmark Results Benchmark Specifications Benchmark Status Reports Bylaws and Policies Historical Documents and Articles Membership Application If you want the TPC to incorporate a power metric into its benchmarks, you should become active in the TPC.

Issues to Consider: Primary vs. secondary metrics. The role of memory in performance (non-linear) and power consumption. The role of disk subsystems in performance (queueing) and power consumption. A power metric is required that works on uniprocessor, behemoth SMP, and large clustered systems, so that fair, accurate comparisons can be made. A metric should push technology forward and not freeze it at a point in time; it should encourage rather than inhibit innovation. A metric should be robust and not require tinkering. Vendors seek to exploit loopholes in specifications to their advantage, as demonstrated by the TPC challenge process, and the practice is for a group to slip to the lowest accepted practice. How might vendors offload benchmark work to avoid a metric penalty? What unnatural or undesirable benchmark configurations might result from a given metric?

Questions and Answers