
Bed Management Solution (BMS)
System Performance Report
October 2013
Prepared by Harris Corporation
CLIN 0003AD
Version 1.0

Revision History

Creation Date: 09/24/2013
Version No.: 1.0
Description/Comments: Initial baseline.
Author(s): L. Woods
Issue Date: 10/07/2013

This document contains information and/or data for use in support of the EVEAH Bed Management Solution (BMS). Content generated and derived for this document is intended, and applicable, for both applications and project efforts.

Table of Contents

1 General Information
  1.1 Purpose
  1.2 Scope
  1.3 Roles and Responsibilities
  1.4 Simulated Production System Overview
  1.5 Acronyms and Glossary
2 System Performance Measuring
  2.1 Benchmarks
  2.2 System Monitoring Tools
  2.3 Traffic Models
3 System Performance Reporting
  3.1 Performance Data Collecting
  3.2 Performance Data Analysis
  3.3 System Performance Report Form
    3.3.1 Availability
    3.3.2 Response time
    3.3.3 Simultaneous user handling
    3.3.4 VAMC sites supported by the BMS application
    3.3.5 Business Transaction Time Distribution Graph
    3.3.6 Business Transaction Defect Pareto Graph
4 Related Documentation

List of Tables

Table 1 - Roles and Responsibilities
Table 2 - Acronyms and Glossary

List of Figures

Figure 1 - System Overview
Figure 2 - Maximum of 152 User Sessions on BMS
Figure 3 - BMS Web Front Ends
Figure 4 - BMS Web Front Ends
Figure 5 - BMS Web Database
Figure 6 - BMS Web Client Web Services
Figure 7 - MDWS Server Web Services
Figure 8 - BMS ServiceHost Server Web Services
Figure 9 - BMS ServiceHost Web Services
Figure 10 - BMS ServiceHost Server Web Service
Figure 11 - Business Transaction Time Distribution Graph
Figure 12 - Business Transaction Time Measurements Table
Figure 13 - Business Transaction Time Measurements and Percentiles Table
Figure 14 - Business Transaction Time Percentiles Table (Cont.)
Figure 15 - Business Transaction Defect Pareto Graph
Figure 16 - Business Transaction Defect Measurements Table

1 General Information

Modernizing and enhancing the Bed Management Solution (BMS) system aligns the Department of Veterans Affairs (VA) with its Initiative 8, Enhance the Veteran Experience and Access to Healthcare (EVEAH), program by leveraging technology to enhance staff awareness of patient care status and manage patient flow. An advanced real-time system is needed to support the management of beds in VA Medical Centers (VAMCs). Improving bed management has been identified as a critical enabler of patient flow. Efficient bed management ensures maximum utilization of existing bed capacity, increases patient throughput by decreasing waiting times, and allows for a smooth transition of patients from the Emergency Department and Surgery to inpatient beds. BMS facilitates efficient patient flow operations and provides reports on the performance of bed management activities. This intelligence enables VA facilities and the Veterans Integrated Service Networks (VISNs) to track Key Performance Indicators and meet Deputy Under Secretary for Health (DUSH) guidelines.

1.1 Purpose

The purpose of the System Performance Report is to establish system performance capabilities for the BMS environment and to list the performance monitoring tools that can be used to gather those capabilities. These performance capabilities may consist of, but are not limited to, internal system response time per individual request, overall simultaneous request handling, connection quantity handling, expected system utilization maximums, disk storage constraints, VistA server response time and present bandwidth limitations, and hardware requirements at local workstations used to access the system, which may include Central Processing Unit (CPU) utilization, memory utilization, network throughput, and disk utilization.

This document, combined with the System Performance and Capacity Metrics document, develops collection and publishing procedures related to system capacity and health. Initial system performance measurements, or benchmarks, are gathered in the simulated production (test) environment and during the initial stages of the actual production environment. These benchmark readings are used to make adjustments to the system to improve system performance. After additional features or system changes are implemented within the BMS application, system performance measurements are collected and reported, as they are available and have been defined thus far, to develop new baseline measurements.

1.2 Scope

The scope of the report is to gather the system performance capabilities and to discover what hardware, network, and server demands are expected, in order to help define minimum system requirements. All data discovered is quantified and reported, and then used to form benchmark readings, data collection sets, and traffic models. The data collection sets and traffic models are formulated into a system performance report. The performance capabilities/benchmarks establish performance measurements of the system that are used to evaluate the system after a change is implemented to the system or the application, so that adjustments may be made to improve operation of the system and application.

1.3 Roles and Responsibilities

The following table shows the active roles, responsibilities, and assigned tasks on the BMS project.

Table 1 - Roles and Responsibilities

Analyst: Requirements analysis, high-level design and documentation. Technical analysis and documentation for other activities.
Configuration Manager: Responsible for controlling and managing all artifacts produced by the project teams.
Database Administrator: Responsible for database installation and changes resulting from development, including database upgrades, migrations, and data patches (scripts that correct data problems in a given environment).
Developers: Responsible for software development and unit testing of the technical solution and supporting technical documentation.
Functional Analyst: Analysis of clinical workflow, terminology, and algorithm verification and creation.
Project Manager: Executive oversight of the BMS program; advisor to the Task Order Manager and senior client relations activities.
Process and Product Quality Assurance: Conducts product and process quality assessment activities.
Program Manager: Executive oversight of the BMS program regarding contractual or financial concerns; advisor to the Task Order Manager and senior client relations activities.
Project Planner: Creates and maintains project plans in MS Project 2007.
Release Manager: Reviews all patch artifacts/interfaces and has the final approval in the Release Process.
Requirements Manager: Oversight for requirements gathering and processing.
Scrum Master: Leads the daily scrums and acts as servant leader to the scrum teams.
Software Quality Assurance: Conducts software quality assessment activities.
System Administrator: Responsible for the operating system administration of the server environments.
Technical Architect: Responsible for the technical solutions implemented in the patches and for ensuring that all patches take into account the other work being done on BMS and any other products that interface with BMS. Provides direction and continuity for the technical solution.
Technical Writer: Develops, reviews, edits, and updates the documentation needed by projects and tasks.
Test Engineer: Responsible for verifying that the documented functionality works as intended and that the results are documented. Testing includes functional system tests, 508 tests, performance tests, and more.

1.4 Simulated Production System Overview

The Simulated Production System Overview is based on the BMS System Design Document (SDD). The SDD is based on information supplied by VA regarding the architecture for the production environment within the Austin Information Technology Center (AITC). Prior to any rollout of upgrades to a VA production environment, a testing and acceptance process is completed on an established simulated production system. This simulated production system facilitates performance and capacity testing by serving as a functional model of the environment where upgrades are subsequently deployed.

The simulated production system shall include PC workstations typical of those used to access the BMS system; local servers and firewalls similar to those found at the facility; and web servers, application servers, and SQL servers operating under access demands simulating what is encountered when accessing the Veterans Health Information Systems and Technology Architecture (VistA).

The BMS system is split across several logical infrastructure levels, as shown in Figure 1. The diagram depicts the Pre-Production and Production environments. The Development, Software Quality Assurance (SQA), and Live Quality Assurance (QA) environments will be identical.

Figure 1 - System Overview

1.5 Acronyms and Glossary

Table 2 - Acronyms and Glossary

AITC - Austin Information Technology Center
APM - Application Performance Management/Monitoring
BMS - Bed Management Solution
CPU - Central Processing Unit
DUSH - Deputy Under Secretary for Health
EVEAH - Enhance the Veteran Experience and Access to Healthcare
IT - Information Technology
QA - Quality Assurance
SDD - System Design Document
SQA - Software Quality Assurance
VA - Department of Veterans Affairs
VAMC - Veterans Affairs Medical Center
VISN - Veterans Integrated Service Network
VistA - Veterans Health Information Systems and Technology Architecture

2 System Performance Measuring

2.1 Benchmarks

Benchmarks, or performance markers, can consist of, but are not limited to, transactions per second, system CPU/disk/memory usage, and network throughput. The benchmarks, or performance tests, of the BMS system are developed during the testing/pre-production phase of the BMS project. Benchmark readings are redeveloped post-production and following the introduction of additional system or program features or changes, so that the performance markers remain correct for the current system.

2.2 System Monitoring Tools

System monitoring tools for gathering the information in this document are installed, administered, and maintained at, and by, the AITC.

Data Collection Sets

Data collection sets organize multiple data collection points into a single component that is used to review or log system performance. These data collection sets can be configured to generate alerts when a predefined threshold is reached, such as for memory utilization or network throughput. System data collection consists of items such as:

- Disk Usage - tracks the growth of database and log files and provides file-related statistics.
- Server Activity - provides an overview of server activity, resource utilization, and contention.
- Query Statistics - gathers data about query statistics and individual query text and plans.

There is a dependency on the architecture that is put into place at, and by, the AITC, which further determines how these and other data collection sets may be collected.
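As a purely illustrative sketch (the production data collection sets are installed and administered by the AITC), the following Python fragment shows the general shape of a data collection set that samples a few counters and raises an alert when a predefined threshold is crossed. The psutil library, the counters chosen, and the threshold values are assumptions, not the AITC configuration.

```python
# Minimal sketch of a "data collection set" with threshold alerts.
# Assumptions: psutil is available; the counters and thresholds below are
# illustrative, not the values configured by the AITC.
import time
import psutil

THRESHOLDS = {                 # alert when a sampled value exceeds its threshold
    "cpu_percent": 85.0,
    "memory_percent": 90.0,
    "disk_percent": 80.0,
}

def sample():
    """Collect one data point for each monitored counter."""
    return {
        "cpu_percent": psutil.cpu_percent(interval=1),
        "memory_percent": psutil.virtual_memory().percent,
        "disk_percent": psutil.disk_usage("/").percent,
    }

def collect(interval_seconds=60, cycles=5):
    """Log samples and emit an alert whenever a threshold is crossed."""
    for _ in range(cycles):
        point = sample()
        print(time.strftime("%Y-%m-%d %H:%M:%S"), point)
        for counter, value in point.items():
            if value >= THRESHOLDS[counter]:
                print(f"ALERT: {counter} = {value:.1f}% exceeds {THRESHOLDS[counter]:.1f}%")
        time.sleep(interval_seconds)

if __name__ == "__main__":
    collect(interval_seconds=5, cycles=3)
```

Database-oriented collections such as Disk Usage or Query Statistics would draw on the database server's own counters rather than host metrics, but the collect-compare-alert shape is the same.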

2.3 Traffic Models

Traffic models can be implemented to capture the capabilities of the network load and to produce predictions of system performance given certain factors. There is a dependency on the architecture that is put into place at, and by, the AITC, which determines the limits of the network load on the system. The installation of the equipment that supports this architecture is administered and maintained by VA.

3 System Performance Reporting

3.1 Performance Data Collecting

There is a dependency on the architecture that is put into place at the AITC, which further determines how the data collection process occurs.

3.2 Performance Data Analysis

There is a dependency on the architecture that is put into place at the AITC, which further determines how the data analysis process occurs. Data has been collected and analyzed. It should meet the following goal: real performance data on BMS Class I from the field. It is understood that the priority data is to support the contractual operational requirements, specifically those listed below:

- 90% availability
- 2 second response time (as measured within the system itself)
- 7,700+ simultaneous user handling (during AM peak times), with equivalent simultaneous read/write/etc. transaction support
- 153 VAMC sites supported by the BMS application
- Capable of handling 616,000+ transactions per day

A sketch of an automated check against these targets appears below, following the Section 3.3 introduction.

3.3 System Performance Report Form

This section includes formal performance report forms and/or documentation with related test results. There are currently 31 VAMCs cut over to BMS Class I.
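For illustration only, the sketch below shows how summary measurements gathered from the monitoring tools might be compared against the contractual targets listed in Section 3.2. The metric names and the sample values are hypothetical; real values come from the APM and synthetic-test dashboards shown in the figures that follow.

```python
# Hedged sketch: compare collected summary metrics against the contractual
# targets listed in Section 3.2. Field names and sample values are hypothetical.
TARGETS = {
    "availability_pct":        ("min", 90.0),    # 90% availability
    "response_time_seconds":   ("max", 2.0),     # 2 second response time
    "peak_simultaneous_users": ("min", 7700),    # 7,700+ simultaneous users
    "vamc_sites_supported":    ("min", 153),     # 153 VAMC sites
    "transactions_per_day":    ("min", 616000),  # 616,000+ transactions per day
}

def check_targets(measured):
    """Return a list of PASS/FAIL lines, one per contractual target."""
    results = []
    for name, (kind, limit) in TARGETS.items():
        value = measured[name]
        ok = value >= limit if kind == "min" else value <= limit
        results.append(f"{'PASS' if ok else 'FAIL'}: {name} = {value} (target {kind} {limit})")
    return results

if __name__ == "__main__":
    # Sample values for illustration only; real values come from the monitoring tooling.
    sample = {
        "availability_pct": 99.2,
        "response_time_seconds": 1.4,
        "peak_simultaneous_users": 152,   # e.g., Figure 2 shows a peak of 152 sessions
        "vamc_sites_supported": 31,       # 31 VAMCs currently cut over
        "transactions_per_day": 620000,
    }
    print("\n".join(check_targets(sample)))
```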

Figure 2 - Maximum of 152 User Sessions on BMS

Below are graphs illustrating BMS Class I system performance.

NOTE: The system is capable of handling 616,000+ transactions per day.

For the purposes of this document, a defect with regard to the end-user experience monitoring tool means any of the following:

- The transaction was slow, i.e., it broke the threshold that has been set.
- The transaction resulted in a client request error.
- The transaction resulted in a server response error.

3.3.1 Availability

The availability dashboard shows synthetic testing results for the BMS system, including calls to the front-end web application along with tests of the availability of the back-end services (WSDLs). Figure 3 presents the three core graphs indicating the availability of BMS from 8/25 to 9/24.

Figure 3 - BMS Web Front Ends

- Availability - Reports either a 1 or a 0: 1 = a successful call was made; 0 = the synthetic test failed.
- Response Code - Represents the HTTP response code returned from the synthetic test. A 200 is expected (HTTP OK); anything >= 500 is a server error returned from the synthetic test.
- Response Time - Application response time, shown in milliseconds.
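A minimal synthetic probe along these lines might look like the following sketch. The endpoint URL is hypothetical and the 2-second slow-transaction threshold is an assumption taken from the Section 3.2 response-time target; the production synthetic tests are executed by the AITC-managed monitoring tooling, not by this script.

```python
# Minimal sketch of a synthetic availability probe for a BMS web front end.
# The URL and the 2-second "slow" threshold are illustrative assumptions.
import time
import requests

BMS_FRONT_END_URL = "https://bms.example.va.gov/"   # hypothetical endpoint
SLOW_THRESHOLD_MS = 2000                             # 2 second response time target

def probe(url):
    """Run one synthetic test and classify the result per Section 3.3."""
    start = time.monotonic()
    try:
        response = requests.get(url, timeout=30)
        elapsed_ms = (time.monotonic() - start) * 1000.0
        availability = 1                              # successful call was made
        code = response.status_code
    except requests.RequestException:
        elapsed_ms = (time.monotonic() - start) * 1000.0
        availability = 0                              # synthetic test failed
        code = None
    # A "defect" is a slow transaction, a client request error, or a server response error.
    defect = (
        availability == 0
        or elapsed_ms > SLOW_THRESHOLD_MS
        or (code is not None and code >= 400)
    )
    return {
        "availability": availability,
        "response_code": code,
        "response_time_ms": round(elapsed_ms, 1),
        "defect": defect,
    }

if __name__ == "__main__":
    print(probe(BMS_FRONT_END_URL))
```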

3.3.2 Response time

The dashboards in the following figures show the five core metrics returned for application components from Application Performance Management/Monitoring (APM): Average Response Time, Responses Per Interval, Concurrent Invocations, Errors Per Interval, and Stall Count.

- Average Response Time - Response time averages for monitored components, shown in milliseconds.
- Responses Per Interval - Application response load for an application component; shows the number of times components completed within an interval.
- Concurrent Invocations - Shows the number of concurrent components that are in flight during the interval.
- Errors Per Interval - Shows response time errors per interval, including application exceptions and components that take longer than 30 seconds to respond (stalls).
- Stall Count - The number of components, during a reporting interval, that take longer than 30 seconds to respond.

Figure 4 presents the five core graphs for the BMS Web Front Ends and is an indicator of Response Time performance. There are currently 31 VAMCs cut over.

Figure 4 - BMS Web Front Ends
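To make the five definitions above concrete, the following hypothetical aggregation mirrors them over a set of recorded component invocations, including the 30-second stall threshold. It is a sketch, not the APM product's implementation; the invocation record format is assumed.

```python
# Hedged sketch: compute the five APM core metrics from invocation records.
# The Invocation record format is a hypothetical input; real values come from
# the APM agents monitoring the BMS application components.
from dataclasses import dataclass

STALL_THRESHOLD_MS = 30_000  # components slower than 30 seconds count as stalls

@dataclass
class Invocation:
    start_ms: float      # start time, in ms since the start of the reporting interval
    duration_ms: float   # time the component took to respond
    error: bool = False  # an application exception was raised

def in_flight_at(invocations, t_ms):
    """Concurrent Invocations: components in flight at a given instant."""
    return sum(1 for i in invocations
               if i.start_ms <= t_ms < i.start_ms + i.duration_ms)

def core_metrics(invocations, interval_ms, sample_at_ms):
    """Average Response Time, Responses Per Interval, Concurrent Invocations,
    Errors Per Interval, and Stall Count for one reporting interval."""
    completed = [i for i in invocations if i.start_ms + i.duration_ms <= interval_ms]
    stalls = [i for i in invocations if i.duration_ms > STALL_THRESHOLD_MS]
    errors = [i for i in invocations if i.error or i.duration_ms > STALL_THRESHOLD_MS]
    avg = (sum(i.duration_ms for i in completed) / len(completed)) if completed else 0.0
    return {
        "average_response_time_ms": round(avg, 1),
        "responses_per_interval": len(completed),
        "concurrent_invocations": in_flight_at(invocations, sample_at_ms),
        "errors_per_interval": len(errors),
        "stall_count": len(stalls),
    }

if __name__ == "__main__":
    demo = [Invocation(0, 850), Invocation(200, 1200, error=True), Invocation(500, 31_000)]
    print(core_metrics(demo, interval_ms=15_000, sample_at_ms=600))
```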

Figure 5 presents the five core graphs for the BMS Web Database and is an indicator of Response Time performance.

Figure 5 - BMS Web Database

Figure 6 presents the five core graphs for the BMS Web Client Web Services and is an indicator of Response Time performance.

Figure 6 - BMS Web Client Web Services

3.3.3 Simultaneous user handling

Figure 7 presents the five core graphs for the MDWS Front Ends and is an indicator of simultaneous user handling performance.

Figure 7 - MDWS Server Web Services

Figure 8 presents the five core graphs for the BMS ServiceHost Front Ends and is an indicator of simultaneous user handling performance.

Figure 8 - BMS ServiceHost Server Web Services
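As an illustrative sketch only (not how the dashboards are produced), a peak simultaneous-session figure such as the 152-session maximum shown in Figure 2 could be derived from session start/end timestamps by counting active sessions per time bucket and taking the maximum. The session records below are hypothetical inputs.

```python
# Hedged sketch: derive peak simultaneous user sessions from session records.
# Sessions are hypothetical (start, end) pairs in epoch seconds; the production
# figure comes from the APM/monitoring tooling at the AITC.
from collections import Counter

def peak_concurrent_sessions(sessions, bucket_seconds=60):
    """Count active sessions per time bucket and return the maximum."""
    active = Counter()
    for start, end in sessions:
        first = int(start // bucket_seconds)
        last = int(end // bucket_seconds)
        for bucket in range(first, last + 1):
            active[bucket] += 1          # session is active during this bucket
    return max(active.values()) if active else 0

if __name__ == "__main__":
    # Three sessions, two of which overlap -> peak of 2 concurrent sessions.
    demo = [(0, 300), (120, 600), (700, 900)]
    print(peak_concurrent_sessions(demo))
```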

3.3.4 VAMC sites supported by the BMS application

Figure 9 presents the five core graphs for the BMS ServiceHost Web Services and is an indicator of the VAMC sites supported by the BMS application.

Figure 9 - BMS ServiceHost Web Services

Figure 10 presents the five core graphs for the WinServiceHost Front Ends and is an indicator of the VAMC sites supported by the BMS application.

Figure 10 - BMS ServiceHost Server Web Service

3.3.5 Business Transaction Time Distribution Graph

The information presented in the following figures details the BMS transaction time distribution during the month of September 2013. The transactions are categorized by Business Service type. The graph in Figure 11 displays the Median, Average, Specification, and Range for each Business Service.

Figure 11 - Business Transaction Time Distribution Graph
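For illustration, the per-Business-Service measures reported in Figures 11 through 14 (median, average, range, minimum, maximum, data points, and percentiles) could be computed from raw transaction times roughly as follows. The Business Service names and sample values below are hypothetical.

```python
# Hedged sketch: summarize business transaction times per Business Service,
# mirroring the measures reported in Figures 11-14. Sample data is hypothetical.
from statistics import mean, median
import math

def percentile(sorted_values, pct):
    """Nearest-rank percentile of an already-sorted list."""
    rank = max(1, math.ceil(pct / 100 * len(sorted_values)))
    return sorted_values[rank - 1]

def summarize(transactions):
    """transactions: mapping of Business Service name -> list of times in seconds."""
    report = {}
    for service, times in transactions.items():
        ordered = sorted(times)
        report[service] = {
            "data_points": len(ordered),
            "minimum": ordered[0],
            "maximum": ordered[-1],
            "range": round(ordered[-1] - ordered[0], 3),
            "average": round(mean(ordered), 3),
            "median": median(ordered),
            "p90": percentile(ordered, 90),
            "p95": percentile(ordered, 95),
        }
    return report

if __name__ == "__main__":
    sample = {"Bed Status Update": [0.8, 1.1, 1.4, 0.9, 2.3],
              "Patient Lookup":    [0.5, 0.7, 0.6, 1.9]}
    for service, stats in summarize(sample).items():
        print(service, stats)
```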

Figure 12 includes information on Total Transactions, Maximum, Minimum, and Data Points for each Business Service.

Figure 12 - Business Transaction Time Measurements Table

Figure 13 includes information on Total Transactions, Maximum, Minimum, Data Points, Percentiles, Data Span, and Averages for each Business Service.

Figure 13 - Business Transaction Time Measurements and Percentiles Table

Figure 14 includes Percentiles, Data Span, and Averages information for each Business Service. (Data unavailable for this period.)

Figure 14 - Business Transaction Time Percentiles Table (Cont.)

3.3.6 Business Transaction Defect Pareto Graph

The information presented in the following figures details the defects experienced by the BMS program during the month of July 2013. Defects are categorized by Business Transaction type. The graph displays the distribution of detected defects. (Data unavailable for this period.)

Figure 15 - Business Transaction Defect Pareto Graph

(Data unavailable for this period.)

Figure 16 - Business Transaction Defect Measurements Table

4 Related Documentation

Related or relevant documentation, as applicable during the execution of the project:

- CLIN 0002AV; System Performance and Capacity Metrics Report; Init8_BMS_PCMetrics
- CLIN 0002AH; Hardware Specifications; Init8_BMS_HWSpec
- CLIN 0002AN; System Test Plan; Init8_BMS_MTestPlan
- CLIN 0002AP; Test Defect Report; Init8_BMS_DefectLog
- Baseline data from the current implementation of BMS at the AITC, as provided by VA