ACC: Review of the Weekly Compensation Duration Model
Prepared By: Knoware
Creation Date: July 2013
Version: 1.0
Author: M. Kelly Mara, Knoware
Document Control

Change Record
  Date       Author         Version  Change Reference
  July 2013  Dr Kelly Mara  V0.1     Initial draft
  Aug 2013   Dr Kelly Mara  V1       Final following feedback and review

Reviewers
  Name          Position
  Robert Noe    Analytics Practice Lead
  Colin Harris  Technical Director

Distribution
  Copy No  Name            Location
  1        Library Master  Knoware
Table of Contents

1. Introduction
   - Review of the Weekly Compensation Duration Model
   - Scope of the Review
   - What is not Covered
2. Findings
   - Methodology: General Approach
   - Methodology: Documentation and Specific Sections in the Document
   - Validation of the Model: the Levels of Accuracy (Extent to which the Model Predicts WC Outcomes)
   - Conclusions
3. Recommendations
1. Introduction

Review of the Weekly Compensation Duration Model

This document covers a review of "Weekly Compensation Duration Model - Modelling Approach", which describes the modelling approach taken to predicting the duration of weekly compensation (WC) claims at ACC.

Scope of the Review
- Statistical appropriateness of the modelling approach
- The details of the presented model, including the method of selection of predictors
- Documentation of the modelling approach
- Validation of the model: the levels of accuracy, i.e. the extent to which the model predicts WC outcomes
- Recommendations

What is not Covered
- The data detail: neither the data used to generate the model nor the data used for validation of predictions, except in relation to general documentation
- Any programming code used to construct or validate the model
- Operational implementation of the model

2. Findings

Methodology: General Approach

The purpose of the modelling is to predict the lifetime duration of Weekly Compensation claims registered with ACC. The modelling approach adopted is appropriate for several reasons:
- Survival analysis is a well-developed statistical field that has been applied to lifetime data in a wide variety of medical and engineering settings.
- Regression analysis may initially seem suitable, being intuitively straightforward, but selection of a parametric model for WC lifetimes (for example, using Exponential or Weibull distributions) requires considerable assumptions that would need checking and, given the large amount of censoring in the WC durations in this data, would be likely to lead to biased predictions.
- The Cox Proportional Hazards model is the most commonly used regression model for survival data. It assumes only that different individuals (WC events) have hazard functions that are proportional to one another, dependent upon input factors (diagnosis, age, etc. for the WC situation).
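The proportionality assumption just described has a standard mathematical form: each claim's hazard is a common baseline hazard h_0(t) scaled by a factor depending on that claim's covariates,

```latex
h_i(t) = h_0(t)\,\exp\left(\beta_1 x_{i1} + \beta_2 x_{i2} + \cdots + \beta_p x_{ip}\right)
```

so that for any two claims the hazard ratio h_i(t)/h_j(t) is constant over time, whatever the (unspecified) shape of the baseline hazard.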
Given the large number of input variables (and, indeed, the very large amount of data that ACC has), the Proportional Hazards approach is the most understandable in any operational sense.

Methodology: Documentation and Specific Sections in the Document

Section 3.1, Intro (p2): Agreed; standard regression techniques are likely to lead to biased results given the level of censoring for more recent claims. This is most relevant for 2013 claims. (Table 1 gives claims still open; it is not surprising that 2013 is high.)

Section 3.1, Items 1 and 2: This is an operational argument, not a methodological one.

Section 3.1, Item 3: The assumptions are stated to be broadly satisfied, but the documentation shows only a broad graphical compatibility. The proportional hazards assumption can be formally tested; this should be included in an appendix, not in the main body of the report.

Section 3.1, Item 4: Yes, the survival curve is, of itself, not of primary interest.

Section 3.1, Item 5: Using time-varying covariates in the model is possible in the longer term, but requires testing. The idea has intuitive appeal, but it is not totally clear that the incapacity duration will have an effect: how often does the incapacity duration change during the course of a claim?

In Table 1 (p3), the historical still-open claims are tabulated. Table 1 is relevant only for 2012 and 2013, since the model focusses on short-term claims (less than 182 days).

Section 5.1 (p6): The use of expert experience is a very good starting point; the model should start with the consensus primary factors. Which factors proved useful and which did not? A few are listed, and further detail on some aspects of the nature of the variables is given in the Appendix. The overall summary of results, including a description of the whole model with the listed factors, should be given in a table (in the Appendix).

There is a statement that clients select a provider on the basis of seriousness of injury.
This seems plausible, but what is the evidence for it? In the last paragraph, "there" should be "their".

Section 6 immediately begins at 6.2; is a Section 6.1 missing?

Section 6.2, General Comments: The stated aim is accuracy of prediction rather than a model that is easy to interpret. This is fine, but for the purposes of future justification the model should be amenable to some sort of conceptual interpretation. Having the influence of each variable explained to case management experts is an excellent start; some interpretation should be possible for case managers so that they can have confidence in the output from the model.

Lowering the 5% significance threshold to 0.01%: completely agreed. With so much data, even 0.01% may not be stringent enough; with this many data points, almost any predictor will be determined as significant at almost any level.

The Forward Selection regression method is a very good start for deciding whether or not to add variables, but the capability of the overall model, or the overall value of adding a new variable to the model, should be tested (likelihood ratio testing? SAS may generate such comparisons progressively).

Conversion of categorical variables to numeric: this is sensible; a couple of examples would be helpful.

Section 6.3: The near-completeness of the data for all variables is fortunate for the purposes of model fitting.

Section 6.4, Item 1: What was the base data, and were there any anomalies?

Section 6.4, Item 2: Forward selection for the addition of variables (or transformed variables) is very appropriate.

Section 6.4, Item 3: Use of Proportional Hazards for other variables is appropriate.

Section 6.4, Item 4b: Smaller providers are excluded from the generation of the model. That is reasonable, but how well did the model predictions perform for these smaller providers? Were they used for the validation exercise?

Section 6.4, Item 4c: As above: did the model perform well for the small occupation categories?

The use of data binning is entirely appropriate given the very large number of diagnoses, occupations and providers. It means that any interpretation of the prediction results is limited to the level of aggregation, but this may be all that is required. However, the maximum-accuracy requirement may be compromised at the fine level. This also applies where the input variables are continuous (as per Section 6.6).

Section 6.6: The steps taken here ensure that the parameter estimates are reasonably robust. Figure 1 gives evidence for prediction accuracy (how is R² computed?).

Section 6.7, Determination of Optimum Complexity of Model: Technically, full assessment of the model would require assessing its performance for all survival times t. This relies upon evaluating the Receiver Operating Characteristic (ROC), which in turn requires a dichotomous output. Comparison at the ACC critical points (t = 28, 49, 70 and 183 days) is very sensible. Use of an 80/20% split of training vs validation data is appropriate.
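A simple random 80/20 split along these lines might look as follows. This is a minimal sketch only: the claim identifiers are hypothetical, and the source does not state how ACC's split was actually produced.

```python
import random

def train_validation_split(claims, validation_fraction=0.2, seed=42):
    """Randomly partition claims into training and validation sets.

    A purely random, seeded split guards against structural differences
    between the two sets (e.g. by region or by claim date).
    """
    shuffled = claims[:]                   # copy; leave the input untouched
    random.Random(seed).shuffle(shuffled)  # reproducible shuffle
    cut = int(len(shuffled) * (1 - validation_fraction))
    return shuffled[:cut], shuffled[cut:]

# Hypothetical claim identifiers.
train, valid = train_validation_split(list(range(1000)))
print(len(train), len(valid))  # 800 200
```

Recording the seed and method used would make the split reproducible and auditable.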
Some description of how the split was determined would be helpful, to exclude the possibility of any structural differences between the training/modelling data and the validation data.

Immediate Issues

The following issues do not affect the appropriateness of the model or its application, but may require explanation.

While the overall purpose of the modelling is to generate useable predictions, it is important to ensure that the predictions are robust to deviations from the core assumptions. The proportionality of hazards for the input variables is a requirement of the model. This is initially assessed visually in the Appendix. In most cases it is quite clear that there is no substantial deviation from the assumption of proportionality. However, it is noted that, occasionally, proportionality is possibly violated on the visual evidence: for example, Diagnosis Group (pp 20-21). The proportionality assumption should be formally tested (I assume that SAS does this either automatically or via an option; Schoenfeld residuals?) and any effect of such deviations on the model outcome should be noted.

Although maximum accuracy of prediction is desired, part of the analysis should cover residual analysis: not as part of the core report, but as an appendix. Again, I would suppose that SAS can produce residual plots.

The final model includes many variables, and it is not clear that some of them add much. The forward-selection method adds a variable when its significance reaches 0.01%, which seems conservative, but when the number of data points is so large almost anything will come in as significant. I am generally a supporter of the principle of parsimony in modelling: using as few predictors as necessary. There is always some trade-off between maximising predictive accuracy and minimising model complexity. Figure 2 (p16) chooses 121 degrees of freedom for maximum AUC, but it is not clear that fewer variables would not give an approximate equivalence (it is not clear what the real difference is between a model with 80 df and one with 120 df, at least in terms of predictive capability). I realise that the intent is to obtain maximum accuracy, so that the prediction process becomes a black box, but fewer predictors may give almost the same accuracy and still be open to more operational explanation. The computing capacity at ACC may allow a much greater number of predictors to be used in generating predictions, but does the addition of the more marginal variables improve the overall fit? (A likelihood ratio test could be used to compare models.)

Granularity of Data

The model uses days on WC as the dependent variable.
When a claim is made, is the time on WC determined in days or in weeks (converted to days as weeks * 7)? Since the critical points for measurement at ACC are 28, 49, 70 and 183 days (4, 7, 10 and 26 weeks), does this affect the continuity of the data? The analysis approach is still highly appropriate, but the predictions may have a higher level of variability than is initially evident. (For example, is a prediction of 13 days vs 14 days really a prediction of 1 week vs 2 weeks?)

Validation of the Model: the Levels of Accuracy (Extent to which the Model Predicts WC Outcomes)

Christchurch Pilot

The second document tabulates the results of measuring the model predictions against actual outcomes for Christchurch WC claims since 27 August. There are 10 groups of claims, ordered by severity. It is not clear how the groups were determined: by simple percentile rankings of actual durations, or by some other index not declared here? For the claims in each group, the predictions from the model are compared against the actual outcomes, at least on the lower quartile, median and upper quartile of the distribution of claim durations.
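The three comparison statistics can be computed directly from raw durations. A minimal sketch: the duration values below are hypothetical, and `statistics.quantiles` uses its default "exclusive" quartile method, which may differ slightly from the method used in the pilot.

```python
import statistics

def quartile_summary(durations):
    """Lower quartile, median and upper quartile of claim durations (days)."""
    lq, median, uq = statistics.quantiles(durations, n=4)
    return {"LQ": lq, "median": median, "UQ": uq}

# Hypothetical actual vs predicted durations for one severity group.
actual = [7, 10, 14, 21, 28, 35, 49, 60, 75, 90, 120]
predicted = [8, 11, 13, 20, 30, 36, 47, 63, 70, 95, 115]

print(quartile_summary(actual))
print(quartile_summary(predicted))
```

Comparing the two summaries group by group reproduces the style of the pilot tables; plotting the full distributions would go further and expose any structural mismatch the three summary statistics hide.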
The comparison is generally very favourable, and the model does function well on these measures. However, I would like to see the actual distributions of predictions and actuals to assess goodness of fit. The model does perform well for most claims (possibly excepting the longer, severe claims), but the three statistics used are fairly robust measures of a distribution. Is there any structural mismatch? (That is, an examination of residuals is important.)

Conclusions

- For predicting the WC outcome for individual cases, the modelling approach adopted is very appropriate.
- The model uses well-accepted statistical analysis methods.
- The approach is to develop a black-box prediction process; this results in a large number of predictors in the model.
- Reducing the total number of predictors used in the model may sacrifice little in terms of precision but may be very useful for future useability and acceptance.
3. Recommendations

1. As part of any future examination, conduct and present a summary residual analysis. There are several purposes behind a closer examination of errors or residuals, some of which have been commented on earlier. While these items do not affect the overall goal of maximum accuracy of prediction, analysis of residuals is important to offer insight into the functioning of the model.
   - Discovering the best functional form: the method used adopts a high-accuracy approach (since computing ability is not a limitation), so every variable is in the predictive model. That is acceptable, but further analysis could reveal the best set of predictors.
   - Identifying subjects not well predicted by the model: the construction of the model excludes small categories (quite correctly), but how well does the prediction model perform for these small categories? This could well be part of the validation phase.
   - Identifying influential points: are there any points that have a strong effect on the model parameters, i.e. a high influence on the parameter estimates?
   - Assessment of the proportional hazards assumption, as noted above. If the proportional hazards assumption is valid, the Schoenfeld residuals should behave like a random walk. Conversely, if some variable has a large positive effect early on but trails away (beyond, say, the point at which the claimant is cured), then the assumption of proportionality of hazard for that variable is not appropriate.

2. As part of any future work, examine the level of contribution of the predictor variables, with a view toward explanation and the minimisation of predictors. Without exploring a considerable amount of formal statistical and analytical detail, some commentary is appropriate as part of an appendix. While this will certainly be of little use to ACC Case Managers in the short term, it can provide a background for model refinement.
I realise that the initial intent is to obtain maximum accuracy, so that the prediction process becomes a black box, but fewer predictors may give almost the same accuracy and still be open to more operational explanation. The computing capacity at ACC may allow a much greater number of predictors to be used in generating predictions, but does the addition of the more marginal variables improve the overall fit? For example, a treatment may influence the outcome substantially early on but, after a while, have little or no effect on survival. Seeking a model that is amenable to explanation may more readily allow for interpretation and evaluation of the effect of policy or operational changes on duration outcomes. It is important to remember that the present model provides a system-wide prediction, not necessarily the outcome for a specific case. The use of ranges (lower quartile, median, upper quartile) is very beneficial, but understanding the drivers of the outcome is important for determining whether any future policy change is likely to achieve a change in outcome; if it does, the model can be re-calibrated to accommodate the new policy. Nevertheless, a great advantage of the present model is that, with ACC's minimal limitations on computing ability, using more variables to maximise accuracy is appropriate as an initial project.
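The likelihood ratio comparison suggested in these recommendations can be sketched as follows. The log-likelihood values below are hypothetical, and the critical values are standard 5% chi-square constants for small degrees of freedom; a statistics library would cover the general case.

```python
def likelihood_ratio_statistic(llf_reduced, llf_full):
    """LR statistic: twice the log-likelihood gain from the extra variables."""
    return 2.0 * (llf_full - llf_reduced)

# 5% critical values of the chi-square distribution for small df
# (standard tabulated constants).
CHI2_CRIT_5PCT = {1: 3.841, 2: 5.991, 3: 7.815}

def variable_adds_value(llf_reduced, llf_full, extra_df):
    """Does the larger of two nested models fit significantly better?"""
    return likelihood_ratio_statistic(llf_reduced, llf_full) > CHI2_CRIT_5PCT[extra_df]

# Hypothetical partial log-likelihoods from two nested Cox fits.
print(variable_adds_value(-10450.0, -10440.0, 1))  # True: LR = 20.0
print(variable_adds_value(-10450.0, -10449.5, 1))  # False: LR = 1.0
```

Applied at each step of the forward selection, this would show directly whether a marginal variable improves the overall fit, rather than relying on the per-variable significance threshold alone.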
CUPA-HR New York Chapter Walking on Eggshells Effective Management of Internal Pay Equity Moshe Mayefsky Senior Consultant Megan Werner Associate Consultant Copyright 2018 by The Segal Group, Inc. All
More informationApplying Regression Techniques For Predictive Analytics Paviya George Chemparathy
Applying Regression Techniques For Predictive Analytics Paviya George Chemparathy AGENDA 1. Introduction 2. Use Cases 3. Popular Algorithms 4. Typical Approach 5. Case Study 2016 SAPIENT GLOBAL MARKETS
More informationChapter 4: Foundations for inference. OpenIntro Statistics, 2nd Edition
Chapter 4: Foundations for inference OpenIntro Statistics, 2nd Edition Variability in estimates 1 Variability in estimates Application exercise Sampling distributions - via CLT 2 Confidence intervals 3
More informationENVIRONMENTAL FINANCE CENTER AT THE UNIVERSITY OF NORTH CAROLINA AT CHAPEL HILL SCHOOL OF GOVERNMENT REPORT 3
ENVIRONMENTAL FINANCE CENTER AT THE UNIVERSITY OF NORTH CAROLINA AT CHAPEL HILL SCHOOL OF GOVERNMENT REPORT 3 Using a Statistical Sampling Approach to Wastewater Needs Surveys March 2017 Report to the
More informationGroundwater Monitoring Statistical Methods Certification
Groundwater Monitoring Statistical Methods Certification WEC Temporary Ash Disposal Area Whelan Energy Center Public Power Generation Agency/ Hastings Utilities February 9, 2018 This page intentionally
More informationREVIEW OF POWER SYSTEM EXPANSION PLANNING IN VIETNAM
Public Disclosure Authorized Public Disclosure Authorized Public Disclosure Authorized Public Disclosure Authorized REVIEW OF POWER SYSTEM EXPANSION PLANNING IN VIETNAM Tasks 1 and 2 Report Prepared for
More information36.2. Exploring Data. Introduction. Prerequisites. Learning Outcomes
Exploring Data 6. Introduction Techniques for exploring data to enable valid conclusions to be drawn are described in this Section. The diagrammatic methods of stem-and-leaf and box-and-whisker are given
More informationResponse to CESR s consultation on Inducements under MIFID (06-687)
International Swaps and Derivatives Association (ISDA) International Capital Market Association (ICMA) Asociación de Mercados Financieros (AMF) Association of Private Client Investment Managers and Stockbrokers
More informationProcurement Process: Architects & Professional Engineering Services. Procurement.
Procurement Process: Architects & Professional Engineering Services Procurement www.novascotia.ca/tenders Updated: February 2017 Table of Contents Contents INTRODUCTION:... 3 APPLICATION:... 3 SCOPE of
More informationTassc:Estimator technical briefing
Tassc:Estimator technical briefing Gillian Adens Tassc Limited www.tassc-solutions.com First Published: November 2002 Last Updated: April 2004 Tassc:Estimator arrives ready loaded with metric data to assist
More informationAdvanced Tutorials. SESUG '95 Proceedings GETTING STARTED WITH PROC LOGISTIC
GETTING STARTED WITH PROC LOGISTIC Andrew H. Karp Sierra Information Services and University of California, Berkeley Extension Division Introduction Logistic Regression is an increasingly popular analytic
More informationThe statistics used in this report have been compiled before the completion of any Post Results Services.
Course Report 2016 Subject Level Physics Advanced Higher The statistics used in this report have been compiled before the completion of any Post Results Services. This report provides information on the
More informationASSESSING THE TRADEOFF BETWEEN COST AND AVAILABILITY USING SIMULATION
2017 NDIA GROUND VEHICLE SYSTEMS ENGINEERING AND TECHNOLOGY SYMPOSIUM SYSTEMS ENGINEERING (SE) TECHNICAL SESSION AUGUST 8-10, 2017 NOVI, MICHIGAN ASSESSING THE TRADEOFF BETWEEN COST AND AVAILABILITY USING
More informationSawtooth Software. Sample Size Issues for Conjoint Analysis Studies RESEARCH PAPER SERIES. Bryan Orme, Sawtooth Software, Inc.
Sawtooth Software RESEARCH PAPER SERIES Sample Size Issues for Conjoint Analysis Studies Bryan Orme, Sawtooth Software, Inc. 1998 Copyright 1998-2001, Sawtooth Software, Inc. 530 W. Fir St. Sequim, WA
More informationBehaviour Driven Development
Behaviour Driven Development zero known defect software releases Challenging the assumption that good enough is really good enough Behaviour Driven Development (BDD) is an Agile methodology which improves
More informationGEARING FACTORS. The A FLEXIBLE SIZING APPROACH
GEARING FACTORS The A FLEXIBLE SIZING APPROACH MB Duration (Months) DERIVING GEARING FACTORS Determining the scope of a proposed system is one of the most challenging aspects of any software estimate.
More information32 BETTER SOFTWARE JULY/AUGUST 2009
32 BETTER SOFTWARE JULY/AUGUST 2009 www.stickyminds.com Why automate? This seems such an easy question to answer; yet many people don t achieve the success they hoped for. If you are aiming in the wrong
More informationShelf Life Determination: The PQRI Stability Shelf Life Working Group Initiative
Shelf Life Determination: The PQRI Stability Shelf Life Working Group Initiative Understanding Shelf Life What is the true (but unknown) shelf life of a product? The period of time during which a pharmaceutical
More informationChapter 5 RESULTS AND DISCUSSION
Chapter 5 RESULTS AND DISCUSSION 5.0 Introduction This chapter outlines the results of the data analysis and discussion from the questionnaire survey. The detailed results are described in the following
More informationTesting. CxOne Standard
Testing CxOne Standard CxStand_Testing.doc November 3, 2002 Advancing the Art and Science of Commercial Software Engineering Contents 1 INTRODUCTION... 1 1.1 OVERVIEW... 1 1.2 GOALS... 1 1.3 BACKGROUND...
More informationTransactions, American Geophysical Union Volume 35, Number 4 August 1954
Transactions, American Geophysical Union Volume 35, Number 4 August 1954 A METHOD FOR DETERMINING THE MINIMUM DURATION OF WATERSHED EXPERIMENTS Jacob L. Kovner and Thomas C. Evans Abstract--A simple graphic
More informationDescribing DSTs Analytics techniques
Describing DSTs Analytics techniques This document presents more detailed notes on the DST process and Analytics techniques 23/03/2015 1 SEAMS Copyright The contents of this document are subject to copyright
More informationChapter 4. Phase Four: Evaluating Jobs. Key points made in this chapter
C H A P T E R F O U R Chapter 4 Phase Four: Evaluating Jobs Key points made in this chapter The evaluation process: should be done on a factor-by-factor basis rather than job-by-job should include a sore-thumbing
More informationA PRACTICAL GUIDE FOR HOW AN ADVERTISER CAN PREPARE FOR GDPR JANUARY 2018
A PRACTICAL GUIDE FOR HOW AN ADVERTISER CAN PREPARE FOR GDPR JANUARY 2018 1 PURPOSE OF THIS DOCUMENT 2 This document is to be used as a guide for advertisers on how they should work with their agencies,
More informationAdopting Site Quality Management to Optimize Risk-Based Monitoring
WHITE PAPER Adopting Site Quality Management to Optimize Risk-Based Monitoring In today s pressure-packed environment, the quest for improved data quality at a lower cost is of paramount importance to
More informationEVALUATION FOR STABILITY DATA
INTERNATIONAL CONFERENCE ON HARMONISATION OF TECHNICAL REQUIREMENTS FOR REGISTRATION OF PHARMACEUTICALS FOR HUMAN USE ICH HARMONISED TRIPARTITE GUIDELINE EVALUATION FOR STABILITY DATA Q1E Recommended for
More informationTiming Production Runs
Class 7 Categorical Factors with Two or More Levels 189 Timing Production Runs ProdTime.jmp An analysis has shown that the time required in minutes to complete a production run increases with the number
More informationDallas J. Elgin, Ph.D. IMPAQ International Randi Walters, Ph.D. Casey Family Programs APPAM Fall Research Conference
Utilizing Predictive Modeling to Improve Policy through Improved Targeting of Agency Resources: A Case Study on Placement Instability among Foster Children Dallas J. Elgin, Ph.D. IMPAQ International Randi
More informationSecondary Math Margin of Error
Secondary Math 3 1-4 Margin of Error What you will learn: How to use data from a sample survey to estimate a population mean or proportion. How to develop a margin of error through the use of simulation
More informationAP Statistics Scope & Sequence
AP Statistics Scope & Sequence Grading Period Unit Title Learning Targets Throughout the School Year First Grading Period *Apply mathematics to problems in everyday life *Use a problem-solving model that
More informationMath227 Sample Final 3
Math227 Sample Final 3 You may use TI calculator for this test. However, you must show all details for hypothesis testing. For confidence interval, you must show the critical value and the margin of error.
More informationAttachment 1. Categorical Summary of BMP Performance Data for Solids (TSS, TDS, and Turbidity) Contained in the International Stormwater BMP Database
Attachment 1 Categorical Summary of BMP Performance Data for Solids (TSS, TDS, and Turbidity) Contained in the International Stormwater BMP Database Prepared by Geosyntec Consultants, Inc. Wright Water
More information2018 ISO Tariff Application Appendix F Point of Delivery ( POD ) Cost Function Report
2018 ISO Tariff Application Appendix F Point of Delivery ( POD ) Cost Function Report Date: Prepared by: Alberta Electric System Operator Prepared for: Alberta Utilities Commission Classification: Table
More informationBeyond balanced growth: The effect of human capital on economic growth reconsidered
Beyond balanced growth 11 PartA Beyond balanced growth: The effect of human capital on economic growth reconsidered Uwe Sunde and Thomas Vischer Abstract: Human capital plays a central role in theoretical
More informationWork Management System (WMS)
Comprehensive Tracking System CTS User Manual Work Management System (WMS) Open Range Software, LLC Last updated: 02-12-2016 Contents Chapter 1: Overview... 3 CTS Work Management System (WMS)... 3 Chapter
More informationCOMMENTARY ON PROPOSALS REGARDING SYSTEMIC COMPENSATION DISCRIMINATION
COMMENTARY ON PROPOSALS REGARDING SYSTEMIC COMPENSATION DISCRIMINATION The commentary below is submitted by William Osterndorf of HR Analytical Services. HR Analytical Services is a Milwaukee, Wisconsin
More informationAn Innovative System of Disaggregate Models for Trip Generation and Trip Attributes Using Random Forest Concept
An Innovative System of Disaggregate Models for Trip Generation and Trip Attributes Using Random Forest Concept Milad Ghasri, Taha Hossein Rashidi Abstract This research attempts to address the gap between
More information