Hogan Research Methodology


TECHNICAL BRIEF

Hogan Research Methodology
Chapter 4: Criterion-Related Validity Evidence

This technical brief is an excerpt of the Criterion-Related Validity chapter from the Hogan Research Methodology (HRM) addendum for all technical documentation. It covers concurrent and predictive criterion-related validity studies.

A4. CRITERION-RELATED VALIDITY EVIDENCE

Aguinis, Henle, and Ostroff (2001) described criterion-related validity in terms of the relationship between the predictor (e.g., HPI scales) and some criterion measure (e.g., job performance), with the goal of answering the basic question: how accurate are test scores in predicting criterion performance? The Uniform Guidelines state that evidence of the validity of a test or other selection procedure by a criterion-related validity study "should consist of empirical data demonstrating that the selection procedure is predictive of or significantly correlated with important elements of job performance" (29 C.F.R. 1607.5(B)).

Although many organizational and logistical constraints limit the usefulness of criterion-related validity studies (McPhail, 2007), the Uniform Guidelines and Principles suggest considering this approach when (a) there is an adequate, representative sample of job incumbents willing to participate, and (b) development of reliable, unbiased measures of job performance is possible. The Principles also recommend using a relevant criterion measure, one that reflects the relative standing of employees with respect to important work behavior(s) or outcome measure(s) (p. 14).

Additional factors should be taken into account when determining whether a criterion-related validity study is appropriate for a given selection situation. First, practitioners should consider the design when planning the study. A predictive design uses assessment scores to predict a criterion measured at some future point; for example, job applicants complete the assessment before being hired and provide measures of performance after being on the job for some time. Concurrent designs are more practical because they do not require a time delay; instead, the organization collects job performance information at the same time job incumbents complete the assessment battery.
Only one empirical study has examined the effects of these two strategies on criterion-related validity using personality measures. Van Iddekinge and Ployhart's (2008) review of criterion study designs revealed that predictive designs produce slightly lower validity estimates than concurrent designs. Yet regardless of the strategy employed, the predictive value of the assessment is established by correlating assessment scores and job performance data, and factors beyond study design may still influence this validity coefficient. For example, the Principles note that the observed validity coefficient may underestimate the predictor-criterion relationship due to the effects of range restriction and unreliability in the predictor and/or criterion. As a result, adjustments are available to account for these artificial reductions in variance. For instance, researchers often correct for criterion unreliability to estimate operational validity (Van Iddekinge & Ployhart, 2008). Hogan corrects for measurement error, criterion unreliability, and range restriction where appropriate, and reports both the observed and corrected validity coefficients in our technical documentation. Note that we do not correct correlation coefficients for predictor unreliability to estimate validity at the construct level.

Another decision researchers face is whether to use a single criterion or multiple criteria during the data collection phase of the criterion study. The literature recommends that researchers develop criterion measures that are conceptually aligned with the latent criterion constructs and that maximize the potential use of multiple criteria for predictor validation (Van Iddekinge & Ployhart, 2008, p. 906). Furthermore, J. Hogan and Holland (2003) provide strong support for using specific criteria to estimate the validity of specific predictors in operational use. Although support for using narrow criteria is growing, collecting overall performance composites still provides the best approach to estimating the validity of global predictors (Guion, 1961), and prediction improves when criterion ratings cover the full spectrum of effective performance (Oh & Berry, 2009). Therefore, the use of global criteria in the design of performance rating forms is still appropriate; however, specific dimensions should also be used when circumstances allow.

Adhering to these guidelines, Hogan conducts criterion-related validity studies when an organization can (a) identify enough incumbents (generally more than 100) to take the assessments and (b) identify supervisors to evaluate each incumbent using a performance rating form developed by Hogan. Many organizations choose not to conduct a local validation when they either do not have enough incumbents in a role or are concerned about the time commitment involved for their incumbents and supervisors. In such cases, Hogan relies solely on validity generalization (VG) evidence.

A4.1. Concurrent Criterion-Related Validity Study

Hogan conducts concurrent criterion-related validity studies using a three-step process: (1) collecting Hogan assessment data, (2) collecting job performance data (i.e., supervisor ratings and objective performance metrics), and (3) conducting analyses examining the relationships between the assessment and performance data.

In step one, the organization and Hogan work together to identify incumbents to participate in the research. Ideally, this sample should include at least 100 incumbents with matched predictor and criterion data. Additionally, the sample should represent the incumbent population in terms of performance, demographics, locations, work environments, and other contextual factors that may influence job performance. Last, incumbents should have enough tenure in the position to allow for reliable performance ratings by a supervisor. Identified incumbents then complete the relevant Hogan assessments.

In step two, Hogan uses information gathered from the job analysis and stakeholder conversations to develop a Performance Rating Form (PRF). The PRF usually includes items assessing overall performance, critical competencies, and other relevant behaviors. Incumbents' supervisors often participate in rater training (i.e., to reduce rating biases and errors) before completing the PRF. This step may also include collecting objective performance metrics (e.g., sales, turnover, business unit performance) from the organization to supplement the data available for step three.

In the final step, Hogan analyzes the incumbent assessment and performance data to identify the assessments and scales that are the strongest predictors of performance in the job. Hogan uses this information to create a multi-scale profile of successful performance on the job. The client may then implement this profile to aid in their candidate selection process.
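The statistical corrections described earlier, for criterion unreliability and range restriction, can be sketched with standard textbook formulas. This illustration uses made-up values; which corrections apply in an actual study, and in what order, depends on the data available:

```python
import math

# Hypothetical study values (illustration only).
r_obs = 0.25   # observed predictor-criterion correlation
r_yy = 0.52    # interrater reliability of supervisor ratings
u = 0.80       # ratio of restricted to unrestricted predictor SD

# Correction for criterion unreliability (operational validity):
# disattenuate for measurement error in the criterion only -- the
# predictor is not corrected, per the practice described above.
r_operational = r_obs / math.sqrt(r_yy)

# Thorndike Case II correction for direct range restriction on the
# predictor, applied to the reliability-corrected coefficient.
U = 1 / u  # unrestricted-to-restricted SD ratio
r_corrected = (r_operational * U) / math.sqrt(
    1 - r_operational**2 + (r_operational**2) * U**2
)

print(f"Observed r: {r_obs:.2f}")
print(f"Corrected for criterion unreliability: {r_operational:.2f}")
print(f"Also corrected for range restriction: {r_corrected:.2f}")
```

As the section notes, both the observed and corrected coefficients would be reported in the technical documentation.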

A4.2. Predictive Criterion-Related Validity Study

Predictive criterion-related validity studies closely resemble concurrent criterion-related validity studies with one notable exception: the sample. The sample for a predictive study consists of job applicants rather than job incumbents (Anastasi & Urbina, 1997). As such, the process varies slightly from a concurrent study. The first step in a predictive study involves administering the measure to job applicants. Candidates are then selected for the role without considering results from the administered measure. At a later date, Hogan collects performance data, based on job analysis information, for the hired applicants. Hogan then analyzes the relationship between the administered scales and job performance to identify the scales most relevant for predicting performance in the role. Given practical and operational considerations, companies rarely have the opportunity to conduct this type of study and often rely on Hogan to use a concurrent design instead (as discussed in section A4.1).

REFERENCES

Aguinis, H., Henle, C. A., & Ostroff, C. (2001). Measurement in work and organizational psychology. In N. Anderson, D. S. Ones, H. K. Sinangil, & C. Viswesvaran (Eds.), Handbook of industrial, work and organizational psychology (Vol. 1, pp. 27-50). London: Sage.

Anastasi, A., & Urbina, S. (1997). Psychological testing (7th ed.). Upper Saddle River, NJ: Prentice-Hall.

Guion, R. M. (1961). Criterion measurement and personnel judgments. Personnel Psychology, 14, 141-149.

Hogan, J., & Holland, B. (2003). Using theory to evaluate personality and job-performance relations: A socioanalytic perspective. Journal of Applied Psychology, 88, 100-112.

McPhail, S. M. (2007). Alternative validation strategies. San Francisco, CA: Jossey-Bass.

Oh, I., & Berry, C. M. (2009). The five-factor model of personality and managerial performance: Validity gains through the use of 360 degree performance ratings. Journal of Applied Psychology, 94, 1498-1513.

Van Iddekinge, C. H., & Ployhart, R. E. (2008). Developments in the criterion-related validation of selection procedures: A critical review and recommendations for practice. Personnel Psychology, 61, 871-925.