Reevaluating the Decision Making Dimension:


1 Reevaluating the Decision Making Dimension: Disentangling the Role of Cognitive Ability. Brett W. Guidry, M.S., Doctoral Candidate; Deborah Rupp, Ph.D., Professor and William C. Byham Chair in I/O Psychology, Purdue University

2 Overview 1. Overview of my perspective on the AC method 2. Why I'm picking on decision making 3. Decision making: dual system theory 4. Getting into the "black box" of decision making 5. What we are doing 6. What we have found so far

3 Why I Like Assessment Centers: Assessment is an essential part of every project. Did the project work? Did it work in the way we intended? Did each component of the project function efficiently? Did the project members perform well? Without a reliable means of assessment, none of these questions can be answered.

4 Why I Like Assessment Centers: If I am going to be assessed... Assessors are trained to assess; managers are trained to manage (I think this is actually a fairly serious problem; see Longenecker, 1987). Assessors are trained against biases (e.g., the halo effect) and use specific tools: other assessors rating the same thing, audio and video logs, and notes specific to the assessment. Side note: assessment centers typically do not evaluate how well a manager assesses an employee.

5 Why I Like Assessment Centers: If I'm going to be assessed, I want direct observation, multiple instances, multiple people, and training against common biases: exactly what the assessment center method accomplishes.

6 Why I am Picking on Decision Making: Core assessment center dimensions (Arthur, Day, McNelly, & Edens, 2003, meta-analysis): Consideration/Awareness of Others, Communication*, Drive, Influencing Others*, Organization and Planning*, Problem Solving/Decision Making*. The four starred dimensions are responsible for overall criterion-related validity, yet only one has a notable correlation with cognitive ability (Dilchert & Ones, 2009, meta-analysis).

7 Aside: Decision Making vs. Problem Solving. On the interchangeability of decision making and problem solving, see Newell & Simon (1972); Mintzberg, Raisinghani, & Theoret (1976); Borman & Brush (1993); Saaty (1994). The Level 1 dimension of problem solving identified by Arthur et al. (2003) encompasses and is based on several Level 2 dimensions originally identified by Thornton and Byham (1982), including analysis, decisiveness, judgment, and technical and professional knowledge, all of which are also components of decision making. While some sources draw a distinction (e.g., O*NET), I do not within the AC context.

8 Why I am Picking on Decision Making: The link to cognitive ability isn't surprising. Problem solving in assessment centers is "the ability to analyze and reason with information, as well as the ability to generate ideas and imaginative solutions" (Dilchert & Ones, 2009), or "the extent to which an individual gathers information; understands relevant technical and professional information; effectively analyzes data and information; generates viable options, ideas, and solutions; selects supportable courses of action for problems and situations; uses available resources in new ways; and generates and recognizes imaginative solutions" (Arthur et al., 2003).

9 Problems Associated With Cognitive Ability: We already have a wide array of tests that measure cognitive ability cheaply, quickly, and easily. Cognitive ability is considered static, while problem solving/decision making ability is more malleable (particularly relevant for feedback or developmental ACs). And cognitive ability is associated with adverse impact.

10 Problems Associated With Cognitive Ability: But assessment centers don't really show adverse impact: OAR d = .5 (Black/White) (Dean, Roth, & Bobko, 2008); inbox exercise d = .75 (Black/White) (Roth, Bobko, McFarland, & Buster, 2008). Cognitively demanding exercises have detectable adverse impact (d of nearly 1.0) (Thornton, Rupp, & Hoffman).

11 Problems Associated With Cognitive Ability: Black/White standardized differences (d) by method:
Overall assessment rating: .52 [1], .56 [2]
Assessment center exercises: in-baskets/inboxes .74 [3]; technical work samples .76 [3]; oral briefing .22 [3]; role play .21 [3]
Constructs measured in ACs: cognitive ability and job knowledge .80 [3]; writing .79 [3]; oral communication .27 [3]; leadership/persuasion .27 [3]
Cognitive ability tests: moderately complex jobs .72 [3]; low-complexity jobs .86 [3]
Trainability tests: work samples used with early-stage applicants .73 [3]
Situational judgment tests: cognitively loaded scales .65 [2]; interpersonal scales .07, .20, .50 [4], .19 [2]
Leadership
Sources: [1] Dean, Roth, & Bobko (2008); [2] Bobko & Roth (2013); [3] Roth, Bobko, McFarland, & Buster (2008); [4] Roth, Buster, & Bobko (2011)
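For reference (not on the original slide): these values are Cohen's d, the standardized mean subgroup difference,

d = \frac{\bar{X}_1 - \bar{X}_2}{s_p}, \qquad s_p = \sqrt{\frac{(n_1 - 1) s_1^2 + (n_2 - 1) s_2^2}{n_1 + n_2 - 2}},

so d = .5 for the OAR means the two subgroup means sit half a pooled standard deviation apart.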

12 Problems Associated With Cognitive Ability: There is a range restriction problem associated with the AC method (cost, time, etc.) (LeBreton, Burgess, Kaiser, Atchley, & James, 2003). The nature of creating, deploying, managing, and scoring assessment centers is changing; if the method is good, and if it is becoming more affordable, it will be used more frequently.

13 Why I am Picking on Decision Making: I get it. Problem solving/decision making is a cognitive process, it's linked with cognitive ability measures, and that could be problematic. Assessment center ratings are based on observable behavior, so in many cases cognitive components have to be inferred (what the candidate must have done to arrive at a particular conclusion).

14 Why I am Picking on Decision Making: Information gathering is often one of the only outward behaviors of problem solving/decision making (the other, obviously, being the actual decision). Business components have moved to digital/online platforms: in-basket to inbox, binders to PDFs, handwritten notes to platforms like OneNote, filing cabinets to databases. Everything happens on a computer. So what is actually being assessed when we talk about problem solving/decision making in ACs?

15 Decision Making: Dual System Theory (Highhouse, 1997; Ferreira, Garcia-Marques, Sherman, & Sherman, 2006; Schneider & Shiffrin, 1977; Epstein, 1994; Hammond, 1996). Automatic ("System 1"): intuition; passive processing; handles large amounts of information; typically effective (thankfully!). Controlled ("System 2"): step-by-step; active processing; energy intensive; uses less information. The two are not mutually exclusive: decisions can involve a blend of both systems (Ferreira et al., 2006; Hammond, 1996).

16 Decision Making: Dual System Theory: System 1 (automatic) relies on mental shortcuts/heuristics to process lots of information quickly. We use System 1 all the time. For example...

17 Decision Making: Dual System Theory: Can you solve the following? 64 x 128 = ? OK, now actually do it: that's System 2 kicking in.
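Not on the slide, but for completeness, the deliberate System 2 computation might run:

64 \times 128 = 64 \times (130 - 2) = 8320 - 128 = 8192, \qquad \text{or} \qquad 64 \times 128 = 2^6 \times 2^7 = 2^{13} = 8192.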

18 Decision Making: Dual System Theory. From Highhouse (1997): Which of the following situations is more likely to occur? 1. An all-out nuclear war between the United States and China. 2. An all-out nuclear war between the United States and China in which neither country intends to use nuclear weapons, but both sides are drawn into the conflict by the actions of a country such as Iran, North Korea, or Pakistan.
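This is the classic conjunction setup: for any events A and B, P(A \cap B) \le P(A), so the second, richer scenario can never be more probable than the first. The added detail only makes it feel more plausible, which is System 1 at work.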

19 Problems with Problem Solving in ACs: Volume and type of information presented to candidates. Many real-world decisions, especially at higher levels, have only limited or questionable information (Bromiley & Rau, 2011). Unlike the real world, ACs have distinct boundaries (i.e., there is a set amount of information). Candidates with higher cognitive ability can therefore rely on a single strategy: review/remember the provided information, use logic to deduce additional information, and respond to a particular problem. I've just described a cognitive ability test (e.g., the Watson-Glaser Critical Thinking Appraisal).

20 Problems with Problem Solving in ACs: There's no way to tell if a candidate was using System 1 or System 2. So what? Cognitive load affects performance under System 2 but not System 1 (De Neys, 2006a), and working memory is related to System 2 processes but not System 1 (De Neys, 2006b). Again, so what? Field research indicates that in certain contexts (e.g., emergency situations, time constraints) decision makers do not rely on System 2 processes (Klein, 2008; Klein, Calderwood, & Clinton-Cirocco, 1986).

21 Problems with Problem Solving in ACs: Currently, decision making/problem solving measures within assessment centers rely on methods already captured by and linked to cognitive ability, fail to capture decision processes that could more accurately reflect genuine decision making proficiency, and do not collect enough data to draw conclusions about a candidate's process.

22 Getting Into the "Black Box": How do we capture data within assessment centers that allows us to evaluate a candidate's decision making/problem solving process? Technology and exercise design (it's an "and," not an either/or).

23 Getting Into the "Black Box": Technology. AC components have to mirror common business technology in order to simulate the work environment (i.e., many tasks are done on computers). This actually makes decision processes more difficult for assessors to observe, but it makes components of the decision process easier to log automatically.

24 Getting Into the "Black Box": Technology. Decision making research uses process tracing: collect as much data as possible. In virtual environments (i.e., anything done on a computer) we can capture keystrokes, mouse clicks, and events (e.g., opening a document, database searches), and it is becoming easier to include physiological changes like stress and eye motion. It's all captured automatically, as in the sketch below.
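A minimal sketch of how such automatic capture might be wired into a simulation interface. This is an illustration, not the authors' implementation; the class name, event labels, and CSV layout are all hypothetical:

import csv
from datetime import datetime, timezone

class ProcessTraceLogger:
    # Appends one timestamped row per UI event to a CSV file.
    def __init__(self, path):
        self._file = open(path, "a", newline="")
        self._writer = csv.writer(self._file)

    def log(self, event_type, detail=""):
        # Called from every UI handler (key press, mouse click, tab change,
        # document open, database search), so the full sequence is preserved.
        self._writer.writerow([event_type, detail,
                               datetime.now(timezone.utc).isoformat()])
        self._file.flush()  # flush immediately so a crash loses nothing

    def close(self):
        self._file.close()

# Example: what the inbox exercise's UI callbacks might emit.
logger = ProcessTraceLogger("trace.csv")
logger.log("tab_clicked", "EmployeeInformation")
logger.log("employee_selected", "Sid Agarwal")
logger.log("keystroke", "I")
logger.close()

The point is that none of this requires the assessor's attention: the log is a by-product of the candidate simply using the interface.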

25 Getting Into the "Black Box": Design. The data has to be meaningful and firmly linked to decision making research from cognitive psychology, and exercises must be designed to elicit this meaningful behavior (e.g., Thornton, Mueller-Hanson, Lievens, and others). It's meant to be a supplement, not a replacement.

26 What We Are Doing: Currently focused on a single exercise (an inbox); one exercise does not make an assessment center, but there are time constraints, space constraints, and assessor demands. Participants take on the role of an HR employee with access to contacts, employee records, and organizational documents.

27 What We Are Doing: Exercise problems come in the form of emails, which participants must respond to. Problems are designed to stimulate very specific behaviors (e.g., searching for, processing, and comparing information about two employees). Any behavior that can be captured is captured, and information captured during the simulation is combined with typical information (e.g., actual responses). We are also including physiological stress measurement.

29 Example Problem

30 Example Problem

31 Example Problem: Search through the database to get employee information. I know who you checked out, how long you examined particular information, and who you selected. Finally, the participant translates the decision to the supervisor.

32 What The Data Looks Like: Three timestamped streams are logged in parallel. Keystrokes: one row per character typed (the sample captures a participant typing "I cant seem to find a..."). Actions/clicks: one row per interface event, e.g., Tab EmployeeInformation Clicked; Software Checked; Employee Sid Agarwal Selected; Employee Anna Andrews Selected; Tab PerformanceReview Clicked; Onboarding Document Selected; Employee Handbook Document Selected; Live Support Checked. Stress/motion: calibrated sensor output with columns for Timestamp (msecs), Accelerometer X/Y/Z (m/sec^2), and GSR (kohms).
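A sketch of how the action stream can be turned into the dwell-time measure mentioned on slide 31 ("how long you examined particular information"). The timestamps below are hypothetical, since the transcribed sample's timestamps are incomplete:

from datetime import datetime

def dwell_times(events):
    # events: chronologically ordered (label, datetime) pairs from the action
    # log. The gap between one event and the next approximates how long the
    # candidate examined whatever the first event opened.
    return [
        (label, (nxt - t).total_seconds())
        for (label, t), (_, nxt) in zip(events, events[1:])
    ]

events = [
    ("Employee Sid Agarwal Selected",  datetime(2014, 10, 10, 13, 38, 0)),
    ("Employee Anna Andrews Selected", datetime(2014, 10, 10, 13, 38, 7)),
    ("Employee Sid Agarwal Selected",  datetime(2014, 10, 10, 13, 38, 10)),
]
print(dwell_times(events))
# [('Employee Sid Agarwal Selected', 7.0), ('Employee Anna Andrews Selected', 3.0)]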

33 What We Can See

34 What We Have Found (Broadly): Round 1. The initial target population involved undergraduates, mostly first-year students (18-19 years old, little to no work experience). This was not a good population to target: there was virtually no variance in performance (everyone did very, very poorly). However, we saw different patterns in stress response and different patterns in problem behaviors (when they actually did something).

35 What We Have Found (Broadly): HR master's students showed good variance in our behavioral variables.

36 What We Have Found (Broadly): Interesting results from GSR (galvanic skin response).

37 What We Have Found (Broadly): There is a definite feedback/training opportunity based on the behaviors we've captured; people vary wildly in their approach to the simulation.

38 Moving Forward: Continue to collect data. Look for pattern clusters (i.e., similar behaviors from several people; a sketch of one approach follows). Link simulation performance to personality, cognitive ability, and decisional processes. Incorporate this exercise into a full-scale assessment center. Link findings to the OAR, dimension ratings, and actual job performance.
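One way the pattern-cluster step might be operationalized (a sketch under assumed features, not the authors' pipeline; the feature set and data are hypothetical): represent each candidate as a vector of process-tracing counts and cluster with k-means.

import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical per-candidate features derived from the event log:
# [records opened, mean dwell seconds, revisits, database searches]
X = np.array([
    [12,  8.5, 3, 5],
    [ 4, 30.2, 0, 1],
    [15,  6.1, 6, 7],
    [ 5, 25.0, 1, 2],
])
X_std = StandardScaler().fit_transform(X)  # put features on a common scale
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X_std)
print(labels)  # e.g., broad-but-shallow scanners vs. narrow-but-deep readers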

39 Conclusion: We need a more focused definition of problem solving/decision making. We must avoid the "flashiness" stigma often associated with newer technology: this cannot simply be added on top of current exercises. And the benefit could be larger for development than for selection.

40 Questions?