Accenture Analytics November 2013
Today, we're speaking analytics within and across all industries, with an accent on differentiation and business services: Products, Resources, Financial Services, CMT, H&PS, and Cross-Industry. Accenture - We Speak Analytics: http://www.youtube.com/watch?v=vg6of-042x4
The Power of the Assessment and Development Center in Maximizing Talent
A Walk-Through for Talent Developers
27 November 2013
There is a surge in the use of predictive analytics, and a shift toward data-driven insights:
- Three times as many companies use predictive analytics now vs. 2009; but desire exceeds capability, and demand exceeds supply.
- 60% say internal customers want prediction of trends, not just reporting on them.
- 25% of companies rely on data to a great extent for new ideas, double the figure from 2009.
- 62% say quicker, more effective decision making is a top priority for the use of analytics.
- Only 39% say the data they generate is relevant to the business strategy.
Note: Findings from Accenture's Analytics Adoption Research, 2013 (US, UK)
C-level executives recognize that data is an essential asset and that analytical insights drive business performance.
High level of commitment:
- 2 out of 3 appoint a Chief Analytics or Chief Data Officer
- 2 out of 3 use predictive analytics to run the business
Open to external partnerships:
- 7 out of 10 are open to external insight providers
- 8 out of 10 recognize that they will have to tap external data sources
Recognition of gaps to achieving value:
- Only 1 out of 5 believe they are getting the ROI from analytics
- Only 2 out of 5 believe that their organization has access to the right data
- Clear lack of alignment to core business processes
Source: Accenture research with analytics practitioners, released March 2013, Analytics in Action: Breakthroughs and Barriers on the Journey to ROI
Assessment Center Defined
An assessment center consists of a standardized evaluation of behavior based on multiple inputs. Multiple trained observers and techniques are used. Judgments about behaviors are made, in major part, from specifically developed assessment tools and activities. These judgments are pooled in a meeting among the assessors.
Source: Guidelines and Ethical Considerations for Assessment Center Operations. Task Force on Assessment Center Guidelines; endorsed by the 17th International Congress on the Assessment Center Method, May 1989.
Essential Features of an Assessment Center
- Multiple observations made for each dimension
- Multiple assessors used for each candidate
- Assessors trained to a performance standard
- Systematic methods of recording behavior
- Assessors prepare behavior reports in preparation for integration
- Integration of behaviors through pooling of information from assessors and techniques, and consensus discussion
Expected Outcomes
- Determine the individual and group profiles of the managers for BPO and Technology
- Identify the differentiating factors between high and low potentials
- Identify the significant strengths and developmental areas per batch
- Determine initial benchmarks for the differentiating competencies
- Determine the other growth and derailing factors demonstrated by the managers beyond Accenture's competencies
- Propose training and non-training interventions for identified developmental areas
- Evaluate the Assessment Center methodology and process and how it can be improved in the future
PROGRAM METHODOLOGY
THREE PHASES
Phase 1: Pre-Workshop
Program Development & Customization
Competencies and Assessment Methodologies
GROWTH FACTORS
- Leadership Promise: Propensity to Lead, Brings Out the Best in People
- Personal Development Orientation: Receptivity to Feedback, Learning Agility
- Mastery of Complexity: Adaptability, Conceptual Thinking, Navigates Ambiguity
- Balance of Values and Results: Culture Fit, Passion for Results
DERAILING FACTORS
- Overconfidence: Typically self-confident and see themselves as leaders, but they often fail to listen or to understand their own limitations, eventually becoming despotic.
- Overdependence: Usually viewed as agreeable and easy to work with, as good followers; but they may also be risk-averse, lacking in influence, and weak when faced with high demands.
- Micromanagement: Tend to be good administrators, methodical and attentive to detail; but they may be inflexible and rule-bound, and tend to manage others too closely.
- Iconoclasm: Likely tough-minded and able to break with convention; but they may be insensitive to others, even anti-social and unethical in their behavior.
Assessment Activities
1) Online Assessment
2) Solo Case Study/In-Basket
3) Team Activities
4) Dyads
5) Presentation
6) Behavioral Event Interviewing
Scoring Methodology of SHL Online Assessment Tools
SHL uses percentile ranking to score a candidate's performance on each test (against an international norm). Percentile ranks describe how an individual's test performance compares to the performance of other people on the same test: the rank is the percentage of scores that fall below the score in question.
Example: a candidate scores at the 90th percentile. Interpretation: the examinee scored higher than 90% of all test takers on that test.
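The percentile-rank calculation described above can be sketched in a few lines. This is an illustrative example only; the norm-group scores below are invented, not real SHL norms.

```python
def percentile_rank(score, norm_group):
    """Percentage of norm-group scores strictly below the given score."""
    below = sum(1 for s in norm_group if s < score)
    return 100 * below / len(norm_group)

# Hypothetical norm-group raw scores (not actual SHL data)
norm = [52, 61, 64, 70, 73, 75, 78, 81, 85, 92]

print(percentile_rank(81, norm))  # 7 of 10 scores are below 81 -> 70.0
```

A candidate with a raw score of 81 would therefore be reported at the 70th percentile against this (made-up) norm group.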
Personality Profile
[Sample profile chart: each trait rated on a 1-10 scale]
- Relationships with People: Persuasive, Self-Control, Empathic, Modest, Participative, Sociable
- Thinking Style: Analytical, Flexible, Innovative, Structured, Detail Conscious, Conscientious
- Feelings and Emotions: Resilience, Competitive, Results Orientated, Energetic
Employee Motivation sample report
Customization
Activity: Collaboration with subject matter experts to identify actual demonstrations of competencies (positive and/or negative) on the job.
Output: Cases for assessment tools and exercises that have industry-specific scenarios.
PHASE II: CONDUCT OF THE ASSESSMENT CENTER
- Total no. of days: 2
- Carried out away from the job setting
- Series of multiple assessments
- Multiple observers
- Each participant is observed by as many of the observers as possible
Assessment Workshop
- Interview
- Role Plays
- Team Activities
Conduct of the Workshop (max of 8 pax per assessment day)
Time          Activity
8:00-10:00    Case Study
10:00-11:00   Team Activity 1 (Mancom Meeting)
11:00-11:30   Team Activity 2 (Investment Activity)
11:30-12:00   Lunch Break
12:00-12:30   Coaching (Assessees 1, 3, 5, and 7)
12:30-1:00    Coaching (Assessees 2, 4, 6, and 8)
1:00-2:30     Presentation/Interview (Assessees 1, 3, 5, and 7)
2:30-4:00     Presentation/Interview (Assessees 2, 4, 6, and 8)
4:00-5:00     Calibration
Phase 3: Post-Workshop
Report Generation & Program Evaluation
One-on-One Feedback Session
The results of the assessment are communicated to the immediate superiors and the participant. Discussions cover:
- Strengths
- Developmental Areas
- Recommended Avenues for Development
The participants are also invited to comment on the assessment.
Debrief Report
Recommendations
Suggested People Measurement Components
- Performance/Results (past measures): 1. Performance metrics 2. Track record
- Competencies (e.g.): 1. Behaviours 2. Skills
- Potential (e.g.): 1. Motive 2. Personality 3. Values 4. Cognitive
All three components are aligned to the business strategy.
Visual Decision Board
[Board chart: each person is rated on Results, Competencies, and Potential against the business strategy, using a three-point scale: Below average / Development Area, Average, Above average / Strength Area.]
Example: Person 1 (Today's Provider) vs. Person 2 (Tomorrow's Star).
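A decision board of this kind is easy to represent as data. The sketch below uses the slide's three-point rating scale; the people, dimension scores, and the `summarize` helper are hypothetical, added only to show how board rows could be tallied.

```python
# Rating scale taken from the decision-board legend
RATINGS = {
    1: "Below average / Development Area",
    2: "Average",
    3: "Above average / Strength Area",
}

def summarize(scores):
    """Render one person's decision-board row as readable text."""
    return ", ".join(f"{dim}: {RATINGS[r]}" for dim, r in scores.items())

# Hypothetical ratings for the two example profiles
board = {
    "Person 1 (Today's Provider)": {"Results": 3, "Competencies": 2, "Potential": 2},
    "Person 2 (Tomorrow's Star)": {"Results": 2, "Competencies": 3, "Potential": 3},
}

for person, scores in board.items():
    print(f"{person} -> {summarize(scores)}")
```

Today's Provider shows strength in current Results, while Tomorrow's Star rates higher on Competencies and Potential, which is the contrast the board is meant to surface.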
The Power of Three: the Talent Board
[Talent board grid: management levels (Senior Manager, Middle Manager, Operational Management) plotted against functions (R&D, Customer Management, Sales & Marketing, Organisational Leadership), with individuals placed on the grid by their initials.]
"I am convinced that nothing we do is more important than hiring and developing people. At the end of the day you bet on people, not on strategies." - Larry Bossidy, Execution: The Discipline of Getting Things Done
MEASUREMENT MATTERS: "If you cannot measure it, you cannot manage it." - Robert Kaplan and David Norton