EVALUATION BLUNDERS & MISSTEPS TO AVOID

BY JAMES D. KIRKPATRICK AND WENDY KIRKPATRICK

Effectively demonstrate the value of your training by steering clear of these evaluation mistakes.

The four levels model is the most common training evaluation approach, but that doesn't mean it has been implemented correctly over time. On the contrary, misconceptions and misapplication have reduced the effectiveness of this simple model. Here is a summary of the most common training evaluation mistakes. Are you making them?

The Kirkpatrick Model
Level 4: Results. The degree to which targeted program outcomes occur and contribute to the organization's highest-level result.
Level 3: Behavior. The degree to which participants apply what they learned during training when they are back on the job.
Level 2: Learning. The degree to which participants acquire the intended knowledge, skills, attitude, confidence, and commitment based on their participation in the training.
Level 1: Reaction. The degree to which participants find the training favorable, engaging, and relevant to their jobs.

Mistake #1: Addressing evaluation requirements after program launch

Many training professionals mistakenly design, develop, and deliver a training program, and only then start to think about how they will evaluate its effectiveness. Using this approach nearly guarantees that there will be little or no value to report.

We received a phone call from a consultant a few years back. He was quite proud to tell us about the multimillion-dollar leadership development program he had created for a large corporation. He worked with the company to define its needs before developing the three-year program, which was nearing the end of its first year. He was contacting us to find out if we wished to join the project as evaluation consultants, because they had data from the first year of program participation.

We asked just a few questions to verify our suspicion: they had not pinpointed the specific company metrics they hoped this large investment would positively affect. Of even more concern, they had not identified exactly what the managers involved in the program should be doing to influence those metrics, nor had they prepared senior managers to coach and monitor performance. Ultimately, they had created a nice-to-have program containing a laundry list of development activities targeted at nothing in particular.

We had no choice but to tell this well-meaning consultant that there was little we could do to help, other than recommend that, as quickly as possible, they create an effective program plan that includes metrics and performance standards, and see whether there is anything they can salvage from the current misguided program.

To avoid this pitfall, begin programs with a focus on the Level 4 results you need to accomplish. This automatically focuses efforts on what is most important. Conversely, if you follow the common, old-school approach to planning and implementing your training program, first thinking about how you will evaluate Level 1 (reaction), then Level 2 (learning), then Level 3 (behavior), it's easy to see why few people ever get to Level 4. Set yourself apart from and ahead of the crowd by using the four levels upside down: start every project by first considering the key company metrics you plan to influence, and articulate how this will contribute to the Level 4 result of your organization. Then think about what really needs to occur on the job to produce good results (Level 3). Consider next what training or other support is required for workers to perform well on the job (Level 2). And finally, consider what type of training will be conducive to imparting the required skills successfully (Level 1).
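To make that upside-down sequence concrete, here is a minimal Python sketch of a program plan recorded in Level 4-to-Level 1 order. Every result, metric, behavior, and format in it is a hypothetical example we invented for illustration, not part of the Kirkpatrick material; substitute your own.

# A minimal sketch of a program plan written "upside down," from
# Level 4 results back to Level 1 format. All values are hypothetical.
from dataclasses import dataclass

@dataclass
class ProgramPlan:
    level4_result: str           # organizational result the program must influence
    level4_metrics: list[str]    # metrics someone already measures and monitors
    level3_behaviors: list[str]  # critical on-the-job behaviors that drive the result
    level2_learning: list[str]   # knowledge and skills those behaviors require
    level1_format: str           # training format conducive to imparting those skills

plan = ProgramPlan(
    level4_result="reduce customer churn by retaining at-risk accounts",
    level4_metrics=["quarterly churn rate", "renewal revenue"],
    level3_behaviors=["reps schedule a retention call within 48 hours of a complaint"],
    level2_learning=["objection handling", "retention-offer guidelines"],
    level1_format="scenario-based workshop followed by manager coaching",
)

# Plan (and design evaluation) in this order: Level 4 first, Level 1 last.
for name in ("level4_result", "level4_metrics", "level3_behaviors",
             "level2_learning", "level1_format"):
    print(f"{name}: {getattr(plan, name)}")

The form doesn't matter; a whiteboard works just as well. What matters is that the Level 4 fields are filled in before anything else exists.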
Mistake #2: Spending the majority of evaluation resources on Levels 1 and 2

The 2016 ATD report Evaluating Learning: Getting to Measurements That Matter polled 199 learning professionals, who revealed that they invest nearly 70 percent of their training evaluation resources in Levels 1 and 2. Sadly, this statistic has not improved since ATD's previous report in 2009. This old-school approach of spending heavily on effective training leaves few resources for the more important job of ensuring training effectiveness at Levels 3 and 4.

Generally speaking, Level 3 is the most important level to not only evaluate, but also invest in, for any important program. Without on-the-job application, training has no hope of contributing to organizational results and, therefore, is of little value to the organization. If your program is important enough to have a Level 3 plan, then it is also important enough to have evaluation of Level 4 results.

Level 1 is the least important level. Of course you want to know that the training was well received, but consider how much of a resource investment it is worth to gather this data. The investment should be quite small. Focus on formative methods that occur during the program itself, and formally evaluate only the few key items you plan to analyze and use.

The evaluation of Level 2 is important to ensure that participants leave a training program prepared with the required knowledge and skill. However, proper Level 2 evaluation can be built right into the design of a program and should, therefore, not become an evaluation resource priority.

Now that you've saved resources for Levels 3 and 4, what level of effort and resource investment can you expect to devote to each? Level 4 is actually the simplest and least resource-intensive to evaluate. If something is a true Level 4 result, it is important enough that someone in the organization is already measuring and monitoring it, and it is simply a matter of obtaining the data. What is more difficult is finding the connection between training, on-the-job performance, and organizational results. In many evaluation plans, the missing link is Level 3. A strong Level 3 plan, recommended for important initiatives, will be resource intensive; however, evaluating Level 3 is not as expensive as some would think. When tools and systems are constructed at the same time as the program itself, and ultimately viewed as part of the program, the resources are simply reallocated from instructional design to performance support.

Mistake #3: Relying solely on standardized surveys

Some believe in the existence of a miracle survey that will give you all of the training evaluation data you need. Don't buy it. For mission-critical programs, it is important to employ multiple evaluation methods and tools to create a credible chain of evidence showing that training improved job performance and contributed measurably to organizational results. For less important programs, you will want to be equally careful about selecting the few evaluation items you require.

Surveys, particularly those administered and tabulated electronically, are a wonderfully efficient means of gathering data. However, response rates tend to be low, and there is a limit to the types of information that can be gathered. Because these surveys are so easy to disseminate, they are often launched after every program, no matter how large or small. The questions are not customized to the program or to the need for data, and people quickly pick up on the garbage-in, garbage-out cycle. This creates survey fatigue and makes it less likely that you will gather meaningful data for any program.

For mission-critical programs in particular, gather both quantitative (numeric) and qualitative (descriptive) data. Open-ended survey questions can gather qualitative data to some degree, but adding another evaluation method provides better data. For example, a post-program survey could be administered and the results analyzed. If a particular trend is identified, a sampling of program participants could be interviewed and asked open-ended questions on that specific topic. Depending on the rigor required by your stakeholders, you may be able to obtain good interview data by simply calling or briefly visiting training participants and asking them a question. Don't be too intimidated to integrate this human element into your program evaluation data.

An often-overlooked source of evaluation data is formative data. Build touch points into your training programs for facilitators to solicit feedback, and ask your facilitators for their own feedback via a survey or interview after the program.
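The survey-then-interview sequence described above lends itself to very simple tooling. Here is a minimal Python sketch that tabulates the numeric ratings from a post-program survey export and flags low-scoring questions as candidates for follow-up interviews; the file name, column layout, and 1-5 rating scale are assumptions made for illustration.

# A minimal sketch: average each survey question's numeric ratings and
# flag questions whose average suggests a trend worth follow-up interviews.
# The CSV name, its layout, and the 1-5 scale are illustrative assumptions.
import csv
from collections import defaultdict

FOLLOW_UP_THRESHOLD = 3.5  # hypothetical cutoff on a 1-5 rating scale

def average_ratings(path):
    ratings = defaultdict(list)
    with open(path, newline="") as f:
        for response in csv.DictReader(f):
            for question, answer in response.items():
                if answer.strip().isdigit():  # skip open-ended (qualitative) answers
                    ratings[question].append(int(answer))
    return {q: sum(r) / len(r) for q, r in ratings.items()}

for question, avg in sorted(average_ratings("post_program_survey.csv").items(),
                            key=lambda item: item[1]):
    note = "  -> sample participants for interviews" if avg < FOLLOW_UP_THRESHOLD else ""
    print(f"{avg:.1f}  {question}{note}")

The point is not the tooling but the habit: every survey gets tabulated, and a low score triggers a defined next step.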

Mistake #4: Not using collected data

Have you ever inherited an office with a precariously tall stack of papers in one corner, perhaps next to a stuffed-full file cabinet? Wendy did, and upon closer inspection, she saw that it was years and years of old training evaluation forms.

Besides revealing a poor document retention system, the bigger problem was that the evaluations had never been properly analyzed, and the findings were never integrated into program enhancements and performance support efforts. When Wendy asked around the department, multiple individuals commented that there was never time or resources to tabulate the evaluations, so a quick flip through them by the facilitator or the boss was all that ever really happened. Someday someone would enter the data into some type of system so that it could be quantified and analyzed, but someday had not arrived for years.

When you survey a group of individuals, you are making an implicit agreement with them that you will act on their aggregated feedback. Continuing to disseminate surveys when participants can plainly see that you are doing nothing with the data quickly creates the expectation that nothing ever will happen with their feedback, and they will stop giving it.

At that same organization, Wendy was asked to create an evaluation form to use after a week-long event in which new products and updates were launched to the sales and customer service teams. She included questions about the program quality and content, the meeting facilities, and how participants felt about selling the new products. Wendy was present when the vice president of marketing, who organized the event, first reviewed the evaluations. His commentary went something like this: "Joe said that the new product will not sell in his market because the color scheme doesn't work with Midwestern homes. Maybe that's why his sales are so low. Sue complained about the food. We can't make everyone happy. A few people said it was too cold; there's nothing we can do about the temperature of a hotel ballroom."

You get the point. He found a reason to dismiss every comment. Future meetings didn't change, nor did the questions on the evaluation form, and legitimate product concerns, such as a product's unsuitability for a given market, went unaddressed. The result? Each event received less and less feedback, or the infamous straight line down the side of the page to select all fives. The vice president of marketing was satisfied with this; he assumed it meant that everyone was happy, or at least that they were off his back.

Start with the end

You should be genuinely interested in learning from your program participants, and in continually improving your programs to help them perform successfully in their jobs. Make sure you can and do review any evaluation data you receive, and make a point of showing how it is being used.

If you start your programs with the end in mind and build meaningful evaluation into the plan, you have taken important first steps toward creating and demonstrating the value of training to your organization or clients.

James D. Kirkpatrick is a co-owner of Kirkpatrick Partners. He is a thought leader and change driver in training evaluation and the creator of the New World Kirkpatrick Model. His latest book, co-authored with Wendy Kirkpatrick, is Kirkpatrick's Four Levels of Training Evaluation (ATD Press); jim.kirkpatrick@kirkpatrickpartners.com.

Wendy Kirkpatrick is a global driving force in the use and implementation of the Kirkpatrick Model, leading companies to measurable success through training and evaluation; wendy.kirkpatrick@kirkpatrickpartners.com.