4 Ways to Better Measure Corporate Training Results

I think results come out in lots of different ways, and some of them you measure, and some of them you feel.

In the January issue of TD magazine, SAP CEO Bill McDermott makes the point that training results aren’t always numbers-driven. I’ve seen this first-hand.

An India-based colleague has spent the last several years holding monthly training sessions focused on our company values and discussing “soft” topics such as teamwork and collaboration. When I dropped by one of these training sessions last month, one of her trainees commented: “In other organizations people try to pull other people down. Our organization is unique in that everybody tries to help each other and boost each other’s performance.”

Sometimes you can feel the results of a training program. But as I mentioned in Monday’s post, companies around the world spend over $75 billion (with a b!) on training every year and have no idea whether their efforts have produced any results. This isn’t good.

If you’d like to be able to show other people (your boss, for example) that your training efforts don’t just feel good but have made a measurable difference, here are four ways to do that:

1. Make sure you ask what should be different as a result of the training.

This one may sound like a no-brainer, but you’d be surprised how often training is planned and executed without anyone specifically identifying what people should do differently, or better, as a result.

2. Pay some attention to Kirkpatrick’s Four Levels of Evaluation…

About 60 years ago, Donald Kirkpatrick proposed four “levels” of evaluation to help training practitioners begin to quantify their results. First comes reaction: post-training evaluation scores (“smile sheets”). Then learning, most often measured through pre/post testing. Then skill transfer on the job, perhaps via a self-reported survey or a survey of a trainee’s supervisor. And finally impact: did sales increase? Did on-the-job safety accidents decrease? Levels 1 and 2 are the most common, but trainers and organizations can certainly strengthen their Level 3 and 4 efforts.
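If you want a quick way to see which levels you’re actually measuring, a simple record per program can make the gaps obvious. The sketch below is my own illustration in Python, not part of Kirkpatrick’s model; the class name, fields, and example numbers are assumptions.

```python
# A minimal sketch of recording Kirkpatrick-style evaluation data for one
# training program. Field names and example values are illustrative only.
from dataclasses import dataclass
from typing import Optional

@dataclass
class KirkpatrickEvaluation:
    program: str
    level1_reaction: Optional[float] = None   # average "smile sheet" score, e.g. on a 1-5 scale
    level2_learning: Optional[float] = None   # post-test score minus pre-test score
    level3_transfer: Optional[float] = None   # % of trainees seen applying the skill on the job
    level4_impact: Optional[str] = None       # business result, e.g. "safety incidents down 12%"

    def levels_covered(self) -> list:
        """Return which of the four levels actually have data."""
        values = [self.level1_reaction, self.level2_learning,
                  self.level3_transfer, self.level4_impact]
        return [i + 1 for i, v in enumerate(values) if v is not None]

# Most programs stop here: reaction scores and a pre/post test.
evaluation = KirkpatrickEvaluation(
    program="New-hire product training",
    level1_reaction=4.3,
    level2_learning=18.0,
)
print(evaluation.levels_covered())  # -> [1, 2]
```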

3. …and then go beyond Kirkpatrick.

According to a research paper entitled The Science of Training and Development in Organizations, Kirkpatrick’s Four Levels can be a helpful model, but there is evidence to suggest it is not the be-all and end-all that training professionals have pinned their evaluation hopes on. The authors of this paper offer the following example as a specific way to customize the measurement of a training program’s success or failure:

“If, as an example, the training is related to product features of cell phones for call center representatives, the intended outcome and hence the actual measure should look different depending on whether the goal of the training is to have trainees list features by phone or have a ‘mental model’ that allows them to generate recommendations for phones given customers’ statements of what they need in a new phone. It is likely that a generic evaluation (e.g., a multiple-choice test) will not show change due to training, whereas a more precise evaluation measure, tailored to the training content, might.”

4. Continue to boost retention while collecting knowledge and performance data.

Cognitive scientist Art Kohn offers a model he calls 2/2/2, a strategy for boosting learner retention of content after a training program. Two days after the training, send learners a few questions about the content (this gives you data on how much they still remember after leaving the room). Two weeks after the training, send a few short-answer questions (again, this keeps your content fresh in their minds and gives you a data point on how much they’ve retained). Finally, two months after the training, ask a few questions about how your content has been applied on the job (which offers data on the training’s impact).
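If you automate those reminders, the cadence is simple to compute. Here’s a minimal sketch in Python; treating “two months” as 60 days, and the function name itself, are my own assumptions rather than part of Kohn’s model.

```python
# A minimal sketch of the 2/2/2 follow-up cadence for a given training date.
# The 60-day stand-in for "two months" is an assumption, not Kohn's spec.
from datetime import date, timedelta

def boost_schedule(training_date: date) -> dict:
    """Return the 2-day / 2-week / 2-month follow-up dates for one session."""
    return {
        "2 days: recall questions": training_date + timedelta(days=2),
        "2 weeks: short-answer questions": training_date + timedelta(weeks=2),
        "2 months: on-the-job application questions": training_date + timedelta(days=60),
    }

for label, send_date in boost_schedule(date(2024, 3, 4)).items():
    print(f"{send_date:%Y-%m-%d}  {label}")
```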

If companies are spending billions of dollars on training without ever knowing whether those efforts were effective, there’s a problem. Spending a few hours thinking through your evaluation strategy before deploying your next training program can make your efforts worth your time.

 

How Do You Know If Your Training Is Effective?

“How do you know if your training actually has an impact?”  It’s a question I hear often, especially regarding soft skills training.  It all starts with a needs assessment.  When I’ve led teams, the easiest way I’ve found to assess needs, recommend training and then measure results is through a professional development plan (PDP).  If training isn’t tied to a need, if it’s not written down and if an employee isn’t held accountable for improved performance, the impact of the training will not be fully realized.

Here is a generic version of a professional development plan based upon one that I’ve found to be quite effective.

[Image: sample professional development plan (PDP) template]

I like this PDP format because it illustrates how metrics and key performance indicators should be directly tied to soft skills.  Metrics and numbers tell a story, but what is that story?  Are performance numbers down because of team dynamics and dysfunction?  Then perhaps a focus on teambuilding skills would be appropriate.  Are quarterly results suffering because team members haven’t established the correct priorities?  Perhaps time management is an area that needs to be improved.

Identifying baseline performance metrics, identifying appropriate learning opportunities (training on hard or soft skills) that should move those metrics, and then monitoring the metrics to see whether they have improved: that is how I can feel confident that training is effective.
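As a concrete (and entirely hypothetical) illustration of that loop, here is a minimal Python sketch comparing baseline and post-training numbers. The metric names, values, and the “lower is better” flag are assumptions for the example, not data from a real program.

```python
# A minimal sketch of the baseline -> train -> re-measure loop described above.
# All metric names and numbers are made up for illustration.
baseline = {"avg_handle_time_min": 9.5, "first_call_resolution_pct": 62.0}
post_training = {"avg_handle_time_min": 8.1, "first_call_resolution_pct": 70.0}

# For some metrics, lower is better; flag those so "improvement" is read correctly.
lower_is_better = {"avg_handle_time_min"}

for metric, before in baseline.items():
    after = post_training[metric]
    change = (before - after) if metric in lower_is_better else (after - before)
    status = "improved" if change > 0 else "no improvement"
    print(f"{metric}: {before} -> {after} ({status})")
```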

I’ll write it again: if training isn’t formally tied to a need, its full effectiveness will not be felt.

The Train Like A Champion Blog is published Mondays, Wednesdays and Fridays.  If you think someone else might find this interesting, please pass it along.  If you don’t want to miss a single, brilliant post, be sure to click “Follow”!  And now you can find sporadic, 140-character messages from me on Twitter @flipchartguy.