Tuesday, May 13, 2014

Evaluating the business impact of learning.

As my regular reader(?) will know I was at the ASTD 2014 conference in DC this week.
On the plane back west I thought I'd put down some thoughts on one of the sessions I attended while I was there. With the news of the passing of Don Kirkpatrick, this is particularly relevant.

The session was led by Ken Phillips.  Ken is an old colleague from my CCASTD days. He is a past president of the chapter and a regular presenter both there and at the International Conference. 

I remember seeing Ken present at the Chicago conference back in 2010 on creating Level 1 evaluations that went beyond reaction. After seeing that session I had to completely rewrite my level 1 evaluations, but since what I got out of them was so much more useful, I forgave him for that.

A brief side note for anyone not familiar with training evaluation levels. 
The great Don Kirkpatrick wrote a series of articles for ASTD's magazine some 40 years ago in which he defined 4 levels of training evaluation. I first heard about them at a conference several years back from Bob Wilson, who supplied the subtitle explanations I've placed in parentheses below.

Level 1. Reaction (Did they like it?)
Level 2. Learning (Did they learn it?)
Level 3. Behavior (Are they using it?)
Level 4. Results (Is it making a difference?)

If you have ever had any workplace training you have almost certainly encountered some sort of level 1 evaluation, often discounted by learning professionals as "smile sheets".

In addition to these 4 levels, Jack Phillips has added a 5th, ROI. (Personally I see this as just a part of level 4.)

OK, back to the session.
The session title was: Business Results Made Visible: Design Proof Positive Level 4 Evaluations.

Learning pros will tell you that level 4 is the hardest to get to, and they will have all sorts of reasons why they just can't provide it. So I was happy to see some stats Ken provided in his session on how much level 4 evaluation is actually being done.

According to some notes I tweeted out, the survey said that 37% of organizations measure at least some of their training programs at level 4. Quite frankly I found that number unbelievably high; my experience has been that rarely, if ever, has a program I've been involved in had any level 4 evaluation done.

But then I started thinking. The majority of the work I've done over the last few years has been on behalf of software vendors who provide training on the products they sell. In such cases we, the vendors, are (or at least should be) responsible for levels 1 and 2, and in some cases level 3 if we have the ability to observe or measure how customers use the product we just trained them on. Level 4 in those cases is in the hands of the organizations whose employees we just trained.

So it looks like many of those companies are taking their responsibility seriously, as are my colleagues who run internal programs. 

So the next question and statistic is about how much training is being evaluated at level 4. Well, the 37% that do any level 4 evaluation do so on about 15% of the programs they run. That seems low, but on sober reflection it looks like some careful choices are being made.

Ken pointed out some best practice criteria for choosing which programs to evaluate, and the key point is that the programs of strategic importance to the organization are the ones where investing in level 4 evaluation makes the most sense. The other important factor is that the metric the training is targeted against should be one that the people doing the evaluation have a realistic chance of accessing. Many companies keep a tight rein on some of their most important metrics and may not be keen to share them with the (often very) junior levels of management that might be leading the L&D function. So pick data that you can get the needed level of access to.
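As a rough back-of-the-envelope illustration (my arithmetic, not from Ken's session), combining the two survey figures gives a sense of how small the overall slice of level-4-evaluated programs really is, assuming the two percentages can simply be multiplied:

```python
# Back-of-the-envelope: what share of ALL training programs gets a
# level 4 evaluation, combining the two survey figures?
# (Assumes the two percentages combine independently -- a simplification.)

orgs_doing_level4 = 0.37   # orgs that evaluate at least some programs at level 4
programs_per_org = 0.15    # share of their programs evaluated at level 4

overall_share = orgs_doing_level4 * programs_per_org
print(f"Overall share of programs: {overall_share:.1%}")
```

So across all organizations, only somewhere around 5 or 6 percent of programs get a level 4 evaluation, which makes the careful targeting of strategically important programs all the more sensible.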

But the good news in the world of big data is that much of the data about how people work is available, if not already being collected. And another interesting statistic: 75% of the organizations collecting data points as part of a level 4 evaluation see the data as having high or very high value.

And that is good news all around, not just for the learning pros out there but also for CEOs. In another recent survey, CEOs were asked if they wanted to see the numbers on the business impact of training programs. 96% of them said they did, but only 8% said they got to see this information. (I think the question also had something to do with ranking how important that information was, and that the 96% may have been how many ranked it at #1, but I'm not entirely sure - long week!)

One question I will need to follow up on with Ken is why the large discrepancy. If 37% of organizations do level 4 evaluations but only 8% of CEOs see the results, then something is missing.

So, encouraging news on the evaluation of business impact, but the learning profession still has a way to go to satisfy the wishes of our CEOs.

Happy learning. 
And evaluating.

And Rest in Peace Don.
