We must stop measuring learning

We must stop measuring learning for one simple reason. The only two people who care about your learning metrics are you and your mom. And honestly, she doesn’t care. She’s just pretending to be interested.

An executive team meeting at ABC Inc.

Imagine being a fly on the wall at the end-of-year wrap-up meeting of the executive committee at ABC Inc. The CEO says, “This is the time. This is the place. We need to know where we stand so we can plan our strategy for next year. I need you all to report out for your operational function.”

The VP Marketing goes first. “We increased our budget and we have purchased space in 50% more trade journals this year! In addition, we upped our Google Ad budget by 200% and have placed ads throughout the Internet.”

The VP Sales is next. “I’m happy to report to you all that an audit of our sales calls shows that 93% of sales calls used the SPIN selling model. Not only that, but our inside sales representatives handled, on average, 12% more calls than they did last year!”

The CFO then reported that accountants’ sick leave was down 15% resulting in a departmental cost savings of 5%, and that the new accounting system was in place and being used very effectively.

It would then come as no surprise that the CLO reported that 100% of employees took all required compliance courses; that there was an average 90% satisfaction with training, and that of the 600 courses in the catalog, 75% had been utilized by 15% or more staff, who viewed them for an average of 15 minutes each.

It should also come as no surprise that either all these executives were fired, or the company went out of business, because no one had their eye on the business results.

Of course, activity-based reporting by a CFO or by VPs of sales and marketing such as that outlined above is ludicrous and doesn’t really happen in a well-functioning business. The CEO wants to know about sales figures, not how many people used the SPIN selling model.

Unfortunately, while activity-based reporting by the CLO is equally ludicrous, it’s all too often the type of reporting that the metrics generated by the learning organization support. The fundamental failure to talk the language of business is why – as I’ve argued above – training is all too often the first to go.

We must do better. We can do better. We simply have to put our minds to it.

Kirkpatrick revisited

It’s somewhat fashionable to pooh-pooh Donald Kirkpatrick’s four levels of learning evaluation. Okay, okay, there are newer, shinier models of learning assessment. But Kirkpatrick’s is an extremely workable model.

For those of you who are not familiar with it, Kirkpatrick suggests there is a hierarchy of evaluation methods.

  • Level 1 evaluations ask for learners’ reactions to a training course. Did they like it? This is the familiar “smile sheet” we often get after a training course.
  • Level 2 assesses whether, in fact, the person learned something. Did they retain the knowledge? This is often assessed using a post test to see how much of the content was retained.
  • Level 3 looks at whether they can do their job better as a result of training. Did they transfer the learning from the classroom into the work site?
  • Level 4 asks a different question: even if they can do their job better, does that lead to a positive result for the business?

To this, Jack Phillips added a widely accepted Level 5, measuring return on investment.

We won the battle but lost the war

Many learning professionals argue that Level 4 and Level 5 are the ultimate goals for learning. I disagree. My point of view about this came from an experience I had a number of years ago.

We were training insurance underwriters, who were, in effect, on the income-producing side of the business. We had had a real success – by all measures, the underwriters were able to underwrite more business with less effort than they had in the past. When we did a debrief with the VP of underwriter training, however, he was not happy. Apparently profitability was down. The folks who set underwriting policy had written policies and guidelines that were too “soft,” and loans were being approved that had a high default rate.

We trained the underwriters to use the guidelines they were given extremely effectively, and the more effectively they used the faulty guidelines, the worse it was for the business.

No one blamed us – thank God – but at the end of the day, profits were down, headcount had to be cut, and management was unhappy.

Level 3: The gold standard

What I took away from this, and what I continue to believe, is that for learning professionals, Level 3 is the gold standard. We “contract” with managers to help people do their jobs better, according to the policies, procedures, and guidelines they are given. We do not contract to produce business results. It’s a fine distinction, but an important one: there are simply too many other variables related to the market, the product, customer service, and so on.

That’s why I believe that our job is to demonstrate a Level 3 result – can they apply their skills and knowledge to do their jobs better? This is where the sweet spot for learning professionals is. This is what – if we are thinking about results – we promise our customers. This manager will be able to give more effective presentations. That project manager will be able to manage risks and issues effectively. This underwriter will be able to use the guidelines she’s given and underwrite business effectively.

We must measure productivity

At the meeting of ABC Inc.’s executive board, the reports should look like this:

The VP Marketing says that market research shows that brand awareness is up 35% in the critical 18-25 male market.

The VP Sales says that sales are up 10%.

The CFO lets folks know that the company made a profit.

The CLO lets everyone know that the VP Sales indicated that 100% of sales trainees were able to handle simple to moderately complex accounts without supervision, and the executive team has endorsed the readiness of five high potential candidates to move into AVP slots should the need arise.

The CLO is now talking the language of business, and reporting results that everyone – not just his mother – is interested in.

Abstracted from a forthcoming book on learning effectiveness  (c) Bill Bruck, Ph.D., 2015
