
Re-evaluating Evaluation at Learning Technologies

LEO was fortunate to attend two excellent sessions at this year’s Learning Technologies conference. In this blog post I share the highlights of Jim Kirkpatrick’s ‘Re-evaluating Evaluation: showing the business value of training’.

For anyone new to L&D (or living under a rock for 60 years!), Jim Kirkpatrick is the son of Donald L Kirkpatrick who developed the Kirkpatrick model for his PhD dissertation in 1954.

The model consists of four levels to evaluate the effectiveness of training and, although it’s widely known and used, many L&D professionals believe it to be flawed. I get their point. While levels one (learner satisfaction) and two (increased learner knowledge) are the most straightforward areas to measure, they are not the most important. What do they really tell us about a learning intervention’s impact on an organisation? They reveal nothing about improvements to quality, customer service or cost efficiencies. So, it was great to hear Jim acknowledge this flaw early on and explain how his dad’s model had been misconstrued as it rocketed in popularity. Everyone assumed that level one was the starting point, but 89-year-old Donald is still reminding Jim to ‘start with level four’!

Jim introduced the New World Kirkpatrick Model, which addresses this issue and shifts the focus to levels three and four. This adjusted approach to evaluation mirrors what we practise (and preach) at Epic. Let’s take a look at three practical suggestions for applying this approach:

1. Starting with your desired results is critical to success.

Clients and colleagues may say that I bang on about learning outcomes – exactly what do you expect learners to do differently as a result of your training? But I think Jim would back me up. His tenet ‘Be specific – like a GPS location’ is a punchier way of saying the same thing. If you don’t know what success looks like, how will you know when you’ve reached it? If your goal is vague you may well plot a route to the wrong place.

2. Use a mix of quantitative and qualitative data to measure on-the-job performance (level three) and bottom line results (level four).

Take your measurements where they count: an increase in repeat business or reductions in customer complaints, perhaps. You’re looking for honest data, so be aware of the limitations and potential bias hidden in the numbers. Speaking to people to get the ‘buzz’ can help you better interpret statistics. Ideally, draw on a wide range of measurements to get the true picture.

3. Support on-the-job application – orchestrate what happens beyond the course.

The application of learning doesn’t happen in a vacuum. Policies, processes, technical environments, management directives, budgets, timescales and many other factors all influence how well we perform and how far that performance translates into results. Collaborate with other areas of your business to identify and reduce barriers to performance. Make learning a crucial part of your organisation’s success. From within your L&D team you can offer performance support via job aids, coaching, ‘how to’ videos and the like.

Personal highlight

My personal highlight was the concept of ‘leading indicators’. Don’t wait until the end to measure final results – use short term observations to let you know if you’re heading in the right direction. If you’re not seeing improvements, tweak your training offering or tackle obstacles to performance.

This sounds obvious – but it’s not something I’d directly considered before and I’d wager that it rarely happens in practice. So, don’t wait for a year or more to see results. As Jim says, “Celebrate what is working and fix what isn’t!”

The idea of celebrating success is easy to overlook in favour of addressing problem areas. Yet if we can ‘find the bright lights’ it helps us to clearly see what’s working and share best practice. Psychologists such as BJ Fogg note how the principle of social cohesion – in blunt terms, we want to be like each other – can act as a motivational force for change. In this light, a positive example can be more powerful than a warning about a potential issue.

Greencore case study

The session ended with a success story: a case study presented by Paul Aggett, the Learning and Development Manager at Greencore – the people who make sandwiches for M&S. He used the Kirkpatrick model to help him identify what worked and what didn’t, and cut any training course which didn’t make an impact. A leadership course which survived the restructure was placed at the centre of a programme of events for the learners. This began with an interview with their line manager and ended (12 weeks later) with a full presentation to the leadership team on how they had implemented their new knowledge and what the results were. This scheduled presentation acted as a motivator for performance improvement, provided a platform for recognition and offered a forum for sharing best practice.

If you would like advice for your organisation’s learning strategy, please contact LEO to speak to an experienced consultant.
