Become a Learning and Development Measurement Detective
Discovering the impact of training and learning on employee performance and business goals can feel like a mystery, but it doesn’t have to be. Here’s how to solve impact mysteries like an L&D detective.
Kevin M. Yates, known in the global learning and development community as the L&D detective, believes that “Impact is when learning activates performance and a business goal.”
Why is that important?
Impact investigation reveals the extent to which training and learning fulfill the highest purpose. We fulfill our highest purpose when we measurably influence performance, behavior, actions, and business goals. Measuring learning impact also informs decisions for how to shape content and delivery so employees not only grow professionally but also contribute to the success of the entire company.
Should you measure the impact of all learning?
Kevin does not believe all learning programs should be measured for impact. In fact, he says that on average only 1-3% of learning and development (L&D) initiatives and programs created in most global companies are designed with purpose and intention for impacting employee performance and a business goal; the rest are not “eligible for impact investigation”.
“The reality is that there are some training and learning programs designed to impact employee performance and business goals, and some that are not. We want to measure impact for training and learning solutions designed with specific targets for performance and business goals…” So, how do you know which programs were designed with intention and purpose for impact? “I follow six impact standards that guide the decision for when a training or learning solution has potential for impact versus those that do not,” says Kevin.
The six Impact Standards© Kevin relies on for his investigations are:
- Priority – does the program have visibility or sponsorship with senior leaders?
- Position – is the program aligned to a business goal?
- Purpose – are there specific targets for performance outcomes?
- Pinpoints – do measures, KPIs, and data for the program exist?
- Power – does the program support a comprehensive learning strategy?
- Payoff – is there a significant price tag or investment of time for development?
Where does a learning measurement investigation start?
If you try to measure impact after the training or learning solution is designed, deployed, and consumed, discovering results will be difficult or maybe even impossible, says Kevin. Start after the program has launched, and you won’t have what you need to successfully answer the question, “What is the impact of learning?”
“A common struggle I see in most learning organizations is that they haven’t planned for measurement in the beginning and that makes it difficult to measure in the end. If you are proactive and plan for impact, however, it will be much easier to measure,” Kevin notes.
He suggests you set up the program for measurable success, and to do that he recommends using questions from his Impact Opportunity Interview©:
- Core – What is the business goal?
- Condition – What is the opportunity or problem to solve?
- Contribution – Who is supporting the goal?
- Community – Whose performance is needed?
- Capability – What are the performance requirements?
- Comparison – What is the performance difference?
- Causation – What are the performance drivers?
- Calculation – What are the measures and KPIs?
- Caution – What are the risks?
Kevin says when you get the answers to these questions upfront, before any conversations about “training”, you know what to design, how to design it, and how to deliver a program that will measurably impact performance and business goals.
Measuring the impact of training and learning may not be easy, but it is absolutely possible, says Kevin. Getting the answers to these questions upfront is what makes it possible. Using the answers to inform design decisions is what makes learning impactful.
What do you absolutely need to measure learning impact?
You need facts, evidence, and data to show the impact of your learning programs—and there are three categories that are critical for showing results:
- Learning performance – take a look at how technology and the learning experience drive results. Can you attribute the learning experience to a change in behavior and actions? (This is where Inkling helps your investigation a lot.)
- People performance – examine how people are behaving and acting in their roles. Are they performing in a way that can be traced back to the learning experience? (This is how learners consume content and make learning actionable.)
- Business performance – review the key performance indicators, including sales, customer satisfaction rates, quality, errors, time, etc. What is the data that shows actual business performance? (This is where the rubber meets the road.)
Then comes the detective work: connecting the dots between these three categories. What do the facts, evidence, and data show? Of course, there may be many different data sources, such as Inkling, an LRS, an LMS, and other systems, but you’re looking for trends and a story about how your learning program points to business impact.
One way Kevin recommends looking at measurement is to examine the learning habits of high performers. What content are they consuming, how are they consuming it, and what content do they use most? Then compare these facts, evidence, and data with those of other employees to see what’s different.
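To make that comparison concrete, here is a minimal Python sketch of the idea: count which content items high performers consume versus everyone else. All field names and sample records below are invented for illustration; in practice the data would come from your LMS, LRS, or other reporting exports.

```python
from collections import Counter

# Hypothetical consumption records, e.g. exported from an LMS or LRS.
# Employee names, content titles, and the high_performer flag are
# illustrative assumptions, not fields from any real system.
records = [
    {"employee": "ana",  "content": "Objection Handling",   "high_performer": True},
    {"employee": "ana",  "content": "Product Basics",       "high_performer": True},
    {"employee": "ben",  "content": "Objection Handling",   "high_performer": True},
    {"employee": "cara", "content": "Product Basics",       "high_performer": False},
    {"employee": "dev",  "content": "Compliance Refresher", "high_performer": False},
]

def top_content(records, high_performer):
    """Rank the content items a group consumes most often."""
    counts = Counter(
        r["content"] for r in records if r["high_performer"] == high_performer
    )
    return counts.most_common()

# Compare what high performers consume against everyone else.
high_performers = top_content(records, True)
other_employees = top_content(records, False)
```

A real investigation would, of course, involve far more data points and dimensions (frequency, recency, completion, time spent), but the pattern is the same: segment the audience, summarize consumption per segment, and look for the differences.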