In learning and development, as in every area of business, it isn’t enough simply to produce results. You also need to measure those results to prove you’re beneficial to your organization and to earn recognition (and support) for your efforts. There’s no doubt you need numbers, but are the numbers you’re focused on actually the right ones?
So asks Gary Hegenbart of eLearning Development News. A long-time L&D pro, Hegenbart warns against equating data that tracks how much training has been delivered with data that measures how much your training program has accomplished.
Completed training does not always equal learning, Hegenbart points out, citing an example from current events: “Look at how effective the Secret Service ethics training was prior to the debacle in Colombia. Everyone completed training, but it was clearly not effective,” he notes.
But What About Assessment?
You might object that with well-designed training, “completed training” and “learning” are in fact one and the same. In an ideal world, you’d be right, Hegenbart concedes. But he notes that few of us work in an ideal world:
“Yes, we can design training courses to measure learning through activities and assessments (not just quizzes, real assessment). In those cases, the tracking metrics do provide value. If the course has rigorous assessment, then your completion stat is an indicator of learning. But how many courses have you developed that truly assess learning? How many have you taken online or attended? Instructional designers are often between a rock and a hard place – there is an expectation from management (or customers) that people complete training, so we are under pressure to ensure they do.”
So what’s the takeaway here? Hegenbart offers a few related points. First, as a training pro, it’s your job to accurately convey to the rest of the organization what your training numbers really mean. In other words, make sure the bosses know that the stats on completed training are just that, and not a solemn pledge that everyone has actually absorbed the material. “Instructional designers and course developers need to start at the beginning and ensure that the people asking for reports on training success understand what the data means,” Hegenbart writes.
But being honest about your numbers isn’t the only step you should take. Resist pressure from above to shepherd everyone briskly through a training program whether or not they’ve actually learned anything. Instead, provide gauges of learning rigorous enough that some people can actually fail, or at least ask for more training, until they’ve mastered the material. “We can’t be afraid to let people fail the assessment,” he concludes.
All of which adds up to a simple rule of thumb: the fact that everyone completed the training is usually a poor measure of training success, so don’t mistake tracking data for learning.
Have you worked for organizations that have been guilty of producing less than completely honest numbers to track training success?
London-based Jessica Stillman blogs about generational issues and trends in the workforce for Inc.com, GigaOM and Brazen Careerist.