
Businesses spend billions on training every year. According to a 2016 training report, total U.S. training expenditures reached $70.65 billion. Despite all this money spent, learning professionals—and the companies that employ them—remain mystified about how to show measurable training results from these programs.

Millions Spent Without Proving Efficacy

This inability to demonstrate results frustrates both sides. The solution requires our profession to learn how to design programs that connect learning objectives directly to measurable learning impact.

It’s important to note that businesses spend money on what matters to them. Employee training is a business imperative and commands this level of spending because it is a primary means of ensuring high human performance. In fact, when addressing employee performance, training is usually the go-to solution.

Why, then, do we understand so little about how training creates results?

For more than 15 years I have struggled to understand this wide disconnect. We hold the conviction that learning is the answer to performance issues, yet lack evidence of how it is the answer.

It boggles the mind to consider how little is understood about the advantage of learning in an organization. Even learning professionals struggle to connect the value of learning to business success.

Clearly, performance improvement is a business imperative, but where is the evidence that learning actually improves performance? What confirmation do we have that the money spent to design, create, and implement learning programs is worth it?

The Training Results We Are Measuring Don’t Tell Us Much

To be fair, there is some indication that training is beneficial.

We have mountains of data that tell us if students attended training, liked the training, or think it was worth their time. We gain some clue about training’s impact in surveys that collect business stakeholder opinion on the value of learning programs.

eLearning technology also gathers data on how much students interact with learning content or complete the activities in a learning program. Awareness of student activities during learning shows signs of student engagement with programs.

Stakeholder opinion surveys tell us how stakeholders feel about learning programs, and student activity data shows student actions related to a learning event. However, neither provides credible evidence of the significance of learning or the results of training programs. The data we currently collect does not tell us what we need to know to improve our learning or to ensure our learning addresses business goals.

When something is a business imperative, can we afford to accept ambivalent, anecdotal evidence? Will businesses continue to spend billions of dollars for such scant proof?

Let’s accept that training programs can make the difference in human performance and are a catalyst to business success. If so, what part does learning play? What is it that learning programs provide? How can we show evidence that they are working?

Measuring Performance Change Is The Answer

Instead of gathering opinions in surveys or data points on student activities, we need to understand how learning results in performance change. We need to understand the learning methods and processes that produce those results. Only by understanding the decisions made, the methods used, and the processes followed will we understand what works and what does not work when creating learning outcomes.

Organizations will not continue to spend billions on learning programs if their impact cannot be measured. Understanding instructional design solves the mystery of learning measurement and value. The method of design provides learning professionals with a means to show cause and effect in learning decisions and create transparency in outcomes for all stakeholders. The challenge now is to embrace the craft of instructional design, hold ourselves accountable for the decisions we make, and share our results as we target and design programs to improve performance.

At eParamus, we’re dedicated to removing the mystery of training results. I’ll be following this thread of thought in upcoming posts, and I hope to answer your key questions surrounding this topic. If you’d like to learn more, contact us here at eParamus.

Please follow eParamus on LinkedIn and feel free to connect with me, Laura Paramoure, PhD, to tell me more about your training challenges.

Photo copyright: dotshock / 123RF Stock Photo


Enter your information below to subscribe to our blog


Talk to a Training Evaluation Professional

Thank you for your interest in eParamus. We look forward to helping you meet your design and measurement goals.
