This Is How You Verify Your Learning Program Impact

Part of the mystery surrounding learning measurement has to do with where in the process to measure learning. Verification of learning program impact can occur at several checkpoints. Let’s review them as we examine the various phases of learning.

The need for learning is identified during the Examining phase. In this phase, the current state of an organization is observed. If the current state reveals a training need then a learning program is created and delivered in the Learning phase. Once the Learning phase is complete, students return to their jobs to enter the Applying phase. Finally, after learning has been applied, evidence of the impact of learning to the organization can be seen in the Impacting phase.

Learning Program Impact Phases And Where Changes Occur During Each

A competency exam that targets performance standards provides a benchmark to use to verify success at both the Learning and Applying phases.

A competency exam can verify decisions about learning and learning impact as it moves through the performance process. Since learning program impact takes different forms (as learning is acquired and then applied in the organization), you can use the competency exam to gauge success or failure at each checkpoint. The exam reports results for each of the specific objectives targeted by the program.

Different checkpoints in the performance process verify learning program impact. For example, an exam given before a learning event (a pre-exam) verifies there is a need for the learning to happen. If the pre-exam shows that a standard of performance already exists (most people know the material and can do the skills), you can conclude that the decision made during the Examining phase to conduct learning to improve performance was faulty. With high pre-exam scores, you’ll know there is no need to run the learning program; it would be money spent to train employees on what they already know. You then know to review your analysis of the performance gap and identify causes, other than a skill gap, for the poor performance.

If the pre-exam indicates that learning is needed, the next checkpoint is a post-exam. Post-exam results identify which training objectives were learned during the program. An individual’s exam shows competency for that student; the class average on the post-exam, however, verifies classroom learning and tests instructional design effectiveness and program facilitation.

Compared to the pre-exam, the post-exam should show the growth gained in the learning program. Fortunately, performance standard exams created from Learning Objects reveal the causes of failure. Objectives that were successful during learning confirm that the learning methods (lecture, small group discussion, exercises, etc.) worked.

Pre- and post-exam results verify the design decisions, methods, and processes (the design and the facilitation) of the learning program. The pre- and post-exams are the same exam and compare each student’s capability before and after a learning program. The comparison verifies that capability was created through the learning program.
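As a rough sketch of this pre/post comparison, the checkpoint logic might look like the following Python, where the objective names, scores, and the 80-point performance standard are all hypothetical values for illustration:

```python
# Hypothetical per-objective class averages (0-100) on the same exam
# given before (pre) and after (post) the learning program.
pre_scores = {"objective_1": 42.0, "objective_2": 85.0, "objective_3": 35.0}
post_scores = {"objective_1": 88.0, "objective_2": 90.0, "objective_3": 54.0}

PASS_THRESHOLD = 80.0  # assumed performance standard, not a universal value

def verify_learning(pre, post, threshold=PASS_THRESHOLD):
    """Classify each objective from the pre/post exam comparison."""
    report = {}
    for obj in pre:
        if pre[obj] >= threshold:
            # High pre-exam score: the Examining-phase decision was faulty.
            report[obj] = "already known"
        elif post[obj] >= threshold:
            # Growth to standard: design and facilitation worked.
            report[obj] = "learned"
        else:
            # No growth to standard: review the learning methods.
            report[obj] = "not learned"
    return report
```

In this sketch, a high pre-exam average flags an objective that did not need training, while an objective that stays below the standard on the post-exam points back at the instructional design or facilitation.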

If a post-exam shows growth at an acceptable level, the student’s capability improved and the learning process succeeded.

After learning, students return to their jobs with the expectation that they will use their new knowledge, skill, or attitude at work. A transfer exam verifies whether learning is applied on the job. The transfer exam is the same exam as the pre- and post-exams because we want to verify the same capability across the performance process.

The transfer exam shows whether the capability gained from the learning program is retained and applied. At this point, we know learning was successful (because of the average post-exam results) and are testing to see if the job environment allowed application. In other words, the transfer exam tests whether the application of new skills is supported on the job. Do the requirements of the job reinforce use of the new skills? Does the job environment provide ample time to incorporate the new learning? Does the manager reinforce, through words and actions, the use of the new skills? Do reward systems on the job reinforce using the new skills?
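Continuing the same kind of sketch, the transfer checkpoint compares post-exam and transfer-exam averages for each objective; the names, scores, and 10-point tolerance below are hypothetical:

```python
# Hypothetical class averages per objective on the post-exam and on the
# same exam given later on the job (the transfer exam).
post_avg = {"objective_1": 88.0, "objective_2": 90.0}
transfer_avg = {"objective_1": 86.0, "objective_2": 61.0}

DROP_TOLERANCE = 10.0  # assumed: a larger drop suggests a job-environment barrier

def flag_transfer_barriers(post, transfer, tolerance=DROP_TOLERANCE):
    """List objectives that were learned in class but not applied on the job."""
    return [obj for obj in post if post[obj] - transfer[obj] > tolerance]
```

An objective flagged here was already proven learned by the post-exam, so the conversation shifts to culture, manager support, and job conditions rather than the training itself.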

What Do These Checkpoints Reveal?

There are many things that impede the use of new capabilities in organizations. If the transfer exam shows poor results after a successful post-exam, that is clear evidence that factors beyond learning are obstructing the use of the new skills. The issues discovered in this phase of the process usually center on culture, manager support, and/or job conditions.

Over the years we have found many things you would not expect during the Applying phase of learning. Most surprising are things such as direct instruction from managers to not use the learning. Uncovering barriers to application in the Applying phase is a fundamental requirement if learning programs are to be useful in achieving performance targets.

Failure to transfer new learning on the job happens because of things that training cannot directly impact (only influence). When application issues are discovered, the conversation shifts to the causes of poor performance. Conversations change from a learning issue to a job environment issue, allowing business professionals to redirect repairs toward appropriate areas.

If exam results show that training programs created new knowledge, skills, and attitudes in the Learning phase and also show the new abilities are applied, we can conclude that the learning program was designed well and the organization supports the learning.

The final checkpoint verifies that changes in behavior made a difference in organizational indicators (metrics). Changes in behavior are only seen if the Applying phase was successful. If the newly gained knowledge, skills, or attitudes are not applied, there will not be a change in the organization.

If a key performance metric was targeted, and learning program objectives were designed to address that metric, you can compare pre- and post-measurements of the metric to show if the behavior change influenced the metric. A key performance metric is one that reflects performance (a behavior) and is usually found in the business unit that is receiving the learning. Since learning changes behavior, learning programs will only influence metrics that reflect performance.
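As a minimal illustration of that pre/post metric comparison, assuming a targeted metric such as a first-contact resolution rate (the numbers are invented):

```python
# Hypothetical key performance metric measured before and after the program.
metric_before = 62.0   # e.g., first-contact resolution rate (%)
metric_after = 71.3

def percent_change(before, after):
    """Relative change in the targeted metric across the program."""
    return (after - before) / before * 100.0
```

A positive change here is only meaningful if the earlier checkpoints already verified that the behavior was learned and applied; otherwise the metric movement cannot be credited to the program.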

How Do You Demystify Performance Issues?

Learning programs that target knowledge, skill, and attitude objectives can succeed and programs with clear Learning Objects provide a way to measure success. Programs designed with these components empower all stakeholders by creating transparency on expected outcomes and clarifying the causes for failure.

The absence of viable learning measurement creates a cloud of mystery around the value of learning. Without measurement of outcomes, many learning professionals confuse providing content with providing learning.

Unfortunately, many training professionals have moved away from these best practices in instructional design and have little means to show the value of their programs. Targeting programs to increase knowledge, skills, and attitudes on the job is the only way to create behavior change in the organization. Identifying a clear unit of measure for learning success that includes the learning focus and methods is the only way to produce credible evidence of learning outcomes.

Please follow eParamus on LinkedIn and feel free to connect with me, Laura Paramoure, PhD to tell me more about your training challenges. This is the final in a three-part series about solving the mystery of learning program results. (Click here to read part one and part two.) If your mission is to deliver learning programs that create change, then please contact us to learn more.


Photo copyright: langstrup / 123RF Stock Photo

 
