Only 3 Data Points Really Matter For Learning Professionals, Part 2

In my last post, I noted that we measure many data points in our learning programs, yet most fail to answer the questions our stakeholders want answered. This week, we’ll dive deeper into that and figure out how to change it.

Your senior leaders want 3 questions answered. Do your learning programs answer them?

  1. Do learning programs target organizational needs?
  2. Are the learning programs effective at building competencies and developing employees?
  3. Are the skills gained in learning programs being applied and improving job performance?
To answer them, you must know more than the subject of the learning; you must know which skills (competencies) your training addresses. You also need a standard method of practice that leads to consistent results.

Why Standardized Instructional Design Matters

I’ve spoken before about standardized instructional design and how it provides many benefits. The prime benefit is pinpointing skill targets in a learning program. With these identified, it becomes possible to measure their accomplishment. Good design enables good measurement.

Skills targeted, gained, and applied. Imagine being able to report all these data points. Then imagine linking the organizational change to those improved skills.

Standardized design includes measurable objectives and assessments. When organizations have these in place, measurement becomes easy. This creates transparency and makes answering these three questions commonplace.

Consider the following customer data:

Performance Chart

This one graph gives senior leadership the answers to their most critical questions. The data shows results over time for:

  • Scrap – learning content delivered that is already known
  • Competency – level of employee competency
  • Application – organizational acceptance/application of the newly gained skills

The targets for the learning professional are to:

  • Reduce scrap – save time and money by removing content learners already know
  • Increase competency – add to employee capability
  • Increase application – align and collaborate with the organization to ensure on-the-job skill adoption
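To make these three data points concrete, here is a minimal sketch of how they might be computed from assessment scores. The definitions used below (scrap as the share of learners who already meet a mastery threshold on the pre-assessment, competency as the average post-assessment score, application as the average on-the-job assessment score) are illustrative assumptions, not eParamus’s actual formulas.

```python
# Illustrative sketch: deriving the three data points from assessment scores.
# The definitions below are assumptions for illustration only:
#   scrap       - share of learners who already met mastery before training
#   competency  - average post-training assessment score
#   application - average on-the-job assessment score

def program_data_points(pre, post, on_job, mastery=0.8):
    """Each argument is a list of per-learner scores between 0.0 and 1.0."""
    scrap = sum(score >= mastery for score in pre) / len(pre)
    competency = sum(post) / len(post)
    application = sum(on_job) / len(on_job)
    return {"scrap": scrap, "competency": competency, "application": application}

# Example cohort: half already knew the material (high scrap),
# post-test scores were strong, but on-the-job application lagged.
points = program_data_points(
    pre=[0.9, 0.85, 0.4, 0.3],
    post=[0.95, 0.9, 0.8, 0.75],
    on_job=[0.7, 0.6, 0.5, 0.5],
)
print(points)  # scrap 0.5, competency 0.85, application 0.575
```

Measured period over period, these same three numbers produce the trend lines in a chart like the one above.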

Program Assessments Provide the Essential Data Points

The data in this graph comes from program assessments linked to performance objectives. These serve as benchmarks across the learning process. As you measure certain points in the process, you see the state of the organization. You can pinpoint the changes created through learning.

The data does not come from survey response opinions. It comes from actual assessments of knowledge and skill gains.

Using assessments that measure knowledge and skills provides feedback to all stakeholders. These measures focus the organization on the goals of the learning, and they enable clear, transparent results without complicated statistical analysis.

This graph shows an onboarding course for one of our customers. It reflects what we often see when we first start with clients. The initial data (Jan 2014) shows that scrap learning comprised over 50% of program content, meaning over half of their effort was wasted!

Competency gains from their program started at almost 80%. That high percentage shows that the learning design and methods they used were effective, and later data showed they remained effective over the years.

The most interesting part of the initial data is the application line. Although learners made some gains immediately, most of those gains were lost during application. This points to misalignment and poor collaboration between the learning team and the business unit.

Over time, using data gleaned from feedback, they reduced scrap content to roughly 30%, confirming improved targeting of the skills the organization needed. By the fall of 2015, on-the-job skill application reached and ultimately surpassed what was learned. This shows significantly improved collaboration and alignment between learning and the business units.

Efficient and Effective: The Sweet Spot For Learning Programs

This data collection reveals both the efficiency and the effectiveness of the learning department, and it answers the questions senior leaders are asking. Data showing learning program success can then be linked to changes in organizational metrics. In this way, you tie organizational change directly to the skills gained in your programs.

Before starting, identify metrics like productivity, quality, or revenue that are influenced by skill improvement. This focuses program designers on the end goal, and tracking these data points helps explain the metric changes.

Using data to run a business is the new normal. To lead data-driven decisions, learning pros must focus on data that shows the effectiveness of the learning function. Our job is to find performance gaps, fill them, and ensure their application. Senior leaders do not care about all the activities we have participated in or generated. They do not care who likes the learning department. They care if we use our budgets wisely, develop employee competency, and improve business performance. Let’s give them what they want.

Can your learning team provide these three data points to your business stakeholders? What design skills does your team need to master to do so? If you want to learn how to create measurable learning, then contact us here at eParamus. We’d love to help.

Please follow eParamus on LinkedIn and feel free to connect with me, Laura Paramoure, PhD to discuss the learning challenges you face.

Photo copyright: pressmaster / 123RF Stock Photo

