Over the past several years, I have noticed an increase in roles responsible for evaluating and analyzing learning programs. This change reflects how much more learning professionals focus on data now than ever before. L&D job titles like Learning Analytics Director, Learning Measurement and Evaluation Analyst, and Learning Impact Director have become commonplace.
I attribute the rise of these roles to companies recognizing a business need to quantify the value of learning programs. That is logical: U.S. spending on learning in the past year, including payroll and spending on external products and services, was $87.6 billion!
Unfortunately, as businesses look for answers, learning professionals, especially people in these specific roles, struggle to show learning impact. They struggle because they have limited information on learning outcomes. When you are responsible for showing learning results, the lack of relevant data to analyze true learning impact leads to a lot of frustration.
When Learning Professionals Focus on Data, Where Do They Look?
In my work, I often meet people in learning measurement roles. When I ask them what types of information they use to show learning impact, I usually get the same answer. They tell me their current learning data includes:
- Survey data (how students felt about the program)
- LMS data on the number of courses and participants
- Changes to metrics in the organization
When these learning professionals focus on data derived from these sources, they look for clues to show impact. They crunch survey data (or hire statisticians to do it) and then attempt to interpret it to tell a learning story. They try to make the case that if people liked the courses, employees took the courses, and the business metrics moved in the right direction, learning was successful.
GIGO: What Happens When Learning Professionals Focus on Data—But the Wrong Kind
Can you see what is wrong with this picture? Leaders ask for information on learning results and instead are given information on learning activity. From opinion and activity data, we report on presumed relationships between organizational success and learning.
Perhaps you recall the phrase coined during the earliest days of computing: garbage in, garbage out (often abbreviated as GIGO). In essence, it means that starting with bad or irrelevant data leads to bad or irrelevant conclusions. This aptly describes what’s happening here. We collect the wrong information and then manipulate it to attempt to answer the questions we are asked.
If learning professionals want to participate in business strategy discussions and in conversations where business decisions are made, we need data that informs how learning shapes business outcomes. We need to understand which of our methods enable employees to learn and how those methods lead to business results.
What Data Do We Really Need?
As businesses focus more on data, learning professionals need better information on the outcomes of our practice. We need information on the specific capabilities employees gained from learning programs. When it comes to improving performance, intelligence on employee outcomes (enhanced knowledge, critical thinking skills, and behaviors) is the only learning data that matters. If we identify and track these outcomes, we can determine the effectiveness of our learning methods and internal decisions and improve our practice. Only when we understand how learning produces impact can we advise others on the best learning strategy to achieve business goals.
When we report to stakeholders we need information that shows:
- Learning’s alignment with organizational needs: Are we working on the right things?
- Program performance outcomes: Are we doing our job well? Do our instructional methods and design lead to learning or improved employee capability?
- Learning application: Do business units support learning? Can we verify on-the-job application of learned behavior?
- Learning’s influence on metrics: Do targeted metrics change due to learning?
Getting a Seat at the Table
Activity is not evidence of impact, and opinions are not facts. So why do we waste time putting lipstick on a pig? Why use data that so clearly misses the mark? Seriously, folks: if we want to be viewed as leaders, we must do better than that. If we want a seat at the table, then we must bring more to the table.
We must understand which learning methods work, and which don’t. We must be confident enough to create transparency in our work, establish internal quality controls, and make improvements based on what we learn. When we understand our means to improve employee capability, we can direct change and show ways we can contribute to organizational success.
Are you tasked with measuring the impact of learning? Are you relying on the wrong types of data (survey responses, LMS activity reports) to describe that impact? If you need a better understanding of how to create measurable learning that shows impact, please contact us here at eParamus. Our mission is to teach L&D professionals how to correctly design programs that create measurable learning.
Please follow eParamus on LinkedIn and feel free to connect with me, Laura Paramoure, PhD to discuss your specific learning challenges.
Photo copyright: goodluz / 123RF Stock Photo