The Erosion of Professional Competency

To compound the issue, many people entered the learning profession from other disciplines. Subject matter experts (SMEs)—people proficient in a particular content area—were asked to move into the training department. Because they knew a topic well, the expectation was that they could teach others what they needed to know. When SMEs take on the role of trainer without an education degree or experience in instructional design, they naturally assume content knowledge is the most important skill. They see no need to understand adult learning principles, learning styles, or instructional methods, since none of those were requirements for the job.

The combination of measuring success by opinion and training roles that required no expertise in learning design or facilitation eroded the competency of learning professionals. The learning professional's role was no longer associated with the skill of creating effective learning programs; it required only the ability to convey information. Because actual behavior change was never measured, a learning program's value was dictated by participants' opinions of the event.

Learning management systems (LMS) further eroded the profession. Using an LMS to manage training shifts focus away from competency and toward learning activities. Learning departments would report the number of courses produced or the number of course attendees. Learning professionals were judged by how many employees they served, and budgets were determined by the volume of activity rather than its quality.
An Uneven Playing Field

Name any other business department judged on activity rather than results. Could you imagine a sales department justifying its budget by the number of sales calls made or leads generated? If those activities don't lead to closed sales, they don't matter. What if the accounting department tried to justify its existence by the number of spreadsheets it created? Would it assert its value to the organization based on whether others felt it made good spreadsheets? No. The business holds accountants accountable for the quality of their data and the accuracy of their spreadsheets.

When learning departments measure their success by the number of activities and the happiness of survey respondents, they do themselves a disservice. It suggests the learning profession cannot meet the same standards other business units hold themselves to. The learning team remains unable to verify its results, cannot demonstrate the effectiveness of its methods, and fails to prove its value to the organization. When we measure ourselves by asking others whether we are doing our job well (through surveys), we reveal our inability to measure ourselves. When we cannot show the effectiveness of our methods (learning program design and facilitation) or quantify our results (behavior change, performance improvement), we erode our standing in the organization. If we cannot show stakeholders how effective our methods are and what improvements the learning department made to the business, we lose our seat at the table. We become merely a tool for others to use—or to discard if they see no use for it.
Choosing the Right Measurement Method

We must measure learning, but we must use the right method for doing so. We need to learn from the damage done by relying on surveys and an LMS to measure our value. The method we choose has real implications:
- Use methods that create transparency and quantify the effectiveness of the learning practice
- Align measurement to expected behavior outcomes
- Align learning design to expected outcomes, so you can then diagnose and repair programs when things do not go as planned
- Choose methods that empirically measure whether a behavior change improves performance
- Distinguish in your measurement between things the learning department can control and things it cannot