Virtually There Session Recap
“Did the training work?”
If your stakeholders asked you this question, would you have an answer supported by proof, or would you feel anxious about demonstrating the value of your work?
We know that data can clarify the value our learning events provide to participants and their impact on the business's bottom line. But the process of collecting, analyzing, and using data can feel involved, complex, and sometimes like more trouble than it's worth.
Analytics expert Scott Weersing to the rescue! During his recent Virtually There session, Scott shared the basics of learning analytics and how to use them to show your work makes an impact. (Watch the full replay here.)
Helpfully, Scott provided a succinct working definition of learning analytics and evaluation:
“The practice to prove and improve the impact of learning programs on business outcomes.”
This definition demystifies the overarching mission of training and the reason evaluation matters so much. According to Scott, “We’re trying to prove the learning works and improve it the next time for ongoing success.”
Common forms of evaluation include surveys and tests. But unlike the tests we took in school, evaluation for learning professionals is about aligning learning to larger business objectives. Most importantly, evaluation does not stop at collecting data. It involves interpreting the data we collect and using it to make choices about future iterations of our programs that enhance learner success.
Knowing what evaluation means in a modern learning context offers a strong starting point for incorporating it into your L&D approach. But stakeholders reasonably want to know why this next step is worth taking, especially if the current way of doing things seems to work just fine.
When making the business case to move beyond training implementation alone, consider Scott’s comprehensive list of reasons to evaluate:
- “Demonstrate the value of programs to stakeholders
- Deliver on what is promised and needed (creating accountability in learning)
- Align solutions to company goals
- Quickly see the impact of different approaches (making it agile)
- Focus on results rather than activity
- Increase data-driven approaches
- Prioritize and align resources
- Win awards
- Drive employee engagement”
Remember: not all of these reasons will resonate with your decision makers. Build your argument around the main motivations and priorities for your organization for a greater chance of buy-in.
For many of us, the theory behind evaluation and learning analytics makes sense. We do not doubt that measurement matters. But the how is where the stress sets in. With so many tools, technologies, and processes available, picking a direction turns from task into project.
Contrary to popular belief, though, evaluation does not require excessive complexity. In fact, Scott pointed out that the process can be quite simple:
- Start with alignment. “Align with stakeholders and determine what success is and what they want from the learning program.”
- Determine goals. “This step helps you move from focusing on alignment and working with stakeholders to better understand what they expect will happen at the end of the program.”
- Determine metrics. “Do your stakeholders want improved sales, reduced attrition rates, increased engagement scores? Your metrics help you understand what information to collect.”
- Select data collection methods and tools. “Explore tools available at your organization, and consider what checklists, tests, etc. you may have to create if you’re measuring a new type of skill.”
- Collect data. “Add the collection methods to your training programs so you can gather learner performance information based on the defined goals!”
- Analyze data. “Using the data you collected, and the goals you set at the beginning of the process, look at what the information is telling you. Does anything surprise you?”
- Share results and make changes. “Share the outcomes with stakeholders, compile the analysis into a narrative, define where the program will go moving forward.”
Using the steps Scott covered in his Virtually There session, Learning Analytics: How to Show Your Work is Making an Impact, you can begin evaluating your programs and improving them in the future.