There are different points of view on how assessment in microlearning actually plays out, and several variables to consider when sorting out what learners actually gain from these nuggets of information.
The good news is that microlearning is assessable. In fact, there's even research to prove it.
Microlearning Is Working
Lenny DeFranco, in his article, Why Microlearning Drives Over 20% More Information Retention Than Long-Form Training, shares findings of a 2015 research study highlighting the link between microlearning (chunked in three different sizes) and corresponding assessment questions. The results: microlearning has macro impact on learning.
Reports DeFranco: "Smaller slices of content were better. Not just for helping the participants retain the information, either, but to do it more efficiently, as well." And, because questions came sooner (rather than at the end of longer learning segments), learners could answer them with greater accuracy, in part because they could also predict the type of question to be asked.
What was behind this success? Most notable is that learner interaction with shorter bits of content seemed to make information easier to decipher. Coupled with instant feedback, overall retention was heightened.
The Assessment/Microlearning Mix
Before doing anything on the assessment front, keep in mind that it should always be linked to desired outcomes.
As Mark Griffiths, client partner for Newleaf Training and Development, says, what to assess in microlearning, and how to assess it, depends on what you want people to know and/or do. You might test for knowledge through a quiz. For specific skills, you might ask for "proof" that they have been applied, e.g., a photograph of the learner assembling a part or a recording of the greeting a hotel employee uses with customers. These items can then be uploaded to an LRS or LMS for review by a line manager.
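Evidence like this typically reaches an LRS as an xAPI statement. Here is a minimal sketch of the actor/verb/object shape that the xAPI specification defines; the learner email, activity ID, and activity name are hypothetical, and a real integration would also authenticate and POST the statement to the LRS endpoint:

```python
import json

def build_statement(learner_email, verb_id, verb_name, activity_id, activity_name):
    """Build a minimal xAPI statement (actor/verb/object)."""
    return {
        "actor": {"objectType": "Agent", "mbox": f"mailto:{learner_email}"},
        "verb": {"id": verb_id, "display": {"en-US": verb_name}},
        "object": {
            "id": activity_id,
            "definition": {"name": {"en-US": activity_name}},
        },
    }

# Example: recording that a learner completed a micro-unit quiz.
# The verb ID is a standard ADL verb; the activity ID is made up.
stmt = build_statement(
    "tech@example.com",
    "http://adlnet.gov/expapi/verbs/completed",
    "completed",
    "https://example.com/activities/greeting-quiz",
    "Hotel greeting micro-quiz",
)
print(json.dumps(stmt, indent=2))
```

The same structure works for skill evidence: swap the verb and attach the photo or recording as an xAPI attachment for the line manager to review.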
Vince Flango, project manager/principal instructional designer of General Dynamics Information Technology, agrees, noting that measuring the single objective accompanying a micro unit (via a short test, for example) is the best way to measure proficiency. He adds that when a student successfully demonstrates knowledge or skill, credit is awarded. Just completing the course without this feedback or affirmation has limited learning value.
Microlearning can also be assessed directly in terms of ROI. Here's a possible scenario: a field technician uses an app to view a one-minute video about a machine part before touching it. The ROI question becomes: what percentage of first-time fixes succeeded when the tech used the learning component, versus when the tech did not?
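That comparison is just two rates and a difference. A minimal sketch, with entirely illustrative service-call records (the field name and the numbers are made up for the example):

```python
def first_time_fix_rate(jobs):
    """Fraction of service calls resolved on the first visit."""
    if not jobs:
        return 0.0
    return sum(1 for job in jobs if job["fixed_first_visit"]) / len(jobs)

# Hypothetical service-call records, split by whether the tech
# watched the one-minute video before starting the repair.
watched = [
    {"fixed_first_visit": True},
    {"fixed_first_visit": True},
    {"fixed_first_visit": True},
    {"fixed_first_visit": False},
]
did_not_watch = [
    {"fixed_first_visit": True},
    {"fixed_first_visit": False},
    {"fixed_first_visit": False},
    {"fixed_first_visit": False},
]

lift = first_time_fix_rate(watched) - first_time_fix_rate(did_not_watch)
print(f"With video: {first_time_fix_rate(watched):.0%}")          # 75%
print(f"Without video: {first_time_fix_rate(did_not_watch):.0%}")  # 25%
print(f"Lift: {lift:.0%}")                                         # 50%
```

The lift is the number to watch over time; a real analysis would use far more calls and control for job difficulty.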
The bite-sized nuggets can even be the assessments themselves, checking for understanding of the content while increasing learner engagement with it. These measures might be a quiz, a brief assignment, and/or games that invite the learner, in a fun way, to put skills or knowledge into play. Games of this type are ripe for scoring, which makes the playing a bit more competitive.
Nick Padley, director of training for CD2™ Learning, describes a virtual soccer game CD2 developed (yes, for a soccer organization) in which learners select the right approach for addressing a customer service issue. Choose the right method, goal scored. Choose the wrong one, the goalie blocks. Oh, did I mention that the learner plays against a virtual opponent? The learner plays until he or she makes all the right choices, and the game results can be scored!
Scores from games like these, along with other ratings, grades, etc., can be logged in an LMS. Metrics can measure learner usage rates and time spent on a task. Other standard forms of assessment, like post-training surveys and Q & A about what was learned and how it will be applied, are also valid measurement tools.
The Three Ms: Mobilizing Microlearning Measurement
I like this Three Ms concept: think about what you want learners to gain from your microlearning units; develop short, focused corresponding assessment tools; and then measure in the ways that matter to you.
Right now, the "field" of microlearning assessment is pretty open. There's lots of room for creativity and experimentation. What you design could contribute to an evidence-based repository of quality assessment tools.
- Three Considerations to Help Maximize Microlearning Initiatives. Along with strategies for creating and delivering microlearning, this piece provides insight on the value of designing new and authentic assessments of microlearning experiences.
- How Do Students Respond to Microlearning? This brief abstract of a research study demonstrates the impact of a university-based microlearning initiative that heightened student understanding of statistical concepts.