Can any ol’ content be taught using any ol’ technology?
This seems to be the perception as the use of various learning technologies becomes commonplace in training departments.
But, is it really true? For example:
- Do you believe that a two-day project management program can be delivered as a four-hour virtual training class?
- Can a four-hour sales training session that is usually held face-to-face be delivered in a dozen 20-minute self-paced e-learning chunks?
There are several fundamental problems with this perception:
- We are looking at entire programs (i.e. project management or sales training) and attempting to force the entire program into one delivery modality, like a virtual classroom or e-learning.
- There’s an implied assumption that all delivery modalities treat all types of content in the same way.
So what’s the solution?
From an instructional design perspective we should be looking to develop much more of a blended program than trying to fit all content into a convenient delivery modality. At its essence, blended learning is not only about matching content to the most appropriate delivery medium, but doing it at the learning objective level. Instead of making a design decision to teach project management via WebEx Training Center, we break project management into its component learning objectives and match each learning objective to the best technology available.
The best approach I have found to accomplish this is to use a new take on Bloom’s Taxonomy. Originally developed in the 1950s, Bloom’s Taxonomy was intended to categorize types of learning objectives and define levels of mastery in a classroom.
Using Bloom’s Taxonomy, depending on the desired outcome, you would categorize your learning objectives into one of the six levels of learning and then use activities that correspond to those levels in order to achieve the desired level of mastery. For example, at the original knowledge level of learning, a student demonstrates recall by performing activities such as creating a list or naming defining features.
This really works.
In fact, even those of us who have become instructional designers without prior training in the field can create very effective programs by using this simple, yet powerful, framework.
As content delivery moved out of the traditional classroom and into more collaborative learning technologies, Bloom’s Taxonomy was badly in need of some reconstructive surgery. In 2009, Andrew Churches repackaged the taxonomy to take advantage of tools ("edtech" or "educational technology") that can help us master different levels of learning in ways that were not previously possible.
What follows is a high-level summary of each of the six levels of learning contained in Churches’ digital taxonomy. The level definitions are drawn from http://edorigami.wikispaces.com/Bloom's+Digital+Taxonomy, and I encourage you to explore the free resources and job aids there to help take your virtual learning to the next level.
Remembering
Retrieving, recalling or recognizing knowledge from memory. Remembering is when memory is used to produce definitions, facts or lists, or to recite or retrieve material.
Remembering is the level of learning where we become familiar enough with concepts that we can recognize when they are being used in another context. When we deal with the Remembering domain, we find ourselves using the tools available for self-directed learning. Web technologies like Google can help us to define terms. We can create an Articulate Storyline module that helps us to list important steps in a sequence. We can use books, PDF documents, and other web tools to read and then recall key concepts. Generally, we don’t need to collaborate with other people to remember concepts. Since Remembering doesn’t require collaboration, and testing to ensure Remembering has taken place can occur in a self-paced format, learning objectives that use keywords like Recognize, List, Identify, Define, and Locate can be delivered in a self-paced format.
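This matching of objective verbs to levels, and levels to delivery modalities, can be sketched as a simple lookup. This is purely an illustration of the approach the article describes; the verb lists and modality labels below are my own assumptions, not part of the taxonomy itself.

```python
# Illustrative sketch: match a learning objective's leading verb to a
# Bloom level, then to a suggested delivery modality. The verb sets and
# modality suggestions are assumptions for demonstration purposes only.

LEVEL_VERBS = {
    "Remembering": {"recognize", "list", "identify", "define", "locate"},
    "Understanding": {"explain", "summarize", "classify", "compare"},
    "Applying": {"execute", "implement", "demonstrate", "use"},
}

LEVEL_MODALITY = {
    "Remembering": "self-paced e-learning module",
    "Understanding": "self-paced module plus discussion board",
    "Applying": "live virtual classroom with collaborative activities",
}

def suggest_modality(objective: str) -> str:
    """Match the objective's first verb to a level, then to a modality."""
    verb = objective.split()[0].lower()
    for level, verbs in LEVEL_VERBS.items():
        if verb in verbs:
            return f"{level}: {LEVEL_MODALITY[level]}"
    return "Unmatched: review the objective's verb against the taxonomy"

print(suggest_modality("Identify common slip-and-fall areas on campus"))
```

The point is not the code itself but the design discipline it represents: every objective gets matched individually, rather than forcing an entire program into one modality.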
A very conventional and inexpensive way to deliver knowledge-based content is via a virtual classroom webinar. We’ve all attended them (or pretended to attend them). A hundred people or more log on to the same virtual session at the same time, listen to what experts have to say, and, if permitted, ask questions where appropriate to verify understanding. The Remembering level of learning seemingly aligns itself well with these large-scale webinars. (I say ‘seemingly’ because we are not conducting any sort of assessment to ascertain if content is actually being retained. With most webinars, we ‘hope’ that people will log in, pay attention, and remember what is said. “Hope” is probably not an effective measurement technique.)
But we need to remember that this type of online session is not what we would traditionally define as “training.” This is simply information dissemination. It’s important. It’s useful. But it rarely gets us beyond knowledge, or perhaps bridging into the next level of learning.
Testing at this level of learning would most likely be very objective. Assessment answers can be easily identified as correct or incorrect. Feedback on test responses is directive and instructional in nature.
a) Learning objective: Identify common “slip and fall” areas on a college campus.
b) Assessment activity: Provide a campus map and have learners follow the map to the top five “slip and fall” areas.
c) Potential delivery technologies: A self-paced e-learning module which allows learners to interact with a campus map.
Understanding
Constructing meaning from different types of functions, be they written or graphic.
The Understanding level of learning occurs when the learner can not only recall knowledge, but can explain it in context to someone else.
Paradoxically, the word “understanding” is one we typically try to avoid using when talking about learning objectives. The argument is that it is difficult to test for Understanding. How do we know what Understanding looks like? While we should not use the word as one of our learning objectives, we can use it to define this level of learning. The Understanding level of learning is taking what we recall and making that data meaningful. For instance, we take the definition of project management, and apply it to a project manager job description.
When we have short, stand-alone e-learning modules that can be taken on demand, we may be in the realm of fostering Understanding. We are moving beyond mere recall, and into connecting pieces of new knowledge together. Once again, self-paced formats are often more appropriate than live delivery mediums.
There is probably a higher level of discussion or structured thought in achieving Understanding than was present in getting to Remembering. It’s not just a lot of data; the data suddenly becomes useful.
Understanding is not characterized by practicing a new skill or attempting to change behavior. It is a foundational understanding of key concepts that can be not only called upon but actually used later.
a) Learning objective: Based on seasonal weather conditions, anticipate specific “slip and fall” hazards that are unique to a campus.
b) Assessment activity: Learners will photograph five “slip and fall” hazards on their campus, and create a short presentation with the intent of informing the safety committee of these hazards.
c) Potential delivery technologies: A discussion board where learners can get more information about the topic, post their individual presentations, and review the presentations posted by others.
What comes first?
There is some debate as to whether Bloom’s Taxonomy of Learning needs to occur in a linear fashion: for example, whether mastering the Applying level must occur before you can master the Evaluating level.
I consider Remembering and Understanding as foundational levels of learning and believe that they do need to come first. However, the next four domains (Applying, Analyzing, Evaluating, and Creating) may occur in any order, and we may not require an individual level of learning for a particular curriculum.
Applying
Carrying out or using a procedure through executing or implementing. Applying refers to situations where learned material is used through products like models, presentations, interviews and simulations.
The Applying level of learning takes us beyond foundational information and into the realm of training. Learners are starting to practice tasks, apply new skills, and correct mistakes. They can execute a checklist, create a table in Microsoft Word, enter data into a claims management system, or collaborate on a file in SharePoint.
Note that the verbs we are using are very action oriented, and can only be tested by the learner actually doing something. Generally, as we move into the Applying level of learning, we are starting to consider adding more collaborative activities into our learning plans. Activities such as discussions about how to apply key concepts, getting feedback on a presentation created, or working in breakout rooms to prioritize budget items support Applying.
We start to see the social aspects of learning incorporated as we move out of Understanding and into Applying. While learning objectives in the Remembering and Understanding domains may have been delivered in a self-paced format or in a webinar format where interaction with others was limited and learners were expected to assimilate knowledge on their own, moving into Applying often requires live interaction.
a) Learning objective: After identifying “slip and fall” hazards on your college campus, propose preemptive safety fixes to minimize the risk to students.
b) Assessment activity: Learners will create a proposal that identifies the hazards, lists the cause of each hazard, and provides suggestions on how to mitigate the hazard risk.
c) Potential delivery technologies: Learners will participate in a virtual classroom discussion to learn how to identify and mitigate potential hazards and how to create effective arguments that support that mitigation.
Analyzing
Breaking material or concepts into parts and determining how the parts relate to one another or to an overall structure or purpose. Mental actions include differentiating, organizing and attributing, as well as being able to distinguish between components.
If the Applying level allows us to take new concepts and use them in a collaborative format, the Analyzing level starts to help us make cognitive decisions. Instead of just creating a budget and prioritizing budget items, Analyzing allows us to make decisions based on the data contained in that budget. We aren’t just prioritizing items; we conduct and provide the analysis behind the decision-making.
I’m sure you can see how this requires an activity that is more facilitated than, for example, those used for Understanding objectives. Discussion boards, virtual classrooms, and live classrooms are often used for analysis.
Other learning technologies that can support analysis include simulations. For example, the military uses aircraft simulators to help pilots learn to make reliable decisions during combat situations. Similarly, we can use simulations to analyze data to determine whether or not to bring a drug to market. While simulations are not necessarily facilitated, the impact of not “passing the test” is significant. Learners can fail. The pilot can crash a plane. The pharmaceutical marketing trainee may bring a drug to market before it is ready. Or the financial trainee may create a budget that doesn’t meet the needs of the department.
Creating and delivering training and assessments that meet the Analyzing level of learning take more time, more resources and more quality control. If your objective is, “Navigate a jet fighter in combat situations,” then learners will need to practice those skills to successfully master that objective. Without the practice component, they will remain in the Remembering and Understanding levels at best.
a) Learning objective: Decide which hazard mitigation is most appropriate for a particular situation.
b) Assessment activity: Learners will compare three “slip and fall” mitigation solutions and conduct a cost-benefit analysis in order to determine the best solution.
c) Potential delivery technologies: The virtual classroom combined with videos and job aids provides background information on the various mitigation solutions.
Do we blame technology or design?
This is where virtual program designs start to fall apart.
We desire the outcome of the program to be at a high level. For example, we want learners to not only create a budget; we want them to be able to analyze the impact of that budget on the department. However, another common requirement is that the program be short in duration. These two outcomes are often mutually exclusive.
The end result is that the same content that took a full day in a face-to-face classroom is now delivered in less than half that time in the virtual classroom. How? Well, the practice opportunities, the collaboration, and the assessments were all removed. We meet the content requirements by filling the slides with words. However, by removing all of those pieces we are staying in the Remembering and Understanding levels of learning and therefore are not providing the structure necessary to facilitate the other levels of learning.
And then, when this doesn’t work, we look at it as a failure in technology as opposed to a failure in design and implementation.
Evaluating
Making judgments based on criteria and standards through checking and critiquing.
Evaluating is a means of making a decision. A decision can be made individually or collaboratively as part of a group. Yes, ultimately, the learner can be making these decisions individually, in which case coaching or social media tools like discussion boards can be used very effectively. If, however, the learner will eventually be collaborating on a decision as part of a group, this learning objective should be taught in a more collaborative format.
Evaluating is not about providing information to make a decision, but actually making that decision. The outcome is not the presentation of facts, but interpreting facts and applying them to make a judgment. (The process of a trial by jury is often used as an example of an evaluation activity.)
To get to the Evaluating level obviously we need understanding of basic concepts and we need to be able to review and analyze facts. But do we need to actually create the presentation to analyze the facts? Probably not.
The objectives in leadership curricula often fall into the Evaluating level of learning. Leaders need to provide feedback based on facts and on the impact of particular actions on the organization as a whole. Managers need to make recommendations regarding promotions, team leadership roles, and raises.
a) Learning objective: Determine the best vendor to mitigate identified “slip and fall” hazards.
b) Assessment activity: Learners will research vendors and then make a recommendation to the safety committee based on their research.
c) Potential delivery technologies: The virtual classroom combined with videos and job aids provides background information on the various vendors. Web searches, scavenger hunts, and referral checking will supplement the more structured training.
Creating
Putting the elements together to form a coherent or functional whole; reorganizing elements into a new pattern or structure through generating, planning or producing.
With all the tools that are readily available to learners today, the Creating level of learning can be a lot of fun. Learners can create videos, wikis, podcasts, and a variety of other ‘projects’ using low-cost technologies that organizations often already have available.
An argument made by many practitioners is that learning should start with Creating. For instance, allow people to create a presentation to explain their current understanding and their current point of view on a particular topic, and then use the other levels of learning to build on their ideas, identify misunderstandings, and really make sure the learning crystallizes.
The idea behind Creating is building something new, not regurgitating what was taught in class. This makes sense.
As we know, the classroom is a controlled environment. When learners go out into the real world and apply what they’ve learned, the situations aren’t going to be as clean as the simulations or case studies we provided to them in the classroom. Allowing them to create something new that applies to personal situations is an exciting capstone to any curriculum.
a) Learning objective: Design a hazard mitigation plan for the new student center on campus.
b) Assessment activity: Create a hazard mitigation plan that includes budget, design, vendor recommendations, and evaluation protocols.
c) Potential delivery technologies: The virtual classroom combined with videos and job aids provides background information. Web searches, scavenger hunts, and referral checking will supplement the more structured training.
Everything old is new again
When I first discovered Bloom’s Taxonomy (totally by accident, when trying to find a way to explain constructing learning objectives to a class participant), the simplicity of the model appealed to me. But I knew something was missing – the applicability to current learning technologies and industry trends.
The new Digital Taxonomy is not only transformational, in that it incorporates collaboration and new learning methods into its construction, but also accessible. It is relatively easy to construct examples based on a particular curriculum that illustrate the need for a blend of technologies instead of a ‘one size fits all’ scenario, and the taxonomy allows us to advocate for the best technological fit for our content.
It’s a foundational tool that should be in all of our toolboxes.
Learn more about creating the best blend with our Blended Learning Design Certificate course. Read about the course and how you can earn your Blended Learning Design Badge by clicking on the graphic below.