Would an experienced runner set off from a marathon starting line without warming up? No! A world-class athlete takes time to prepare for a race, often going through the paces of a regimen created by her running coach. She may run a short distance, then hold 30-second stretches. Her coach knows these energizing moves will get her muscles and mind prepared to run 26.2 miles.
Similarly, a trainer or facilitator (the “running coach”) starts a training session with an energizer or icebreaker, an opening activity designed to limber up the learners’ “mental muscles” and help them prepare to learn.
What are some effective ways to motivate learners? EnVision team members share their favorite icebreakers below.
For Ginny Maglio, who delivers leadership curricula to EnVision’s clients, it’s all about sparking learner interest. One of Maglio’s favorite icebreakers involves none other than LEGO® blocks. “Participants build something related to the course content,” Maglio says. “For example, in a program on how to build successful teams, participants are asked to build something that depicts their organization’s mission…. Short, simple, engaging activities can be used very successfully to initiate learner interest.” When learners feel energetic, they’re also engaged in the course.
Irene Stern Frielich, EnVision’s president, “breaks the ice” for learners in a different way. During learner introductions, she asks participants to share a fun thing they did over the weekend. This recollection, and the positive social vibes from sharing it, get learners in the frame of mind to think and process.
Scientific research supports this tactic. In the article “How Emotions Influence Learning and Memory Processes in the Brain,” Dr. Shlomo Wagner found that “different emotions cause the brain to work differently, in terms of cognitive processes such as learning and memory.” In his study, Dr. Wagner discovered that pleasant social encounters between rats caused the rodents to build memories, whereas negative encounters did not. It makes sense that when people, too, recall happy times, it puts them in a good mindset for learning.
Paula Spizziri, an instructional designer with EnVision, ensures she starts a module with an icebreaker that takes a page from the playbook of the running coach. Spizziri asks the learners to engage in a brief spurt of physical activity, since exercise stimulates the brain. Or, she may prompt the learners in an exercise in which they take turns completing the sentence “I’m late because…” with a movie plot. The other learners then try to guess each film, and both actions spark the learners’ thinking.
It’s clear that icebreakers targeting both learners’ emotions and their energy work. “Appropriately framing a lesson with an icebreaker activity is a useful technique in establishing context in which new learning will take place,” Michael Higley writes on elearningindustry.com. “The initial experiences students have with any course establish the tone for future tasks.” So, a course’s eventual success can be traced, in part, to an icebreaker’s efficacy — just like stretching sets the stage for a great race.
Bea Smart, an instructional designer, recently started working with Joe King, a compliance director, on potential training needs. She’s excited about the opportunity to partner with the Compliance Department and wants to impress her new internal client.
Bea starts her training needs analysis by completing several employee interviews, an activity never before conducted in the Compliance Department.
The interviews strongly suggest that changes other than training are needed to meet business goals, including clearly written standard operating procedures. However, Bea’s focus is instructional design, and she’s not sure Joe would agree with the recommendations that arise from the interviews, so she doesn’t share the interview data or her conclusions with him.
After agreeing on learning objectives and an elearning modality for a course Joe would like developed, Bea agrees to deliver the draft 1 storyboard to Joe in five days. Though she needs input from several stakeholders and believes a complete storyboard will probably take closer to ten days to create, she wants to get this project done ahead of schedule, establish clout with the Compliance Department, and hopefully get more work from Joe’s team later.
After two weeks go by, Bea has not received the feedback she needs to finish the draft 1 storyboard. Joe, under pressure from his manager, is ready to send out the PowerPoint slides he created for a recent presentation on the topic and use them as “training.” Bea asks Joe to hold off, immediately gets in touch with her colleagues, works a few late nights, and manages to finish the storyboard after three more days.
The following day, Bea meets with Joe to discuss his feedback on the storyboard. Joe has made many content changes, and Bea believes the course no longer addresses the approved learning objectives. His changes also undermine effective learning techniques, resulting in more of a “page turner” course – a reading exercise with no learner engagement. Bea remembers the friction over the missed storyboard deadline, though, and decides to keep quiet, figuring Joe knows best. And even if he doesn’t, she doesn’t want to create more friction.
The training module finally pilots, then launches. Three months later, analysis finds that 70% of learners don’t complete the course. In addition, there is no improvement in business performance and the same errors continue to be made. Where did Bea go wrong in working with her client, and what skills did she miss?
- Being Transparent. When Bea performed her needs assessment, she uncovered information via interviews and drew conclusions she didn’t share with Joe. She was concerned he’d disagree, potentially imperiling their relationship. But by sharing this information, and offering her expert judgment—which is, after all, likely what Joe expects—she could have helped Joe realize that the SOPs needed updating in order to support successful training outcomes.
- Setting Expectations. Bea wanted to impress Joe by offering a fast, even unrealistic, turnaround time. It would have been wiser, though, to quote the ten days a storyboard like this usually takes her—and then delight the client if it is completed sooner. In addition, Bea was expecting information from the stakeholders, but had she (or Joe) communicated requirements to them? And if they had, and the stakeholders were late, she should have checked in with them and alerted Joe immediately. Joe could then decide on the best way to proceed.
- Communicating Fully and Clearly. Bea doesn’t push back or inquire about Joe’s significant changes. While he may have very good reasons for changing his mind (perhaps a recent internal audit turned up some significant new issues), making unstated assumptions is never a good option! Asking questions to understand Joe’s reasoning—and then offering her own professional judgment—would help clarify assumptions and result in an appropriate learning solution.
Partnering with clients—internal or external—can be a delicate balancing act. It is only natural to try to keep the client happy to further your professional relationship. Yet if you withhold information, set unrealistic expectations, or communicate incompletely, the relationship will founder. Be transparent, set expectations, and communicate fully and clearly, to help your client relationships flourish.
Looking for a fun night out? You could head off to your favorite neighborhood dive – or, er, brew pub – and grab a burger and a couple of beers. You’d see the same regulars and the familiar, weathered interior of the pub. And you’d enjoy the comforting routine…though in the back of your mind, you may be a bit bored.
But if you want to remember the night? Try a pub crawl. You’d meet up with a group of friends. Together, you’d go from bar to bar – the Irish pub, the “townie” spot, the hipster place you’ve been hearing about – and spend a bit of time in each, trying different craft beers and meeting new “regulars.” You’d spend less time in each bar than you would have at your neighborhood haunt, but you’d have a full night. At the end, you’d likely reminisce about the experience and what you found unique about each pub.
People often choose a pub crawl for a special occasion – it’s more fun than camping out in one bar. Why?
You get a new bar visit – or “snippet” experience – each time you change venues. At one bar, you’ll kick back to some acoustic live music; at another, sample a small batch house brew; and at a third, stand crammed shoulder-to-shoulder with your fellow bar patrons, making sure your feet don’t stick to the beer-covered floor.
Each of these pub stops offers something different that will stand out in your memory, much more than two hours at your favorite dive, as comforting as that may be. By choosing and changing up your environment, each snippet experience becomes more special — and memorable.
How does this relate to learning? If you follow the formula of lesson, quiz, repeat until you’re done, your learners’ eyes will start to glaze over – and they won’t even be tipsy. Create a well-planned blend of short, tightly focused elearning lessons, videos, games, and quizzes, supported by in-person classroom modules and targeted action-learning assignments to practice skills on the job? That’s a “pub crawl” your learners can get on board with.
A learning experience composed of short snippets is also known as microlearning. Microlearning succeeds, in part, because learners are able to seek out specific, focused content when they need it. “Microlearning…enables learners to access information in a variety of formats that fits their learning style. When employees seek out content, it helps them better retain what they have learned and stokes their appetite to learn more.” (Amy Fox, “Microlearning for Effective Performance,” TD Magazine, 2016)
Whether you’re creating learning solutions or pub crawling, don’t schedule too many snippets or “stops” – it could prove tricky for the participants to remember the experience. Choose a pub that offers brews your friends like — maybe a wide variety on tap, for example (or a game for a learning activity). And you may want to plan out which local watering holes to avoid – maybe your best friend doesn’t get along with the regulars, or your learners don’t respond well to games. Plan thoughtfully!
So, the lesson is, whether pub crawling or creating a learning solution, mix it up with appropriate, small bursts of new material and offer choices to the learners. You’ll make effective use of their time — and create a memorable experience.
Are you – or is someone you work closely with – a SME (subject matter expert)? Instructional designers often require a SME’s knowledge and input to develop training. And it can be a challenge to obtain the key content (and only the key content).
First, let’s introduce a typical SME, Simone. Simone is the director of compliance for the organization and comes to this role with 13 years’ experience. She is well-regarded in the organization and is often the go-to person when difficult compliance questions arise. Over the years, the compliance team has put into place a series of standard operating procedures, tools, and best practices that help the organization stay in compliance so they can continue selling their products. Simone was instrumental in creating these documents and approaches, and really loves her work. She loves it so much she tries to share her knowledge with any willing (or not so willing!) listener.
Are you working with a SME, and feeling the pain, wondering how such a fabulous compliance director can also be such a challenge to work with? Let’s explore the symptoms you might experience, and provide some prevention strategies.
Bleeding content: Too much material
Simone’s slides – for management presentations and for training – are filled with text and there are few images or diagrams. Her slides include a lot of details that are important to Simone. Much of this content, however, isn’t applicable to the target audience. When she meets with the instructional designer or presents to a group, she reads off the slides and adds even more details – whether or not they apply. After all, she loves her content! Everyone should want to know all she knows.
Unclear diagnosis: Missing content and next steps
Often there are disconnects. It’s not clear how Simone got from point A to point B. And sometimes it’s not clear to the listeners what they need to be able to do with the information. Are they responsible for completing the tasks? For ensuring the tasks have been completed? In what situations? How often? What is the impact of not doing it or not doing it correctly? Simone means very well, but she takes the content and connections for granted. She already knows how it all ties together and she’s been working with it for years so she assumes everyone else understands the connections, too.
Constant emergencies and “setbacks”: Lacks time to support the instructional designer
Simone is, of course, a very busy person. She didn’t plan to spend hours answering questions the instructional designer is posing to her. She doesn’t have time to review a design document, a storyboard, and an elearning course or instructor notes, slides, and handouts for a classroom course. Sometimes she misses meetings, comes poorly prepared, or even misses deadlines you agreed upon.
There are many prevention options, and a combination of them is usually most effective. Here are three of my favorites.
Control the content: Stay focused on the goal and how the learner needs to perform
Typically, I start with a needs analysis looking at the business goals and performance required to achieve those goals. The learning objectives, then, are focused on supporting the required performance and I obtain stakeholder agreement on these objectives.
Having laid the groundwork, I can keep these goals and objectives front and center. When Simone asks me to add more content or explains content she already provided me, I’ll ask questions like:
- Is this critical information needed for the learner to be able to [insert learning objective here]?
- Or is the information “nice to have”? If so, I suggest we eliminate it so the learner isn’t overloaded with non-critical information.
- Or is the information really helpful but not critical? Then, I might suggest we include it as an optional “tip” or in a resource list.
Another set of follow-up questions I like is: If you were the target learner, what would you need to be able to do with this information? Why is that so important? What if I don’t know/do that?
I might end up sounding like a broken record, but I’ll keep asking these same questions. Eventually Simone will catch on and start self-editing.
Create a logical, connected sequence: Simplify content and make it accessible
When I am handed an existing slideshow the SME would like to use, and I see it is filled with lots of words, one approach I use is to diagram what the words mean. For example, if I am developing training about a process, I show it in a simplified flowchart and break it down (using more flowcharting) throughout the course. This both helps anchor learners in where they are in the process and helps them see interrelationships.
I also like to use scenarios, with visual representations of the characters and what they are doing. Telling a story via the scenario helps learners feel immersed in the learning. You can do this as an alternative to blocks of classroom lecture or pages of text in an elearning course.
I always check in with Simone to confirm the diagrams or scenarios convey the key points accurately. After doing this for a while, Simone will hopefully come to appreciate how I’m bringing her content to life and the value I provide as an instructional designer.
Set expectations early and often
Have you ever brought your car in for service and been told it will take three hours and cost no more than $300? Then, six hours later – still no car, and now the bill will be $1,500? A good repair shop, of course, sets expectations with you up front and updates you so there are no last-minute surprises.
Imagine Simone’s situation. She needs to know how much of her time will be needed and when so she can plan appropriately. If you underestimate or don’t offer any indication, she can’t easily ensure she’ll have the time needed. So, I like to let the SME know early on how much time is needed and when the “heavy” weeks will be. Then, as the project morphs and the timelines change, I keep her updated and check in on her ability to meet the new deadlines before setting them in stone.
Another helpful tip is to plan meetings in advance. Having a weekly or biweekly meeting on the calendar for check-ins and working time is a way to guarantee a minimum of time together. Simone will appreciate having an agenda in advance and an action item list after the meeting. If you don’t need a meeting, you can always cancel. I never hear complaints about a canceled meeting. And remain flexible so the SME stays engaged.
Finally, acknowledge your SME is very busy and may have true emergencies to tend to. When I know that might be an ongoing concern, I ask Simone to provide an alternate SME and ensure that Simone will have opportunities to review materials. This can protect her time and help keep the project moving along.
So, what’s the prognosis for SME syndrome? If you implement your prevention plan from the start – staying organized and communicating openly with the SME – you greatly increase your ability to develop a training program that meets the needs of the learner and the organization.
In our last blog post we saw how immersive, sensory events can help us remember experiences, using the example of my trip to the Blue Lagoon spa. We don’t need to visit a spa to help learners remember, though…we can create immersive experiences to enhance our instructional design. Here are two examples of physically immersive activities from our library.
The Human Flowchart
Training people in a specific process often involves use of a flowchart, but review of this tool in a classroom can be a dull exercise. We can bring the training to life—and make it more effective—with a “human flowchart.”
We used immersion in this way when we designed a class for a medical device company on FDA complaint handling regulations. The client developed a process to meet the FDA requirements and created a flowchart with a series of 20 steps to be executed by seven distinct roles.
EnVision immersed the learners in the training using movement. We broke up the learners into seven groups, each representing a different role and gave each group signs, listing their part in the complaint handling process.
Getting up from their desks, the learners then arranged themselves into a living flowchart according to their part in the complaint process. Once everyone was in the right spot, the instructor started in the role of a customer calling in a complaint. Each group then shared what action they planned to take and what information would move on to the next group. The movement and interaction with others kept the learners involved in the entire process of complaint handling. Back at work, they should be more likely to recall the details of their role’s steps based on the physical locations of each learner.
For another client, this one a global pharma organization, we designed a course for new managers. One of the learning objectives was for learners “to be able to describe how managers’ actions impact the engagement of their team members.” To achieve this, we created an activity in which “managers” set and communicated performance objectives around building widgets, in this case paper origami cranes, with “employees” who were to make them. During different rounds, managers could give no feedback or some feedback and coaching. A debrief followed, focusing on the value of feedback and coaching in engaging employees.
Picture a classroom filled with learners eagerly (or not so eagerly, without coaching!) folding paper cranes to achieve their goals. The feel of the crinkled paper, the sound of their managers coaching them (or the dead air when they didn’t!)…these engaged the senses and helped learners become fully immersed in the activity. The learning became more than “knowledge”; it became an experience to remember and describe concretely.
Creating Your Own Activities to Engage the Senses
Some of the best learning activities allow learners to immerse themselves in the learning, much as tourists immerse themselves in the spa baths at the Blue Lagoon.
So, for your next course, try finding a way to immerse learners even more deeply in the learning by involving multiple senses. Creating activities like these can be a lot of fun, though it requires imagination, creativity, and plenty of planning. Want your own activity to engage the senses? Schedule a call for a free one-hour consultation on how to make that happen.
This summer, my husband and I traveled to Iceland. Among other wonders of nature, we visited the Blue Lagoon, a spa formed 40 years ago during the creation of a geothermal power plant. Today, people come from all over to bathe in the warm, soothing water.
There are a surprising number of things to do at the Blue Lagoon. You can laze underneath a man-made waterfall, order a drink from the bar, or even enjoy a massage in the water. We wandered across the bridges over the pool and explored the area. We even used the silica clay as a mud mask to soften our skin and further the sense of relaxation.
Even with all the amenities, I was spellbound first and foremost by the setting. The water, turned a beautiful blue-green by naturally occurring algae, had a temperature of 100 degrees Fahrenheit – about 45 degrees warmer than the air. This contrast between water and air made the warm pools feel even more comfortable and welcoming.
With nearly 20 hours of sunlight in July, it didn’t feel strange to be soaking outside at nearly any time of day. In the daylight, I was able to appreciate the setting clearly. Lava fields topped with moss surround the pool. When I closed my eyes, I heard the gentle cascade of the waterfall. With everything I felt, saw and heard, the Blue Lagoon was an immersion for the senses.
Research shows that these strong sensory impressions could help me remember my visit: a specific sight or sound can bring back a particular memory, and there’s a very good reason for it. “The same part of our brain that’s in charge of processing our senses is also responsible, at least in part, for storing emotional memories,” writes Rachael Rettner on livescience.com.
This storage receptacle is known as the sensory cortex. Benedetto Sacchetti, a researcher in Italy, conducted an experiment in which rats were trained to associate a specific sound with an electric shock. The rats would freeze upon hearing the sound, but after the scientists created lesions in the rats’ secondary auditory cortices, they froze less frequently. This suggested the rats had stored the sound–shock association in those cortices and could no longer retrieve it after the lesions (Sacco & Sacchetti (2010). Role of Secondary Sensory Cortices in Emotional Memory Storage. Science, 329(5992), pp. 649–656).
The practice of mindfulness can include focusing one’s attention on the senses, and there is evidentiary support that it, too, helps memory. In another experiment led by Michael Mrazek, a graduate student at UC Santa Barbara, undergraduates were directed to take a class in either mindfulness or nutrition for two weeks. “The students in the mindfulness group were taught how to pay attention to their sensory processes, like tasting food and breathing,” writes Alice G. Walton in an article describing the experiment.
The students’ recall was tested both by working memory tests and a verbal GRE (graduate admissions) exercise. After one week of instruction, the mindfulness group improved more than the nutrition group in both assessments.
The takeaway? The senses play a big role in memory retention. If possible, try to incorporate a strong sensory component into your instructional design. Oh, and the trip to the Blue Lagoon was amazing… just ask me in 10 years. I’ll remember.
For case studies see our post on Using Multiple Senses to Improve Learning.
We’re all enjoying the Rio Olympics right now, and witnessing incredible athletic feats. Some sports achievements are very easy to measure. Take track and field, for example. In a race, the runner who crosses the finish line first wins. In soccer or water polo, the team that scores the most points wins the game.
In other Olympic sports, however, the winner may not be nearly so clear-cut. In gymnastics, an athlete prepares a program to the best of her (or his) ability. The judges score her on many elements, and the highest score among the athletes determines the winner. The gymnast will not know immediately after her program whether or not she won, as her competitors follow her.
Similarly, instructional designers don’t face an immediate, clear-cut path to knowing if their training “stuck” — and if the learners are able to apply what they’ve learned to their jobs. However, with evaluations, instructional designers can learn a great deal about their training programs’ success.
The Kirkpatrick evaluation levels assess learner reaction (level 1), learning (level 2), application of skills to the job (level 3), and business results (level 4). Challenges may arise with executing each evaluation level. Let’s look at a few of the challenges in levels 1-3 and see how instructional designers might mitigate them.
Measuring learner reaction, level 1, can be done via a course evaluation form, a show of learners’ hands, verbal check-ins, or a “dashboard” that tracks the pace of the course and learner energy levels. The trainer notes learner reactions during the course and can use these to amend the course both “on the fly” and afterward. Clients may, however, read too much into level 1 results and want to change the learning solution without further assessment. It is important to explain to clients exactly what level 1 assessments measure — solely learner reaction — and to use other evaluation tools as well.
Level 2 evaluations, or measurements of learning, frequently take the form of final written tests or simulations. Writing assessment questions, however, can be a challenge in itself. If the questions are too difficult for learners to understand, this can undermine the validity of the test. Similarly, it can be hard to write plausible “distractor” answer options that are not obviously incorrect. After all, with weak distractors, some people can pass a learning assessment with flying colors — without ever taking the class or having done the job for which they are training.
The other issue for instructional designers is that there is not necessarily a direct correlation between level 2 “test” success and on-the-job achievement. Extensive knowledge may not lead to professional efficacy. To test for skills, ask assessment questions that explore the best way to execute certain job tasks or make on-the-job decisions, rather than questions seeking rote knowledge.
Level 3 evaluations measure application of skills to the job, often via learner or manager surveys, interviews with learners or managers, or on-the-job observation. An important consideration when measuring at level 3 is to ensure appropriate resources and tools — coaching, performance support, etc. — are in place to reinforce learning and support behavioral change.
The “success case method” created by Robert Brinkerhoff also may reveal successful on-the-job behavioral changes. With this method, stakeholders interview a sample of learners — typically high performers — to determine which factors have made them successful — and then corroborate this success with independent evidence. Because the strong learners share which influences contributed to their success, their interview responses can yield clues to help the less successful learners.
The success case method can be very effective, giving stakeholders qualitative feedback beyond the data, says Paula Spizziri, an EnVision consultant. “Sometimes you don’t know what’s behind the curtain…this gives you that,” she said. However, an organization using the success case method needs to allocate time and resources for the interviews and independent data collection.
While all evaluation methods present some challenges, many of these hurdles can be mitigated. And because learning success, like a gymnast’s score, may not be immediately apparent, a thoughtful evaluation process strengthens your training program.
I happen to be a big music lover, and enjoy both listening and performing as a flutist in the Sharon Concert Band. At last week’s band rehearsal our conductor, Steve, was doing his usual great job of keeping us on tempo, signaling when each section needs to come in, and gesturing to show we should play more piano (Steve often needs to remind us to play more softly). As I was playing, it occurred to me that a symphonic performance resembles the process of training.
In both, people assume distinct roles to accomplish the goal — whether it be performing the piece or executing the training program. And both require a specific process, anchored by practice, to reach those goals.
A symphony (or other musical piece) begins in the mind of a composer, who creates the music to be played. After learning the piece, the conductor rehearses the musicians in preparation for performing it.
An instructional designer, similar to a composer, authors the learning solution. A trainer (“conductor”) then takes over, using the leader’s guide (music “score”) and teaches the learners (“musicians”) the material they need to know. A trainer may use visual aids or workbooks to guide the learner, just like a conductor uses a baton to lead musicians and keep the beat.
A symphony may feature a few soloists throughout the piece. A soloist shines in an orchestral performance, much like an active participant stands out in the classroom. A section leader in each instrument group organizes the section, much like a facilitator may help “conduct” a breakout group in classroom instruction.
While musicians and learners “rehearse” all together, each “musician” must put in individual practice time to learn the material. During a learning session, the trainer may break the learners into groups, where each group can work on a directed activity and then share the results with the entire classroom. Likewise, a conductor rehearses one group of musicians at a time, such as the flutes, to work on a specific part. Afterward, the entire orchestra can play together.
A successful symphonic performance depends both on the musicians rehearsing together (as in classroom instruction) and each player practicing (or learning) on his or her own. Facilitation by an experienced conductor, or trainer, brings out the best in the musicians or learners. Each person has his or her “part” to play, and each role must be “performed” to the best of the performer’s abilities to achieve a winning final product.
Have you ever gone on a treasure hunt, or maybe seen one in the movies? The seeker searches for a treasure chest—usually contending with challenges along the way. Imagine large rolling rocks and a few poison darts, all difficult to control! Eventually the seeker locates the elusive treasure chest, but not all its contents are gleaming. There is some culling to do, and some polishing, before the seeker is ready to present the gems to the funders of the treasure hunt.
Back in the real world, learning & development professionals are given a mission to add value to the business, often through a request for “training.” On their own “treasure search,” L&D professionals contend with challenges along the way. Imagine the multiple competing priorities, regulatory controls, and insufficient resources that can impact successful learning.
Once the treasure is "found" (the learners have learned something!), what culling and polishing needs to be done so you can clearly see your treasure before you present your findings to the funders and other stakeholders in your organization? It can be challenging to find those precious gems and polish them until they glitter!
Earlier this month, I presented a workshop at the regional ATD conference. The topic, Polishing Your Gems, focused on creating an evaluation plan for a learning solution. Joined by client Matt Matosic from the Boston Public Health Commission (BPHC), we worked through a case study summarizing BPHC’s course on Hospital-Based Patient Decontamination and discussed options for evaluating its success. Participants then had an opportunity to begin developing their own evaluation plans, using an Evaluation and Measurement Planning Checklist we provided.
But where to start? Here are the core questions we suggested in our workshop to help folks determine which treasures to seek, as they begin their own evaluation planning.
Business issue questions
- Why is this training (or other intervention) being requested?
- What is driving the request or identification of this issue?
- How will solving this problem support the organization’s “business” goals?
- What do you need to be able to show as an outcome of your efforts?
- Who will see the results? (Consider: L&D management, business line management, senior management, learners themselves, instructional designer, instructor, funders, and other stakeholders)
- What will each stakeholder do with the information?
You might recognize that many of these questions get at Kirkpatrick’s level 4 (business impact).
Job performance questions
- What changes in job performance are needed to support meeting the business goals?
- What changes in job performance can reasonably be addressed via a learning solution?
- What workplace supports (such as managers, mentors, or peers; job aids or other performance support tools; periodic updates or web meetings) are realistically available to support the change in job performance?
These questions relate to Kirkpatrick’s level 3 (behavior change on the job).
Answers to these questions will help you define the learning objectives, set up workplace support systems to ensure learning continues and is reinforced on the job, and develop evaluation tools at all the Kirkpatrick levels.
For more information on our Evaluation and Measurement Planning Checklist, contact firstname.lastname@example.org.
Practicing a classroom course polishes and improves it. Typically, practice comes in the form of a pilot, during which instructional designers and trainers iron out classroom kinks.
A class entitled Operationalizing Emergency Plans: Incident Command in Action, on which EnVision consulted for a public agency, used two pilots: an abbreviated pre-pilot and a full pilot. This example shows how a pilot, when it works well, strengthens the design process.
This four-hour, immersive simulation was designed for nurses, administrators, and emergency medical technicians in healthcare environments such as long-term care facilities, hospitals, community health centers, and public health commissions. During the simulation, the learners make decisions to help their organization transition from day-to-day operations to a true-to-life emergency situation: a fictitious, extreme heat wave hits the area with no relief at night, producing health and medical complications for the learners to handle.
The learners are divided into teams of various medical and healthcare agencies seated at separate tables, with each learner playing a specific Incident Command System role. The scenario begins three days before the heat wave hits and progresses from the early stages to mid- and late-stages when serious repercussions occur, including power loss, a medical surge, and mass fatalities. The simulation ends with action reporting and improvement planning.
In the class, each team determines which actions to take. EnVision's client utilized facilitators from different community agencies to help shape discussion at each table. In addition, "injects" arrived sporadically at each table, carrying new information that could impact the team's decisions.
“During the class, the learners explore what it takes for organizations to be ‘operationally ready’ for emergencies, and to experience the Incident Command System in action,” explained Marilyn Kobus, an EnVision team member who worked on the project. Kobus supported the client subject matter expert/course designer-developer by offering design consultation, providing project management services, and contributing to the course instructor guide.
Following the pilot, the learners shared positive feedback. First and foremost, they felt more prepared for a large-scale emergency. The immersive nature of the class enabled the learners to envision what professionals in the different job roles might actually be thinking during an emergency. One learner called the class “a complete revelation.”
In addition, the learners liked the choice of a heat wave as a learning scenario versus a more typical New England event, like a blizzard. Their main challenge was playing “catch-up” following the fast-paced first module.
The instructional designers and trainers also learned, especially from the pre-pilot. Delivered to students in the Public Health Program at Tufts University School of Medicine, the pre-pilot uncovered necessary changes to the course timing, content, and organization of materials. The team also decided to add pre-work for the pilot session.
From the design team’s perspective, the pilot largely hit the bull’s eye. The team planned for minor changes for the actual course, such as having a participant assume the role of team scribe and modifying the pre-work.
Kobus was pleased with the pilot's success and believed the design team's focus on gathering input from community stakeholders to create the scenario played a big role in it. "I have not seen a pilot with so many moving parts run as smoothly as this one did. Despite the fact that there were multiple players, detailed scenarios for the simulation, and unique 'injects'…despite that complexity, I thought it was an outstanding pilot. Learners were fully engaged from the start with high energy, and said they gained great skills and information to take back to the job."