Building Bridges

Get to the Head of the Class in Writing Learning Assessments

A college midterm for Introduction to Psychology. The Certified Public Accountant (CPA) exam. The MCAS (the Massachusetts Comprehensive Assessment System – if you’re a parent of a school-aged child, you know it!).

What do these three items have in common? Yes, we commonly refer to them as “tests” or “exams,” but as the MCAS acronym suggests, they are assessments, designed to determine how well learners have mastered given material. Whether the learner is a grade-schooler or an aspiring accountant, assessment questions must connect to the learning objectives and reflect an appropriate level of difficulty.

With these tenets in mind, I co-led a training session on Writing Learning Assessments for 40 training professionals at a large pharmaceutical client site back in September. I was excited to deliver the information to this experienced group, several of whom joined the session virtually.

We built the class around having the learners define best practices. First and foremost, it is crucial to tie assessment questions directly to the learning objectives, so that the test measures how well employees meet those goals. Learners master material better when they apply their knowledge in a realistic scenario rather than merely parrot back what they have learned. I also advised the instructional designers to have someone “test their test” to evaluate the assessment’s timing, difficulty level, clarity of questions, and more, and to consider the test-taking audience; sometimes a question may not apply to all learners.

Predictable answers can doom an assessment. In my experience, the correct answer is often the most detailed choice, the most obvious one, or the infamous “all of the above.” Test writers should review their assessments for these giveaways. A high score needs to reflect how well the learner has mastered the material, not how well he or she has gamed the assessment.

Some of the training professionals I met that day knew these tips. However, I shared one best practice that threw many of them a curve: During which stage of ADDIE (Analyze / Design / Develop / Implement / Evaluate) should an instructional designer begin creating test questions? The answer is below.** Hint: instructional design professionals need to start thinking about assessment questions early, since they drive course content and allow ample time to review and test the questions.

This group of training professionals also needed to pay special attention to the difficulty level of questions, since they work in the pharmaceutical industry. We learned that the FDA includes training in its audits of life sciences companies and is increasingly assessing the difficulty level of test questions to ensure that employees truly master the material. At the same time, test writers must write questions that learners can comprehend, so the process is a balancing act.

I led several activities to solidify the instructional designers’ knowledge. After reading sample questions, teams rewrote them for clarity and effectiveness. They also wrote original questions based on a case study provided by the client. Finally, we asked the test writers to draft an action plan they could implement back on the job.

What did I observe from leading this workshop? When I read many of the groups’ test questions, I was surprised by how many asked learners to repeat facts rather than apply knowledge. This tendency reminded me of learning math in school: while it is important to know basic math facts, you truly master a concept when you solve a word problem. Math facts test a learner’s memory more than his or her true ability, and for memory assistance at work, of course, there are job aids. The most competent workers are those who can navigate a real-life challenge quickly and easily, demonstrating their proficiency. Walking out of the workshop, the trainers understood this better than they had coming in.


** Answer: “Design.” However, if you answered “Analyze,” you are off to a good start; just exercise caution in writing questions until you have an approved set of learning objectives. The majority of the test writers in the workshop selected “Develop.”