Using 'case-based' questions in assessment tasks

Dr Chris Deneen, Senior Lecturer in Higher Education Curriculum & Assessment at the Melbourne Centre for the Study of Higher Education (MCSHE), shares some insights into using case-based questions in teaching and assessment.

Introduction

As we explore new ways to engage with our students, conducting effective assessment remains a key priority. We must adopt and promote approaches to assessment that elicit novel, authentic and thoughtful responses. This article focuses on one approach: case-based assessment (CaBA). Note that the acronym CaBA is used rather than CBA to avoid confusing case-based assessment with curriculum-based assessment.

What is case-based assessment?

Case-based assessment (CaBA) uses a scenario (i.e. the case) that students familiarise themselves with and respond to. The assessed response can be almost anything, ranging from a set of multiple-choice questions (MCQs) to an extended written or oral response. CaBA aligns with a broad range of assessment intentions and formats. It can, for example, be used in an examination to organise a set of MCQ items. At the other end of the spectrum, students might engage with a complex, lengthy case as part of a semester-long, problem-based learning project, where the assessed response might involve developing and presenting a group-based, novel solution to that problem.

A common and essential element of CaBA is that the assessed responses are tethered to the case. Therefore, picking or developing a good case is essential to conducting effective CaBA. A few principles to start from:

The case should be:

  • Understandable to a student in that subject, at that level. The vocabulary, concepts, setting and other aspects of the case should not rely on knowledge or skills outside what you may reasonably expect of the students you are assessing.
  • Highly relevant to what you are assessing. Generally, this means there should be a strong alignment between the case and subject outcomes you are assessing. More specifically, the case should set up the student to be assessed on the specific knowledge and skills you are targeting.
  • Novel. The student should not have encountered the exact same case previously. Alternatively, a previously encountered case can be used, but in a way that requires the students to respond in a novel way. This also means you should not use cases that students could find through a Google search.
  • In ‘the Goldilocks zone.’ The case must contain enough information for a well-prepared student to successfully address the corresponding assessment, but not so much information that a less prepared student can guess their way to the answers. Remember, a fundamental purpose of assessment is to distinguish between those who know, and those who don’t.
  • As authentic as possible. CaBA presents us with an opportunity to introduce students to real situations they may encounter professionally and personally. By developing or picking cases with strong authenticity, we are helping our students make connections between what they learn in the classroom and how they will apply that in the rest of their lives.

Who uses case-based assessment?

CaBA can be used in any discipline, but it is strongly associated with disciplines that have a significant clinical and/or practicum component. CaBA is one way to introduce students to applying their subject and program knowledge in a workplace setting. Used this way, CaBA presents a low-risk approach to helping students make the jump from responding as a student to responding as a developing professional. It is therefore unsurprising that CaBA is a technique used frequently in the medical professions. CaBA is, however, a very flexible approach that can be adapted to any discipline and many subjects within those disciplines. In the next section, we’ll look at three discipline-crossing considerations in CaBA and the rationales behind each.

How can case-based approaches work for me and my students?

Open-book and/or online exams

Increasingly, people are using open-book exams. This is for a few reasons, including the constraints of teaching during COVID, increasing concerns over the efficacy of remote proctoring, and a desire to move away from assessments that require basic memorisation as a hurdle requirement for success.

At the same time, people are rightly concerned that open-book exams create the opportunity for students to simply ‘look up’ the answers. This seems especially true when open-book exams are handled remotely. In person, we have some control over the resources a student brings into an examination. When the student is taking an exam in their home, over the internet, it is difficult and often counterproductive to try to exercise this degree of control.

One of the principles for case development is novelty. That is, the scenario, or what you ask the students to do with that scenario, should not be something directly encountered in the subject or discoverable through an internet search. The novelty of CaBA allows the principle of ‘open book’ to operate as intended: students may use materials to assist in arriving at an original answer. That is a fundamentally different operation, involving deeper cognition, than what happens when students simply look up an answer.

Using novel cases has the additional effect of decreasing the likelihood of a student recalling a memorised set of responses. Presenting an original case pushes the student to apply knowledge and skills in a novel scenario. This has the further, desirable effect of moving the student’s cognition away from surface-level responses and towards deeper thinking.

Thus, CaBA allows for novel use and synthesis of assessable knowledge and skills. This decreases the likelihood of ‘textbook’ answers and may fortify the academic integrity of the exam.

Enhancing authenticity

A common concern or complaint around many assessments is that they often solicit knowledge in ways decontextualised from its application in ‘real world’ settings. In a perfect world, we might redesign all our assessment tasks for greater authenticity. A common characteristic of authentic assessment, however, is that the way in which we assess the learner (i.e. the task) replicates a ‘real world’ situation. This can be a complex, time-consuming process, and challenges can emerge in the planning, execution and marking. Therefore, it is not always practical to change the actual assessment task design to a more authentic modality. Sometimes, we have to use traditional assessment formats, like exams.

CaBA can help us get past some of the challenges of making our assessments more authentic. Having students respond to an authentic case allows us to enhance the authenticity of an assessment, even when we may not be able to revise the actual task design. CaBA allows you to place the student in a real-life or workplace scenario and require them to apply subject knowledge and skills to that scenario. An added benefit of this approach is that it allows you to position the student in different ways relative to the scenario, eliciting different types of responses: analysing the scenario, solving it, offering an alternative, and so on.

Using MCQs to develop and determine deep learning

MCQs are often seen as a necessary evil in assessment. We use them when we have to cover a huge breadth of content knowledge. Along with this comes the tacit, and sometimes explicit, assumption that MCQs can only determine surface-level learning. CaBA allows us to do some ‘myth busting’ and demonstrate that, in fact, you can use MCQs as a route for students to demonstrate characteristics of deep learning.

In these examples from the Center for Teaching at Vanderbilt University, we can see how a case-based approach shifts the nature of the MCQs from ‘right answer’ to ‘best answer’. With this shift in focus comes a shift in cognition. Students are not being asked to recall a single correct, rote piece of information. Instead, they are being challenged to analyse the elements of the case and prioritise the information therein. Thus, CaBA offers us the opportunity to use MCQs in ways that may align with more sophisticated subject outcomes and push students to do more than demonstrate memorisation.

One proviso: writing good MCQs is challenging, and writing deeper-learning MCQs is very challenging. I strongly recommend getting some experience and deepening your own knowledge of MCQ writing before attempting this. Of course, you are always welcome to reach out to me or one of my colleagues at MCSHE or Learning Environments to discuss how to advance in this!

Further learning and application

This article provides a brief overview of CaBA. To learn more about CaBA and some of the related topics touched on in this article, here are some resources and opportunities.

  • The Melbourne Centre for the Study of Higher Education features a host of relevant guides and resource materials. One in particular is ‘Multiple Choice Questions: An introductory guide.’ This guide demonstrates ways in which MCQs can be developed for deeper/higher level thinking.
  • UNSW’s Teaching Gateway provides a comprehensive overview of case-based assessment approaches and resources, as well as examples of how UNSW have integrated CaBA into their subjects.
  • TEQSA’s Expert Advice Hub features several relevant short guides, including a guide to adopting open-book exams and using authentic assessments in online delivery modes.
  • Vanderbilt University’s guide to writing better MCQs features several CaBA examples.
  • I am always happy to talk with colleagues about assessment. You can drop by one of my Q&A sessions on assessment, or attend one of the many workshops that Learning Environments and MCSHE host.
