Certainty-Based Marking on Multiple-Choice Items: Psychometrics Meets Decision Theory.
Qian Wu, Monique Vanerum, Anouk Agten, Andrés Christiansen, Frank Vandenabeele, Jean-Michel Rigo, Rianne Janssen. Published in: Psychometrika (2021)
When a response to a multiple-choice item consists of selecting a single best answer, examiners cannot differentiate between a response that is a product of knowledge and one that is largely a product of uncertainty. Certainty-based marking (CBM) is a testing format that requires examinees to express their degree of certainty in the response option they have selected, leading to an item score that depends on both the correctness of the answer and the certainty expressed. The expected score is maximized when examinees truthfully report their level of certainty. However, prospect theory states that people do not always make the rational choice of the optimal outcome, owing to varying risk attitudes. By integrating a psychometric model with a decision-making perspective, the present study examines, in a case study, the response behavior of 334 first-year physiotherapy students on six multiple-choice examinations with CBM. We used item response theory to model the objective probability that a student gives a correct response to an item, and cumulative prospect theory to estimate students' risk attitudes when choosing a certainty level to report. The results showed that, under the given CBM scoring matrix, students' choices of a certainty level were affected by their risk attitudes. Students were generally risk averse and loss averse when they had a high success probability on an item, leading to under-reporting of their certainty. Conversely, they were risk seeking when success probabilities were small, resulting in over-reporting of certainty.
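The interplay between a CBM scoring matrix and risk attitudes described above can be illustrated with a small sketch. The scoring matrix below is a commonly cited Gardner-Medwin-style scheme (certainty levels 1-3 scoring 1/2/3 if correct and 0/-2/-6 if wrong), not necessarily the matrix used in this study, and the cumulative prospect theory parameters are Tversky and Kahneman's (1992) median estimates, used here purely as illustrative assumptions:

```python
# Hypothetical CBM scoring matrix (Gardner-Medwin style), for illustration
# only: SCORES[level] = (points if correct, points if wrong).
SCORES = {1: (1, 0), 2: (2, -2), 3: (3, -6)}

def expected_score(p, level):
    """Objective expected score for success probability p at a certainty level."""
    win, loss = SCORES[level]
    return p * win + (1 - p) * loss

def cpt_value(p, level, alpha=0.88, lam=2.25, gamma=0.61):
    """Cumulative prospect theory value of choosing a certainty level.

    Value function: v(x) = x**alpha for gains, -lam * (-x)**alpha for losses
    (lam > 1 encodes loss aversion). Probability weighting:
    w(q) = q**gamma / (q**gamma + (1-q)**gamma)**(1/gamma).
    Defaults are Tversky & Kahneman's (1992) median parameter estimates.
    """
    def w(q):
        return q**gamma / (q**gamma + (1 - q)**gamma) ** (1 / gamma)

    def v(x):
        return x**alpha if x >= 0 else -lam * (-x)**alpha

    win, loss = SCORES[level]
    return w(p) * v(win) + w(1 - p) * v(loss)

def best_level(p, value=expected_score):
    """Certainty level that maximizes the given value function at p."""
    return max(SCORES, key=lambda lv: value(p, lv))

# A truthful (expected-score-maximizing) examinee switches to level 2 at
# p = 2/3 and to level 3 at p = 0.8 under this matrix. With the loss-averse
# CPT parameters above, the CPT maximizer picks a lower certainty level
# than the truthful one at high p, i.e. it under-reports certainty.
for p in (0.3, 0.7, 0.9):
    print(p, best_level(p), best_level(p, cpt_value))
```

The switch points for the truthful examinee follow from equating expected scores: level 1 and level 2 tie where p = (4p - 2), i.e. p = 2/3, and level 2 and level 3 tie where 4p - 2 = 9p - 6, i.e. p = 0.8.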