Leveraging facial expressions and contextual information to investigate opaque representations of emotions.
Stefano Anzellotti, Sean Dae Houlihan, Samuel T. Liburd, Rebecca R. Saxe
Published in: Emotion (Washington, D.C.) (2019)
Observers attribute emotions to others by relying on multiple cues, including facial expressions and information about the situation. Recent research has used Bayesian models to study how these cues are integrated. Existing studies have used a variety of tasks to probe emotion inferences, but limited attention has been devoted to the possibility that different decision processes might be involved depending on the task. If this is the case, understanding emotion representations might require understanding the decision processes through which those representations give rise to judgments. This article 1) shows that the different tasks used in the literature yield very different results, 2) proposes an account of the decision processes involved that explains these differences, and 3) tests novel predictions of this account. The results offer new insights into how emotions are represented and, more broadly, demonstrate the importance of taking decision processes into account in Bayesian models of cognition.
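As an illustrative sketch only (the abstract does not specify the authors' generative or decision models, and the notation e, f, c is introduced here for exposition), Bayesian integration of facial and situational cues is often written under a conditional-independence assumption:

P(e | f, c) ∝ P(f | e) · P(c | e) · P(e)

where e is a candidate emotion, f the observed facial expression, and c the situational context. On this kind of account, a task-dependent decision process would then map the posterior onto a judgment, for example selecting the maximum a posteriori emotion in a forced-choice task versus reporting graded posterior probabilities in a rating task, which is one way different tasks could yield different results from the same underlying representation.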