
Multisensory causal inference is feature-specific, not object-based.

Stephanie Badde, Michael S. Landy, Wendy J. Adams
Published in: Philosophical Transactions of the Royal Society of London. Series B, Biological Sciences (2023)
Multisensory integration depends on causal inference about the sensory signals. We tested whether implicit causal-inference judgements pertain to entire objects or focus on task-relevant object features. Participants in our study judged virtual visual, haptic and visual-haptic surfaces with respect to two features, slant and roughness, against an internal standard in a two-alternative forced-choice task. Modelling of participants' responses revealed that the degree to which their perceptual judgements were based on integrated visual-haptic information varied unsystematically across features. For example, a perceived mismatch between visual and haptic roughness would not deter the observer from integrating visual and haptic slant. These results indicate that participants based their perceptual judgements on a feature-specific selection of information, suggesting that multisensory causal inference proceeds not at the object level but at the level of single object features. This article is part of the theme issue 'Decision and control processes in multisensory perception'.
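
To illustrate the kind of causal-inference modelling the abstract refers to, the sketch below implements the standard Bayesian causal-inference formulation for a single feature (e.g., slant) estimated from one visual and one haptic measurement, in the style of Körding et al. (2007). It is not the authors' fitted model; the function name `causal_inference_estimate` and all numeric parameter values are illustrative assumptions.

```python
import numpy as np

def gauss(x, mu, var):
    """Gaussian density with mean mu and variance var."""
    return np.exp(-(x - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)

def causal_inference_estimate(x_v, x_h, sigma_v, sigma_h,
                              mu_p, sigma_p, p_common):
    """Return p(common cause | measurements) and the model-averaged
    estimate of the feature underlying the visual measurement x_v."""
    var_v, var_h, var_p = sigma_v ** 2, sigma_h ** 2, sigma_p ** 2

    # Likelihood of both measurements under a single common cause
    # (the prior over the feature value is integrated out analytically).
    var_sum = var_v * var_h + var_v * var_p + var_h * var_p
    like_c1 = np.exp(-((x_v - x_h) ** 2 * var_p
                       + (x_v - mu_p) ** 2 * var_h
                       + (x_h - mu_p) ** 2 * var_v) / (2 * var_sum)) \
              / (2 * np.pi * np.sqrt(var_sum))

    # Likelihood under two independent causes.
    like_c2 = gauss(x_v, mu_p, var_v + var_p) * gauss(x_h, mu_p, var_h + var_p)

    # Posterior probability of a common cause.
    post_c1 = like_c1 * p_common / (like_c1 * p_common
                                    + like_c2 * (1 - p_common))

    # Reliability-weighted fused estimate (common cause) and
    # vision-only estimate (independent causes), each combined with the prior.
    s_fused = (x_v / var_v + x_h / var_h + mu_p / var_p) \
              / (1 / var_v + 1 / var_h + 1 / var_p)
    s_vis = (x_v / var_v + mu_p / var_p) / (1 / var_v + 1 / var_p)

    # Model averaging: weight the two estimates by the causal posterior.
    return post_c1, post_c1 * s_fused + (1 - post_c1) * s_vis

# Example: a small visual-haptic conflict in slant (degrees), illustrative values.
p_c1, slant_hat = causal_inference_estimate(
    x_v=20.0, x_h=25.0, sigma_v=3.0, sigma_h=5.0,
    mu_p=0.0, sigma_p=30.0, p_common=0.5)
print(f"p(common cause) = {p_c1:.2f}, visual slant estimate = {slant_hat:.1f} deg")
```

The paper's feature-specific claim corresponds to running an inference of this kind separately per feature (slant, roughness), so that the causal posterior for one feature need not constrain integration of the other.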