
Interindividual differences influence multisensory processing during spatial navigation.

Silvia Zanchi, Luigi F. Cuturi, Giulio Sandini, Monica Gori
Published in: Journal of Experimental Psychology: Human Perception and Performance (2022)
When moving through space, we encode multiple sensory cues that guide our orientation through the environment. The integration of visual and self-motion cues is known to improve navigation. However, spatial navigation may also benefit from multisensory external signals. The present study aimed to investigate whether humans combine auditory and visual landmarks to improve their navigation abilities. Two experiments with different cue reliabilities were conducted. In both, participants' task was to return an object to its original location by using landmarks, which could be visual-only, auditory-only, or audiovisual. We took the error and variability of the object-relocation distance as measures of accuracy and precision, respectively. To quantify interference between cues and assess their weights, we ran a conflict condition with a spatial discrepancy between visual and auditory landmarks. Results showed comparable accuracy and precision when navigating with visual-only and audiovisual landmarks, but greater error and variability with auditory-only landmarks. Splitting participants into two groups based on the weights they assigned to the unimodal cues revealed that only those who assigned similar weights to the auditory and visual cues showed a precision benefit in the audiovisual condition. These findings suggest that whether multisensory integration occurs depends on idiosyncratic cue weighting. Future multisensory procedures to aid mobility should consider individual differences in how landmarks are encoded.
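Cue weights estimated from conflict paradigms of this kind are commonly interpreted through the reliability-weighted (maximum-likelihood) cue-combination framework. The abstract does not specify the model the authors used, so the equations below are only a sketch of that standard framework, with the unimodal variances sigma_A^2 and sigma_V^2 taken from the auditory-only and visual-only conditions:

\hat{s}_{AV} = w_V \hat{s}_V + w_A \hat{s}_A, \qquad w_V = \frac{1/\sigma_V^2}{1/\sigma_V^2 + 1/\sigma_A^2}, \qquad w_A = 1 - w_V

\sigma_{AV}^2 = \frac{\sigma_V^2 \, \sigma_A^2}{\sigma_V^2 + \sigma_A^2} \le \min(\sigma_V^2, \sigma_A^2)

Under this model, the predicted precision gain from combining cues is largest when the two unimodal reliabilities are similar (sigma_V close to sigma_A), which is consistent with the reported finding that only participants who weighted the auditory and visual landmarks similarly showed a precision benefit in the audiovisual condition.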