Evidence for Independent Processing of Shape by Vision and Touch.
Ryan L Miller, David L Sheinberg. Published in: eNeuro (2022)
Although visual object recognition is well studied and relatively well understood, much less is known about how shapes are recognized by touch and how such haptic stimuli might be compared with visual shapes. One might expect the processes of visual and haptic object recognition to engage similar brain structures, given the advantage of avoiding redundant brain circuitry, and there is indeed some evidence that this is the case. A potentially fruitful approach to understanding differences in how shapes might be neurally represented is to find an algorithmic method of comparing shapes that agrees with human behavior, and then to determine whether that method differs across modality conditions. If it does not, this would provide further evidence for a shared representation of shape. We recruited human participants to perform a one-back same-different visual and haptic shape comparison task both within modality (i.e., comparing two visual shapes or two haptic shapes) and across modalities (i.e., comparing a visual shape with a haptic shape). We then used various shape metrics to predict performance from the shape, orientation, and modality of the two stimuli compared on each trial. We found that the metrics that best predicted shape comparison behavior depended heavily on the modality of the two shapes, suggesting that the features used to compare shapes differ by modality and that object recognition is not necessarily performed in a single, modality-agnostic region.
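The analysis strategy summarized above can be illustrated with a minimal sketch: compute a single shape-similarity score for each stimulus pair and fit a model predicting participants' same/different responses, separately for each modality condition. The intersection-over-union metric, the logistic model, and the names below (iou_similarity, fit_condition) are illustrative assumptions for the sketch, not the metrics or code used in the paper.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def iou_similarity(mask_a, mask_b):
        # Intersection-over-union of two binary shape silhouettes
        # (one assumed way to quantify shape similarity).
        inter = np.logical_and(mask_a, mask_b).sum()
        union = np.logical_or(mask_a, mask_b).sum()
        return inter / union if union else 0.0

    def fit_condition(pairs, responses):
        # Fit P("same" response) as a logistic function of similarity for one
        # modality condition (visual-visual, haptic-haptic, or cross-modal).
        sims = np.array([[iou_similarity(a, b)] for a, b in pairs])
        return LogisticRegression().fit(sims, np.asarray(responses))

    # Toy usage with placeholder silhouettes and alternating responses.
    rng = np.random.default_rng(0)
    masks = [rng.random((32, 32)) > 0.5 for _ in range(20)]
    pairs = [(masks[i], masks[(i + 1) % 20]) for i in range(20)]
    responses = [i % 2 for i in range(20)]
    print(fit_condition(pairs, responses).coef_)

Under this kind of analysis, comparing how well a given metric's fit predicts behavior in each condition is what would reveal whether the same shape features drive visual, haptic, and cross-modal comparisons.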