Resolving the time course of visual and auditory object categorization.
Polina Iamshchinina, Agnessa Karapetian, Daniel Kaiser, Radoslaw Martin Cichy. Published in: Journal of Neurophysiology (2022)
Humans can effortlessly categorize objects, both when they are conveyed through visual images and through spoken words. To resolve the neural correlates of object categorization, studies have so far primarily focused on the visual modality. It is therefore still unclear how the brain extracts categorical information from auditory signals. In the current study, we used EEG (n = 48) and time-resolved multivariate pattern analysis to investigate 1) the time course with which object category information emerges in the auditory modality and 2) how the representational transition from individual object identification to category representation compares between the auditory and the visual modality. Our results show that 1) auditory object category representations can be reliably extracted from EEG signals and 2) a similar representational transition occurs in the visual and auditory modalities, where an initial representation at the individual-object level is followed by a subsequent representation of the objects' category membership. Altogether, our results suggest an analogous hierarchy of information processing across sensory channels. However, there was no convergence toward conceptual modality-independent representations, thus providing no evidence for a shared supramodal code.

NEW & NOTEWORTHY Object categorization operates on inputs from different sensory modalities, such as vision and audition. This process has mainly been studied in vision. Here, we explore auditory object categorization. We show that auditory object category representations can be reliably extracted from EEG signals and that, similar to vision, auditory representations initially carry information about individual objects, followed by a subsequent representation of the objects' category membership.
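To make the analysis approach named in the abstract concrete, below is a minimal sketch of time-resolved multivariate pattern analysis (MVPA) on epoched EEG data: a separate classifier is trained and cross-validated at every time point, and the resulting accuracy time course indicates when category information becomes decodable. This is an illustrative outline only, not the authors' pipeline; the data array, labels, and parameters are hypothetical placeholders.

```python
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

# Hypothetical epoched EEG data: trials x channels x time points,
# with one object-category label per trial (placeholder random data).
rng = np.random.default_rng(0)
n_trials, n_channels, n_times = 200, 64, 100
eeg = rng.standard_normal((n_trials, n_channels, n_times))
labels = rng.integers(0, 2, size=n_trials)  # e.g., two object categories

# Linear classifier with channel-wise standardization.
clf = make_pipeline(StandardScaler(), LinearSVC(max_iter=5000))

# Time-resolved decoding: fit and cross-validate the classifier
# independently at each time point across the epoch.
accuracy = np.empty(n_times)
for t in range(n_times):
    accuracy[t] = cross_val_score(clf, eeg[:, :, t], labels, cv=5).mean()

# The accuracy time course (chance = 0.5 here) can then be tested against
# chance to estimate when category information emerges in the signal.
print(accuracy.round(2))
```

In practice, such time courses are computed per participant and compared against chance with appropriate statistics; the same logic extends to cross-modal comparisons by training and testing across stimulus sets or modalities.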