Human EEG and artificial neural networks reveal disentangled representations of object real-world size in natural images.

Zitong Lu, Julie D. Golomb
Published in: bioRxiv: the preprint server for biology (2023)
The human brain accurately perceives and processes the real-world size of objects despite vast differences in viewing distance and perspective, a remarkable feat of cognitive processing. Previous studies, however, struggled to distinguish perceived real-world size from perceived real-world depth, muddying the interpretation of their results. Using the THINGS EEG2 dataset, which offers more ecologically valid naturalistic stimuli than isolated objects presented without backgrounds, we disentangled representations of real-world size from visual size and perceived real-world depth in human brains and artificial neural networks. Our EEG representational similarity results revealed a pure representation of object real-world size in human brains and uncovered a visual object processing timeline: pixel-wise differences appeared first, then real-world depth and visual size, and finally real-world size. Furthermore, representational comparisons with different artificial neural networks revealed real-world size as a stable, higher-level dimension in object space that incorporates both visual and semantic information.
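
As a rough illustration of the representational similarity analysis the abstract refers to, the sketch below builds a neural representational dissimilarity matrix (RDM) from EEG response patterns and rank-correlates it with a real-world size model RDM. All names, shapes, and data here are hypothetical placeholders standing in for the study's stimuli and recordings, not the authors' actual pipeline.

import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

# Placeholder data: trial-averaged EEG patterns at one timepoint
# (n_objects x n_features) and per-object real-world size ratings.
rng = np.random.default_rng(0)
n_objects, n_features = 20, 64
eeg = rng.standard_normal((n_objects, n_features))
size_ratings = rng.random(n_objects)

# Neural RDM: pairwise correlation distance between object-evoked patterns.
neural_rdm = pdist(eeg, metric="correlation")

# Model RDM for real-world size: absolute difference in rated size.
model_rdm = pdist(size_ratings[:, None], metric="euclidean")

# RSA score: rank correlation between the two RDMs' condensed vectors.
rho, p = spearmanr(neural_rdm, model_rdm)
print(f"RSA (Spearman rho) = {rho:.3f}, p = {p:.3f}")

To disentangle real-world size from visual size and perceived depth as the study describes, one would typically extend this zero-order comparison with partial correlations that hold the competing model RDMs constant; the sketch above shows only the basic RDM-to-RDM comparison.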