Immersive scene representation in human visual cortex with ultra-wide-angle neuroimaging.
Jeongho Park, Edward Soucy, Jennifer Segawa, Ross Mair, Talia Konkle
Published in: Nature Communications (2024)
While human vision spans 220°, traditional functional MRI setups display images covering only the central 10-15°. Thus, it remains unknown how the brain represents a scene perceived across the full visual field. Here, we introduce a method for ultra-wide-angle display and probe signatures of immersive scene representation. An unobstructed view of 175° is achieved by bouncing the projected image off angled mirrors onto a custom-built curved screen. To avoid perceptual distortion, scenes are rendered with a wide field of view from custom virtual environments. We find that immersive scene representation drives medial cortex with far-peripheral preferences but shows minimal modulation in classic scene regions. Further, scene- and face-selective regions maintain their content preferences even with extreme far-peripheral stimulation, highlighting that not all far-peripheral information is automatically integrated into the computations of scene regions. This work provides clarifying evidence on content versus peripheral preferences in scene representation and opens new avenues for research on immersive vision.
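For intuition about why a curved screen is needed to reach such a wide view, the sketch below works through the basic screen-geometry arithmetic: a flat display subtends a visual angle of 2·atan(width / 2d) at viewing distance d, whereas a screen curved along a circle centered on the observer's eye subtends its arc length divided by its radius. The specific dimensions are illustrative assumptions, not the authors' reported measurements.

```python
import math

def flat_screen_fov(width_cm: float, distance_cm: float) -> float:
    """Horizontal visual angle (deg) subtended by a flat screen
    centered on the line of sight at a given viewing distance."""
    return math.degrees(2 * math.atan((width_cm / 2) / distance_cm))

def curved_screen_fov(arc_length_cm: float, radius_cm: float) -> float:
    """Horizontal visual angle (deg) subtended by a screen curved
    along a circle whose center is at the observer's eye."""
    return math.degrees(arc_length_cm / radius_cm)

# Illustrative numbers only (not the setup reported in the paper):
print(flat_screen_fov(width_cm=30, distance_cm=100))      # ~17 deg, typical bore display
print(curved_screen_fov(arc_length_cm=90, radius_cm=30))  # ~172 deg, close curved screen
```

The comparison illustrates the constraint motivating the design: at typical scanner viewing distances, a flat screen cannot approach the far periphery, whereas a nearby eye-centered curved surface can.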