Immersive scene representation in human visual cortex with ultra-wide-angle neuroimaging.
Jeongho Park, Edward Soucy, Jennifer Segawa, Ross Mair, Talia Konkle
Published in: Nature Communications (2024)
While human vision spans 220°, traditional functional MRI setups display images only up to the central 10-15°. Thus, it remains unknown how the brain represents a scene perceived across the full visual field. Here, we introduce a method for ultra-wide-angle display and probe signatures of immersive scene representation. An unobstructed view of 175° is achieved by bouncing the projected image off angled mirrors onto a custom-built curved screen. To avoid perceptual distortion, scenes are created with a wide field of view in custom virtual environments. We find that immersive scene representation drives medial cortex with far-peripheral preferences, but shows minimal modulation in classic scene regions. Further, scene- and face-selective regions maintain their content preferences even under extreme far-peripheral stimulation, highlighting that not all far-peripheral information is automatically integrated into scene regions' computations. This work provides clarifying evidence on content versus peripheral preferences in scene representation and opens new avenues for research on immersive vision.
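The gap between conventional and ultra-wide-angle displays comes down to simple visual-angle geometry: a flat screen viewed from inside the scanner bore subtends only a small angle, whereas a curved screen whose center of curvature sits at the viewer's eye subtends an angle equal to its arc. The sketch below illustrates that arithmetic; the screen dimensions and viewing distances are hypothetical values chosen for illustration, not the hardware specifications reported in the paper.

```python
import math

def flat_screen_visual_angle(width_cm: float, distance_cm: float) -> float:
    """Visual angle (degrees) subtended by a flat screen of given width,
    viewed head-on from the given distance."""
    return math.degrees(2 * math.atan(width_cm / (2 * distance_cm)))

def curved_screen_visual_angle(arc_length_cm: float, radius_cm: float) -> float:
    """Visual angle (degrees) subtended by a curved screen whose center of
    curvature coincides with the viewer's eye: the visual angle simply
    equals the arc angle (arc length / radius, in radians)."""
    return math.degrees(arc_length_cm / radius_cm)

# Hypothetical dimensions for illustration only (not the paper's setup).
print(f"Flat 30 cm screen at 120 cm: "
      f"{flat_screen_visual_angle(30, 120):.1f} deg")        # ~14 deg
print(f"Curved screen, 107 cm arc at 35 cm radius: "
      f"{curved_screen_visual_angle(107, 35):.1f} deg")      # ~175 deg
```

Under these illustrative numbers, the flat display yields roughly the 10-15° typical of standard fMRI setups, while a near-eye curved screen can reach the ~175° reported here.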
Keyphrases
- virtual reality
- high resolution
- magnetic resonance imaging