Evidence for a deep, distributed and dynamic code for animacy in human ventral anterior temporal cortex.
Timothy T Rogers, Christopher R Cox, Qihong Lu, Akihiro Shimotake, Takayuki Kikuchi, Takeharu Kunieda, Susumu Miyamoto, Ryosuke Takahashi, Akio Ikeda, Riki Matsumoto, Matthew A Lambon Ralph
Published in: eLife (2021)
How does the human brain encode semantic information about objects? This paper reconciles two seemingly contradictory views. The first proposes that local neural populations independently encode semantic features; the second, that semantic representations arise as a dynamic distributed code that changes radically with stimulus processing. Combining simulations with a well-known neural network model of semantic memory, multivariate pattern classification, and human electrocorticography, we find that both views are partially correct: information about the animacy of a depicted stimulus is distributed across ventral temporal cortex in a dynamic code possessing feature-like elements posteriorly but with elements that change rapidly and nonlinearly in anterior regions. This pattern is consistent with the view that anterior temporal lobes serve as a deep cross-modal 'hub' in an interactive semantic network, and more generally suggests that tertiary association cortices may adopt dynamic distributed codes difficult to detect with common brain imaging methods.
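The abstract describes time-resolved multivariate pattern classification of animacy from electrocorticography signals. As a minimal illustration of that kind of analysis (not the authors' actual pipeline), the sketch below decodes a binary animate/inanimate label from synthetic trial-by-electrode-by-time data with a nearest-centroid classifier fit separately at each time sample; all array shapes, the injected time-varying signal, and the classifier choice are assumptions for demonstration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for ECoG data (shapes are illustrative, not from the
# paper): trials x electrodes x time samples.
n_trials, n_electrodes, n_times = 80, 16, 50
labels = rng.integers(0, 2, n_trials)  # 0 = inanimate, 1 = animate

# Inject a label-dependent spatial pattern that drifts over time, mimicking
# a dynamic code whose electrode pattern changes across the trial.
X = rng.normal(0.0, 1.0, (n_trials, n_electrodes, n_times))
for t in range(n_times):
    pattern = np.sin(np.arange(n_electrodes) + 0.2 * t)
    X[labels == 1, :, t] += pattern

def decode_at_time(X, y, t, train_idx, test_idx):
    """Nearest-centroid animacy decoding at a single time sample."""
    Xt = X[:, :, t]
    c0 = Xt[train_idx][y[train_idx] == 0].mean(axis=0)  # inanimate centroid
    c1 = Xt[train_idx][y[train_idx] == 1].mean(axis=0)  # animate centroid
    d0 = np.linalg.norm(Xt[test_idx] - c0, axis=1)
    d1 = np.linalg.norm(Xt[test_idx] - c1, axis=1)
    pred = (d1 < d0).astype(int)
    return (pred == y[test_idx]).mean()

# Simple held-out split; a full temporal-generalization analysis would also
# test classifiers trained at one time on data from every other time.
train_idx = np.arange(0, 60)
test_idx = np.arange(60, 80)
acc = [decode_at_time(X, labels, t, train_idx, test_idx) for t in range(n_times)]
print("mean decoding accuracy:", round(float(np.mean(acc)), 2))
```

Because the injected pattern changes with `t`, a classifier trained at one time sample would transfer poorly to distant samples, which is the signature of a dynamic rather than static code.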
Keyphrases
- neural network
- machine learning