Building transformers from neurons and astrocytes.
Leo Kozachkov, Ksenia V. Kastanenka, Dmitry Krotov. Published in: Proceedings of the National Academy of Sciences of the United States of America (2023)
Glial cells account for between 50% and 90% of all human brain cells, and serve a variety of important developmental, structural, and metabolic functions. Recent experimental efforts suggest that astrocytes, a type of glial cell, are also directly involved in core cognitive processes such as learning and memory. While it is well established that astrocytes and neurons are connected to one another in feedback loops across many timescales and spatial scales, there is a gap in understanding the computational role of neuron-astrocyte interactions. To help bridge this gap, we draw on recent advances in AI and astrocyte imaging technology. In particular, we show that neuron-astrocyte networks can naturally perform the core computation of a Transformer, a particularly successful type of AI architecture. In doing so, we provide a concrete, normative, and experimentally testable account of neuron-astrocyte communication. Because Transformers are so successful across a wide variety of task domains, such as language, vision, and audition, our analysis may help explain the ubiquity, flexibility, and power of the brain's neuron-astrocyte networks.
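The "core computation of a Transformer" referred to in the abstract is scaled dot-product self-attention. As background, a minimal NumPy sketch of that computation is shown below; the function name, weight matrices, and dimensions are illustrative and not taken from the paper.

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention, the core Transformer computation.

    X: (tokens, dim) input embeddings; Wq/Wk/Wv: (dim, dim) projections.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    # Softmax over keys: each query's attention weights sum to 1.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights

rng = np.random.default_rng(0)
T, d = 4, 8  # 4 tokens, 8-dimensional embeddings (illustrative sizes)
X = rng.normal(size=(T, d))
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
out, attn = self_attention(X, Wq, Wk, Wv)
```

The paper's contribution is to show that feedback loops between neurons and astrocytes can implement this attention operation; the sketch above only fixes what that target computation is.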