Complexity of STG signals and linguistic rhythm: a methodological study for EEG data.
Silvana Silva Pereira, Ege Ekin Özer, Núria Sebastián-Gallés
Published in: Cerebral Cortex (New York, N.Y. : 1991) (2024)
The superior temporal gyrus and Heschl's gyrus of the human brain play a fundamental role in speech processing. Neurons synchronize their activity to the amplitude envelope of the speech signal to extract acoustic and linguistic features, a process known as neural tracking or entrainment. Electroencephalography (EEG) has been used extensively in language-related research because of its high temporal resolution and low cost, but it does not allow precise source localization. Motivated by the lack of a unified methodology for interpreting source-reconstructed signals, we propose a method based on modularity and signal complexity. The procedure was tested on data from an experiment investigating the impact of native language on the tracking of linguistic rhythms in two groups: English natives and Spanish natives. That experiment found no effect of native language but an effect of language rhythm. Here, we compare source-projected signals in the auditory areas of both hemispheres across conditions using nonparametric permutation tests, modularity, and a dynamical complexity measure. Complexity increased as the regularity of the stimuli decreased, suggesting that languages with less complex rhythms are easier for the auditory cortex to track.
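As an illustration of the statistical approach named in the abstract, the sketch below shows a basic two-sided nonparametric permutation test on the difference of means between two conditions. It is a minimal, generic version of the technique, not the authors' pipeline; the data, group names, and effect sizes are entirely hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def permutation_test(a, b, n_perm=10_000, rng=rng):
    """Two-sided permutation test on the difference of group means.

    Shuffles the pooled samples and counts how often the permuted
    difference is at least as extreme as the observed one.
    """
    observed = a.mean() - b.mean()
    pooled = np.concatenate([a, b])
    count = 0
    for _ in range(n_perm):
        perm = rng.permutation(pooled)
        diff = perm[:len(a)].mean() - perm[len(a):].mean()
        if abs(diff) >= abs(observed):
            count += 1
    # Add-one smoothing avoids reporting an exact p = 0.
    return (count + 1) / (n_perm + 1)

# Hypothetical per-trial complexity values under two rhythm conditions.
regular = rng.normal(0.5, 0.1, 40)    # illustrative "regular rhythm" group
irregular = rng.normal(0.6, 0.1, 40)  # illustrative "irregular rhythm" group
p = permutation_test(regular, irregular)
```

Because no distributional assumptions are made, this style of test is well suited to comparing derived quantities such as complexity values across stimulus conditions.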