A bilingual speech neuroprosthesis driven by cortical articulatory representations shared between languages.

Alexander B Silva, Jessie R Liu, Sean L Metzger, Ilina Bhaya-Grossman, Maximilian E Dougherty, Margaret P Seaton, Kaylo T Littlejohn, Adelyn Tu-Chan, Karunesh Ganguly, David A Moses, Edward F Chang
Published in: Nature Biomedical Engineering (2024)
Advancements in decoding speech from brain activity have focused on a single language, so the extent to which bilingual speech production relies on unique or shared cortical activity across languages has remained unclear. Here, we leveraged electrocorticography, along with deep-learning and statistical natural-language models of English and Spanish, to record and decode activity from the speech-motor cortex of a Spanish-English bilingual with vocal-tract and limb paralysis into sentences in either language. This was achieved without requiring the participant to manually specify the target language. The decoding models relied on vocal-tract articulatory representations shared across languages, which allowed us to build a syllable classifier that generalized across a shared set of English and Spanish syllables. Transfer learning expedited training of the bilingual decoder by enabling neural data recorded in one language to improve decoding in the other language. Overall, our findings suggest shared cortical articulatory representations that persist after paralysis and enable the decoding of multiple languages without the need to train separate language-specific decoders.
Keyphrases
  • deep learning
  • machine learning
  • artificial intelligence
  • neural network