Laying the Foundation: Modern Transformers for Gold-Standard Sleep Analysis and Beyond.

William G. Coon, Mattson Ogg
Published in: bioRxiv, the preprint server for biology (2024)
Accurate sleep assessment is critical to the practice of sleep medicine and sleep research. The recent availability of large quantities of publicly available sleep data, alongside breakthroughs in AI such as transformer architectures, presents novel opportunities for data-driven discovery. Transformers are flexible neural networks that not only excel at classification tasks but can also enable data-driven discovery through un- or self-supervised learning, which requires no human annotation of the input data. While transformers have been used extensively in supervised learning for sleep stage classification, they have not been fully explored or optimized in forms designed from the ground up for un- or self-supervised learning in sleep. A necessary first step is to study these models on a canonical benchmark supervised learning task: 5-class sleep stage classification. Hence, to lay the groundwork for future data-driven discovery efforts, we evaluated optimizations of a transformer-based architecture that has already demonstrated substantial success in self-supervised learning in another domain (speech recognition) and trained it to perform the canonical 5-class sleep stage classification task, establishing foundational baselines in the sleep domain. We found that small transformer models designed from the start for (later) self-supervised learning can match other state-of-the-art automated sleep scoring techniques, while also providing the basis for future data-driven discovery efforts using large sleep data sets.
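To make the setup concrete, below is a minimal PyTorch sketch of a small transformer classifier for the 5-class sleep staging task the abstract describes. The abstract does not specify the authors' architecture or hyperparameters, so everything here (the `SleepTransformer` class, patch size, model dimensions, and the 100 Hz single-channel input) is an illustrative assumption, loosely patterned on speech models that tokenize a raw waveform with a convolutional feature encoder before a transformer stack.

```python
import torch
import torch.nn as nn

class SleepTransformer(nn.Module):
    """Illustrative sketch (not the authors' model): a small transformer
    encoder mapping one EEG epoch to the 5 canonical sleep stages
    (W, N1, N2, N3, REM). All hyperparameters are assumptions."""
    def __init__(self, n_samples=3000, patch=100, d_model=128,
                 n_heads=4, n_layers=4, n_classes=5):
        super().__init__()
        # Tokenize the raw signal into non-overlapping patches, analogous
        # to the convolutional feature encoder in speech SSL models.
        self.embed = nn.Conv1d(1, d_model, kernel_size=patch, stride=patch)
        n_tokens = n_samples // patch
        # Learned positional embeddings, one per patch token.
        self.pos = nn.Parameter(torch.zeros(1, n_tokens, d_model))
        layer = nn.TransformerEncoderLayer(
            d_model, n_heads, dim_feedforward=4 * d_model, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        # Supervised head for the canonical 5-class staging benchmark;
        # for self-supervised pretraining this head would be swapped out.
        self.head = nn.Linear(d_model, n_classes)

    def forward(self, x):                          # x: (batch, 1, n_samples)
        tokens = self.embed(x).transpose(1, 2)     # (batch, n_tokens, d_model)
        z = self.encoder(tokens + self.pos)
        return self.head(z.mean(dim=1))            # pool tokens -> 5 logits

# Usage: a standard 30 s scoring epoch at an assumed 100 Hz sampling rate
# gives 3000 samples per channel.
model = SleepTransformer()
logits = model(torch.randn(8, 1, 3000))            # -> shape (8, 5)
```

The design choice worth noting is that the classification head is deliberately separable from the encoder: the same tokenizer-plus-transformer backbone can later be pretrained with an un- or self-supervised objective on unlabeled sleep recordings, which is the data-driven discovery direction the abstract motivates.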