A comparative study of pretrained language models for long clinical text.

Yikuan Li, Ramsey M Wehbe, Faraz S Ahmad, Hanyin Wang, Yuan Luo
Published in: Journal of the American Medical Informatics Association (JAMIA), 2022
This study demonstrates that clinical knowledge-enriched long-sequence transformers are able to learn long-term dependencies in long clinical text. Our methods can also inspire the development of other domain-enriched long-sequence transformers.
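The long-sequence transformers studied here (Longformer- and BigBird-style models) raise the usual 512-token input limit of BERT-style encoders to several thousand tokens, which is what allows them to capture long-range dependencies across an entire clinical note. Below is a minimal sketch, assuming the Hugging Face transformers library and a Longformer-style checkpoint; the model ID and file name are illustrative, not the authors' released pipeline:

```python
# Minimal sketch: embed a long clinical note with a long-sequence
# transformer. The checkpoint ID below is an assumption; substitute
# any Longformer-style model available on the Hugging Face Hub.
from transformers import AutoTokenizer, AutoModel

MODEL_ID = "yikuan8/Clinical-Longformer"  # assumed Hub checkpoint ID

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModel.from_pretrained(MODEL_ID)

# A long clinical document, e.g. a discharge summary (illustrative path).
note = open("discharge_summary.txt").read()

# Longformer-style models accept up to 4,096 tokens per sequence,
# roughly 8x the 512-token limit of standard BERT variants.
inputs = tokenizer(note, truncation=True, max_length=4096, return_tensors="pt")
outputs = model(**inputs)

# Use the first token's hidden state as a document-level representation.
doc_embedding = outputs.last_hidden_state[:, 0]
```

A downstream classifier (e.g. for phenotyping or readmission prediction) would typically be fine-tuned on top of such document representations rather than on truncated 512-token inputs.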