Generative large language models are all-purpose text analytics engines: text-to-text learning is all your need.

Cheng Peng, Xi Yang, Aokun Chen, Zehao Yu, Kaleb E Smith, Anthony B Costa, Mona G Flores, Jiang Bian, Yonghui Wu
Published in: Journal of the American Medical Informatics Association : JAMIA (2024)
The proposed approach achieved state-of-the-art performance on 5 of 7 major clinical NLP tasks using one unified generative LLM. It outperformed previous task-specific transformer models by ∼3% for concept extraction and 7% for relation extraction applied to social determinants of health, by 3.4% for clinical concept normalization, by 3.4%-10% for clinical abbreviation disambiguation, and by 5.5%-9% for natural language inference. The approach also outperformed a previously developed prompt-based machine reading comprehension (MRC) model, GatorTron-MRC, for clinical concept and relation extraction. The proposed approach can deliver the "one model for all" promise from training to deployment using a unified generative LLM.
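The core idea of text-to-text learning is to recast heterogeneous NLP tasks (extraction, normalization, disambiguation, inference) as a single generation problem: a task instruction plus the input text goes in, and the answer comes out as text. The sketch below illustrates this unification step with hypothetical prompt templates; the task names and template wording are assumptions for illustration, not the paper's exact prompts.

```python
# Hypothetical sketch of unifying clinical NLP tasks into one
# text-to-text format, so a single generative LLM can serve them all.
# Templates and task names are illustrative, not the paper's own.

def to_text_to_text(task: str, text: str, extra: str = "") -> str:
    """Build a unified prompt: a task instruction prefix plus the input text."""
    templates = {
        "concept_extraction": "extract clinical concepts: {text}",
        "relation_extraction": "extract relations: {text}",
        "abbreviation": "expand the abbreviation: {text} abbreviation: {extra}",
        "nli": ("does the premise entail the hypothesis? "
                "premise: {text} hypothesis: {extra}"),
    }
    return templates[task].format(text=text, extra=extra)

# Two very different tasks become the same kind of input for the model:
print(to_text_to_text("concept_extraction", "Pt denies chest pain or SOB."))
print(to_text_to_text("nli", "The patient is afebrile.",
                      "The patient has no fever."))
```

Because every task shares one input/output format, a single model checkpoint covers all of them, which is what enables the "one model for all" deployment the abstract describes.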
Keyphrases
  • healthcare
  • autism spectrum disorder
  • smoking cessation
  • working memory
  • big data
  • social media
  • neural network