ClinicalT5: A Generative Language Model for Clinical Text

Qiuhao Lu, Dejing Dou, Thien Nguyen


Abstract
In the past few years, large pre-trained language models (PLMs) have been widely adopted across different areas and have brought fundamental improvements to a variety of downstream tasks in natural language processing (NLP). Meanwhile, domain-specific variants of PLMs have been proposed to address the needs of domains that exhibit distinctive patterns of writing and vocabulary, e.g., BioBERT for the biomedical domain and ClinicalBERT for the clinical domain. Recently, generative language models like BART and T5 have been gaining popularity with their competitive performance on text generation as well as on tasks cast as generative problems. However, in the clinical domain, such domain-specific generative variants remain underexplored. To address this need, our work introduces ClinicalT5, a T5-based text-to-text transformer model pre-trained on clinical text. We evaluate the proposed model both intrinsically and extrinsically over a diverse set of tasks across multiple datasets, and show that ClinicalT5 dramatically outperforms T5 on domain-specific tasks and compares favorably with its close baselines.
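As the abstract notes, T5-style models cast every task as text-to-text: the input is a prompt string and the output is generated text. Below is a minimal sketch of how such a checkpoint could be loaded and queried with the Hugging Face transformers library; the model identifier is a placeholder (this page does not specify a distribution path), and the example assumes a standard T5 checkpoint layout.

# Minimal sketch: loading a T5-style seq2seq checkpoint and generating text.
# "path/to/clinical-t5-checkpoint" is a placeholder, not an official name.
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_name = "path/to/clinical-t5-checkpoint"  # hypothetical identifier
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)

# Encode a clinical-style prompt; the model returns generated token IDs,
# which are decoded back into a text continuation.
inputs = tokenizer(
    "Patient presents with chest pain and shortness of breath.",
    return_tensors="pt",
)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))

Downstream tasks (e.g., classification or named entity recognition) are handled the same way in the text-to-text framing: labels are emitted as target strings rather than predicted by a task-specific head.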
Anthology ID:
2022.findings-emnlp.398
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2022
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Editors:
Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
5436–5443
URL:
https://aclanthology.org/2022.findings-emnlp.398
DOI:
10.18653/v1/2022.findings-emnlp.398
Cite (ACL):
Qiuhao Lu, Dejing Dou, and Thien Nguyen. 2022. ClinicalT5: A Generative Language Model for Clinical Text. In Findings of the Association for Computational Linguistics: EMNLP 2022, pages 5436–5443, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
ClinicalT5: A Generative Language Model for Clinical Text (Lu et al., Findings 2022)
PDF:
https://aclanthology.org/2022.findings-emnlp.398.pdf
Note:
2022.findings-emnlp.398.note.pdf