OpenAlex · Updated hourly · Last updated: 03.05.2026, 23:13

This is an overview page with metadata for this scholarly work. The full article is available from the publisher.

BERT-based Transfer Learning in Sentence-level Anatomic Classification of Free-Text Radiology Reports

2023 · 17 citations · Radiology: Artificial Intelligence · Open Access
Open full text at the publisher

17 citations · 8 authors · published 2023

Abstract

Purpose: To assess whether transfer learning with a bidirectional encoder representations from transformers (BERT) model, pretrained on a clinical corpus, can perform sentence-level anatomic classification of free-text radiology reports, even for anatomic classes with few positive examples.

Materials and Methods: This retrospective study included radiology reports of patients who underwent whole-body PET/CT imaging from December 2005 to December 2020. Each sentence in these reports (6272 sentences) was labeled by two annotators according to body part ("brain," "head & neck," "chest," "abdomen," "limbs," "spine," or "others"). The BERT-based transfer learning approach was compared with two baseline machine learning approaches: bidirectional long short-term memory (BiLSTM) and the count-based method. Area under the precision-recall curve (AUPRC) and area under the receiver operating characteristic curve (AUC) were computed for each approach, and AUCs were compared using the DeLong test.

Results: values < .025). AUPRC results for BERT were superior to those of baselines even for classes with few labeled training data (brain: BERT, 0.95, BiLSTM, 0.11, count based, 0.41; limbs: BERT, 0.74, BiLSTM, 0.28, count based, 0.46; spine: BERT, 0.82, BiLSTM, 0.53, count based, 0.69).

Conclusion: © RSNA, 2023.
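The abstract evaluates each classifier with AUPRC and ROC AUC per anatomic class. As a minimal sketch of what these two metrics measure, the snippet below computes both from scratch for a single hypothetical class; the sentence labels and model scores are illustrative placeholders, not values from the study.

```python
def average_precision(y_true, y_score):
    """AUPRC as average precision: mean of the precision at each positive hit,
    walking the sentences in descending score order."""
    order = sorted(range(len(y_true)), key=lambda i: -y_score[i])
    hits, precisions = 0, []
    for rank, i in enumerate(order, start=1):
        if y_true[i] == 1:
            hits += 1
            precisions.append(hits / rank)
    return sum(precisions) / len(precisions)

def roc_auc(y_true, y_score):
    """ROC AUC as the probability that a randomly chosen positive sentence
    outscores a randomly chosen negative one (ties count as 0.5)."""
    pos = [s for s, y in zip(y_score, y_true) if y == 1]
    neg = [s for s, y in zip(y_score, y_true) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical binary labels ("is this sentence about the brain?") and
# classifier scores for eight sentences -- purely illustrative.
y_true = [1, 0, 0, 1, 0, 1, 0, 0]
y_score = [0.92, 0.10, 0.35, 0.80, 0.05, 0.60, 0.20, 0.15]

print(average_precision(y_true, y_score), roc_auc(y_true, y_score))
```

With this toy ranking every positive outscores every negative, so both metrics reach 1.0; on real data with class imbalance the two diverge, which is why the paper reports AUPRC alongside AUC for rare classes.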

Topics

Artificial Intelligence in Healthcare and Education · Radiology practices and education · Medical Imaging and Analysis