OpenAlex · Updated hourly · Last updated: 18.03.2026, 23:05

This is an overview page with metadata for this scholarly work. The full article is available from the publisher.

Transformer Language Models for Neurology Research with Electronic Health Records: Current State of the Science

2025 · 0 Citations · Seminars in Neurology
Open full text at publisher

Citations: 0 · Authors: 3 · Year: 2025

Abstract

This review provides an overview of the emergence and application of transformer-based language models in electronic health records in neurology. Transformer architectures are well-suited for neurological data due to their ability to model complex spatiotemporal patterns and capture long-range dependencies, both characteristic of neurological conditions and their documentation. We introduce the foundational principles of transformer models and outline the model training and evaluation frameworks commonly used in clinical text processing. We then examine current applications of transformers in neurology, spanning disease detection and diagnosis, phenotyping and symptom extraction, and outcome and prognosis prediction, and synthesize emerging patterns in model adaptation and evaluation strategies. Additionally, we discuss the limitations of current models, including generalizability, model bias, and data privacy, and propose future directions for research and implementation. By synthesizing recent advances, this review aims to guide future efforts in leveraging transformer-based language models to improve neurological care and research.

Topics

Machine Learning in Healthcare · Artificial Intelligence in Healthcare and Education · Genomics and Rare Diseases