OpenAlex · Updated hourly · Last updated: 16.03.2026, 19:44

This is an overview page with metadata for this scholarly work. The full article is available from the publisher.

Bootstrapping Large Language Models as Clinical Predictors for Multimodal Irregular Time Series

2025 · 0 citations
Open full text at the publisher

0 citations · 3 authors · Year: 2025

Abstract

Recently, leveraging the medical knowledge embedded in pre-trained large language models (LLMs) to enhance clinical prediction (e.g., patient deterioration forecasting) from multimodal irregular clinical time series (MICTS) has become a promising yet challenging research direction. In this study, we identify two major challenges currently facing this direction: 1) LLMs can miss most of the temporal dependencies among MICTS tokens if MICTS data are fed to them directly in textual format; 2) deploying an upstream encoder for temporal modeling of MICTS produces representations that are neither understandable nor applicable to LLMs. To tackle these challenges, we 1) decouple them; 2) propose Snow-CTE, a single-stream encoder that effectively models temporal dependencies (particularly long-range ones) and mitigates feature heterogeneity in MICTS; and 3) propose CT-QFormer to "translate" the sequential MICTS representations output by Snow-CTE into real-time clinical condition information understandable to LLMs. Experimental results on benchmark clinical prediction tasks show that Snow-CTE + CT-QFormer outperforms SOTA baselines in bootstrapping LLMs to perform clinical prediction from MICTS.
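The abstract describes CT-QFormer only at a high level, and no code is linked from this page. As a rough illustration of the general Q-Former-style idea it names, the sketch below (all names, shapes, and weights are assumptions for illustration, not the authors' implementation) shows a fixed set of learned query tokens cross-attending over variable-length encoder outputs, so every patient sequence is distilled into the same number of tokens an LLM adapter could consume:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def qformer_translate(enc_out, queries, Wq, Wk, Wv):
    """Single-head cross-attention of fixed learned queries over
    variable-length encoder outputs (hypothetical sketch only).

    enc_out: (T, d) sequential time-series representations; T varies per patient
    queries: (m, d) m learned query tokens; m is fixed
    returns: (m, d) fixed-size token set, independent of T
    """
    d = queries.shape[-1]
    Q = queries @ Wq                               # (m, d)
    K = enc_out @ Wk                               # (T, d)
    V = enc_out @ Wv                               # (T, d)
    attn = softmax(Q @ K.T / np.sqrt(d), axis=-1)  # (m, T)
    return attn @ V                                # (m, d)

rng = np.random.default_rng(0)
d, m = 16, 4
Wq, Wk, Wv = (rng.standard_normal((d, d)) * 0.1 for _ in range(3))
queries = rng.standard_normal((m, d))
# Two "patients" with very different sequence lengths map to the same shape.
out_a = qformer_translate(rng.standard_normal((37, d)), queries, Wq, Wk, Wv)
out_b = qformer_translate(rng.standard_normal((5, d)), queries, Wq, Wk, Wv)
print(out_a.shape, out_b.shape)  # (4, 16) (4, 16)
```

Because the query count m is fixed, the downstream LLM receives the same number of condition tokens regardless of how long or irregular the underlying series is, which matches the "translation" role the abstract attributes to CT-QFormer.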

Similar works

Authors

Institutions

Topics

Machine Learning in Healthcare · Artificial Intelligence in Healthcare and Education · Topic Modeling