This is an overview page with metadata for this scientific work. The full article is available from the publisher.
Risk-aware feature-token attention with subgroup-calibrated fairness regularization for survival
Citations: 0
Authors: 3
Year: 2026
Abstract
Survival analysis in healthcare can be plagued by demographic imbalances arising from fairness-oblivious model assumptions or training-data biases, especially in clinically consequential domains such as allogeneic hematopoietic cell transplantation (HCT). To address this problem, we introduce the Fair Survival Analysis Transformer (FairSurvTrans), a transformer model for survival forecasting that draws on design ideas from large language models (LLMs). The model applies positional encoding and multi-head self-attention to structured clinical data while incorporating a dynamic fairness-conscious loss function that penalizes performance inequality across racial subgroups. When trained on synthetically balanced HCT data, FairSurvTrans demonstrated superior predictive performance with an overall C-index of $0.6738 \pm 0.0031$, while achieving the lowest fairness penalty ($0.0002 \pm 0.0001$) among benchmark models such as CoxPH, DeepSurv, and XGBoost-AFT. The model also produced well-calibrated risk predictions for 1-year survival while maintaining consistent accuracy across the six predefined racial groups. These results demonstrate the utility of transformer models inspired by LLM architectures in equitably facilitating interpretable, clinically actionable survival modelling, enabling more responsible deployment of AI in healthcare.
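The abstract reports both a C-index and a fairness penalty that measures performance inequality across racial subgroups, but does not specify the penalty's exact form. A minimal sketch of one plausible reading, assuming Harrell's concordance index per subgroup and the variance of those per-group C-indices as the penalty (both the penalty form and the function names here are illustrative assumptions, not the paper's implementation):

```python
from itertools import combinations

def c_index(times, events, risks):
    """Harrell's concordance index: the fraction of comparable pairs
    in which the subject with higher predicted risk fails earlier.
    times: observed times; events: 1 = event, 0 = censored; risks: scores."""
    concordant, comparable = 0.0, 0
    for i, j in combinations(range(len(times)), 2):
        if times[i] == times[j]:
            continue  # tied times are skipped in this simple sketch
        first, second = (i, j) if times[i] < times[j] else (j, i)
        if not events[first]:
            continue  # pair is comparable only if the earlier time is an event
        comparable += 1
        if risks[first] > risks[second]:
            concordant += 1.0
        elif risks[first] == risks[second]:
            concordant += 0.5  # ties in risk count as half-concordant
    return concordant / comparable if comparable else 0.0

def fairness_penalty(times, events, risks, groups):
    """Hypothetical subgroup penalty: variance of per-group C-indices.
    A perfectly equitable model (equal C-index in every group) scores 0."""
    per_group = []
    for g in set(groups):
        idx = [k for k, gg in enumerate(groups) if gg == g]
        per_group.append(c_index([times[k] for k in idx],
                                 [events[k] for k in idx],
                                 [risks[k] for k in idx]))
    mean = sum(per_group) / len(per_group)
    return sum((v - mean) ** 2 for v in per_group) / len(per_group)
```

In a training loop such a penalty would be added, with a weight, to the survival loss, so that gradient updates trade a little overall concordance for reduced inter-group disparity.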
Similar works
"Why Should I Trust You?"
2016 · 14.259 Zit.
A Comprehensive Survey on Graph Neural Networks
2020 · 8.629 Zit.
Stop explaining black box machine learning models for high stakes decisions and use interpretable models instead
2019 · 8.143 Zit.
High-performance medicine: the convergence of human and artificial intelligence
2018 · 7.535 Zit.
Artificial intelligence in healthcare: past, present and future
2017 · 4.396 Zit.