This is an overview page with metadata for this scientific paper. The full article is available from the publisher.
Enhancing Drug Recommendations Via Heterogeneous Graph Representation Learning in EHR Networks
Citations: 24
Authors: 4
Year: 2023
Abstract
Electronic health records (EHRs) contain vast amounts of medical information, such as diagnoses, medications, and procedures, enabling personalized drug recommendations and treatment adjustments. However, current drug recommendation methods model patients' health conditions from EHR data alone, neglecting the rich relationships within the data. This paper seeks to represent EHRs as a heterogeneous information network (HIN) and to develop a graph representation learning method for medication recommendation. Three critical issues must be addressed: (1) co-occurrence of a diagnosis and a drug for the same patient does not imply their relevance; (2) a patient's directly associated information may not suffice to reflect their health condition; and (3) a cold-start problem arises when patients have no historical EHRs. To tackle these challenges, we develop a bi-channel heterogeneous local structural encoder that decouples and extracts the diverse information in the HIN. Additionally, a global information capture and fusion module, which aggregates meta-paths to form a global representation, is introduced to fill the information gaps in records. A longitudinal model using the rich structural information available in EHR data is proposed to recommend drugs to new patients. Experimental results on real-world EHR data demonstrate significant improvements over existing approaches.
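The meta-path aggregation mentioned in the abstract can be illustrated with a minimal sketch. The toy graph, node names, and the helper `metapath_neighbors` below are hypothetical and not taken from the paper; they only show how following a meta-path such as patient→diagnosis→drug over a heterogeneous EHR graph yields candidate drugs, and how patient→diagnosis→patient finds patients with shared conditions.

```python
# Hypothetical sketch: meta-path traversal in a tiny heterogeneous EHR graph
# with three node types (patient, diagnosis, drug). Illustrative only.
from collections import defaultdict

# Edge sets of the heterogeneous graph: (patient, diagnosis) and (diagnosis, drug).
patient_diag = [("p1", "d_hypertension"), ("p2", "d_hypertension"), ("p2", "d_diabetes")]
diag_drug = [("d_hypertension", "m_lisinopril"), ("d_diabetes", "m_metformin")]

def metapath_neighbors(start, path_edges):
    """Follow a meta-path, given as a list of edge sets, starting from
    `start`, and return the set of nodes reached at the path's end."""
    frontier = {start}
    for edges in path_edges:
        adj = defaultdict(set)
        for u, v in edges:
            adj[u].add(v)
            adj[v].add(u)  # traverse each relation in both directions
        frontier = {w for n in frontier for w in adj[n]}
    return frontier

# Meta-path patient -> diagnosis -> drug: candidate drugs for patient p1.
candidates = metapath_neighbors("p1", [patient_diag, diag_drug])

# Meta-path patient -> diagnosis -> patient: patients sharing a diagnosis with p1.
similar = metapath_neighbors("p1", [patient_diag, patient_diag]) - {"p1"}
```

In the paper's setting, such meta-path neighborhoods would feed a learned aggregation (the global information capture and fusion module) rather than a plain set union; this sketch covers only the traversal step.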
Related Works
Detecting Functionality-Specific Vulnerabilities via Retrieving Individual Functionality-Equivalent APIs in Open-Source Repositories
2025 · 16,105 citations
A tutorial on spectral clustering
2007 · 10,164 citations
The Graph Neural Network Model
2008 · 9,081 citations
Authoritative sources in a hyperlinked environment
1999 · 9,029 citations
A Comprehensive Survey on Graph Neural Networks
2020 · 8,883 citations