This is an overview page with metadata for this scientific work. The full article is available from the publisher.
Metadata Meets LLMs: Constructing Knowledge-Rich Citation Networks with CoT-Enhanced Representations
Citations: 1
Authors: 5
Year: 2026
Abstract
Recent advances in large language models (LLMs), such as GPT and Llama, have driven significant progress in natural language processing and diverse AI applications. In this paper, we explore how LLMs can enhance the construction of heterogeneous citation networks by integrating the rich contextual information they provide. We propose a metadata-driven augmentation strategy that generates concise factual descriptions for sparse fields in citation metadata, including keywords, venues, and author affiliations. These contexts are encoded with DeBERTa and integrated as node features in a knowledge-enriched heterogeneous network. Additionally, to mitigate LLM hallucinations, we employ Chain-of-Thought (CoT) prompting and evaluate the quality of the generated contexts. Experimental results demonstrate that our LLM-powered context augmentation improves author classification by 2.0%-4.5% and author clustering by 8.9%-18.1%, outperforming traditional feature engineering methods. The dataset and source code are available at https://github.com/inthwan/Metadata-Meets-LLMs.
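The abstract outlines a pipeline in which LLM-generated descriptions of sparse metadata fields are encoded with DeBERTa and attached as node features in the heterogeneous network. The snippet below is a minimal, illustrative sketch of that encoding step only, assuming a standard Hugging Face DeBERTa checkpoint (microsoft/deberta-v3-base) and hypothetical example contexts; it is not the authors' released implementation, which is available in the linked repository.

```python
# Minimal sketch (not the authors' code): encode LLM-generated metadata
# descriptions with DeBERTa and keep the pooled embeddings as node features.
# The model name and the example contexts below are illustrative assumptions.
import torch
from transformers import AutoTokenizer, AutoModel

# Hypothetical LLM-generated descriptions for sparse metadata fields.
contexts = {
    "venue:ICML": "ICML is a major annual machine learning conference.",
    "keyword:citation network": "A citation network links papers through their references.",
}

tokenizer = AutoTokenizer.from_pretrained("microsoft/deberta-v3-base")
model = AutoModel.from_pretrained("microsoft/deberta-v3-base")
model.eval()

node_features = {}
with torch.no_grad():
    for node_id, text in contexts.items():
        inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=256)
        outputs = model(**inputs)
        # Mean-pool token embeddings into one feature vector per metadata node.
        node_features[node_id] = outputs.last_hidden_state.mean(dim=1).squeeze(0)

print({node_id: vec.shape for node_id, vec in node_features.items()})
```

These vectors would then serve as initial features for the corresponding metadata nodes (keywords, venues, affiliations) in a downstream heterogeneous graph model.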
Similar Works
Explainable Artificial Intelligence (XAI): Concepts, taxonomies, opportunities and challenges toward responsible AI
2019 · 8,553 citations
Stop explaining black box machine learning models for high stakes decisions and use interpretable models instead
2019 · 8,444 citations
High-performance medicine: the convergence of human and artificial intelligence
2018 · 7,943 citations
BioBERT: a pre-trained biomedical language representation model for biomedical text mining
2019 · 6,792 citations
Proceedings of the 19th International Joint Conference on Artificial Intelligence
2005 · 5,781 citations