This is an overview page with metadata for this scientific paper. The full article is available from the publisher.
Neurosymbolic AI for Enhanced Reasoning in Small Language Models
Citations: 0
Authors: 5
Year: 2025
Abstract
Large language models (LLMs) have rapidly growing computational requirements and an opaque decision-making process, which makes them ill-suited to medical natural language inference, where efficiency and interpretability are paramount. To address these issues in resource-constrained clinical environments, this study presents a neurosymbolic system that combines a neural module based on DistilBERT with a knowledge graph for symbolic reasoning. By coupling neural pattern recognition with structured domain knowledge, the proposed model improves accuracy, lowers energy usage, and offers more transparent decisions than standard LLMs. On the MedNLI dataset, the framework outperforms baselines such as BERT and GPT-3, achieving higher classification accuracy at substantially lower energy cost, in line with sustainable AI principles. Visualizations such as reasoning graphs and attention heatmaps make the neural-symbolic interactions explicit, supporting trustworthiness in clinical use. The model's efficiency makes it suitable for deployment on low-power devices, enabling real-time medical decision-making in resource-limited settings. By bridging neural and symbolic paradigms, the article contributes a scalable, interpretable, and sustainable AI solution for healthcare.
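To illustrate the kind of architecture the abstract describes, the sketch below shows one plausible way a DistilBERT classifier could be combined with knowledge-graph-based symbolic reasoning for MedNLI-style entailment classification. This is not the authors' implementation: the graph contents, the symbolic rule, the fine-tuned checkpoint, and the weighting are all illustrative assumptions.

```python
# Minimal neurosymbolic NLI sketch (assumptions, not the paper's code):
# a DistilBERT classifier supplies label probabilities, and a toy
# knowledge-graph lookup nudges them with a symbolic rule.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

LABELS = ["entailment", "neutral", "contradiction"]  # MedNLI label set

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
# A model fine-tuned on MedNLI is assumed; the base checkpoint is a placeholder.
model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=3
)

# Hypothetical fragment of a medical knowledge graph: term -> related terms.
KNOWLEDGE_GRAPH = {
    "myocardial infarction": {"heart attack", "chest pain"},
    "hypertension": {"high blood pressure"},
}


def neural_probs(premise: str, hypothesis: str) -> torch.Tensor:
    """Label probabilities from the neural (DistilBERT) module."""
    inputs = tokenizer(premise, hypothesis, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = model(**inputs).logits
    return torch.softmax(logits, dim=-1).squeeze(0)


def symbolic_boost(premise: str, hypothesis: str) -> torch.Tensor:
    """Toy symbolic rule: if the hypothesis mentions a graph neighbour of a
    premise term, shift probability mass toward 'entailment'."""
    boost = torch.zeros(len(LABELS))
    for term, related in KNOWLEDGE_GRAPH.items():
        if term in premise.lower() and any(r in hypothesis.lower() for r in related):
            boost[LABELS.index("entailment")] += 0.2  # illustrative weight
    return boost


def classify(premise: str, hypothesis: str) -> str:
    """Combine neural and symbolic evidence and return the top label."""
    combined = neural_probs(premise, hypothesis) + symbolic_boost(premise, hypothesis)
    return LABELS[int(torch.argmax(combined))]


print(classify("Patient admitted with myocardial infarction.",
               "The patient had a heart attack."))
```

In this toy setup the symbolic term adds a fixed bonus to the entailment score whenever the knowledge graph links a premise concept to a hypothesis concept; the paper's actual integration of the knowledge graph may differ.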
Related Works
"Why Should I Trust You?"
2016 · 14,198 citations
A Comprehensive Survey on Graph Neural Networks
2020 · 8,576 citations
Stop explaining black box machine learning models for high stakes decisions and use interpretable models instead
2019 · 8,084 citations
High-performance medicine: the convergence of human and artificial intelligence
2018 · 7,444 citations
Artificial intelligence in healthcare: past, present and future
2017 · 4,382 citations