This is an overview page with metadata for this scientific publication. The full article is available from the publisher.
A modular and interpretable framework for tabular data analysis using LLaMA 7B: Enhancing preprocessing, modeling, and explainability with local language models
Citations: 0
Authors: 2
Year: 2026
Abstract
Predicting whether a patient will attend a scheduled medical appointment is essential for reducing inefficiencies in healthcare systems and optimizing resource allocation. This study introduces a local, LLM-assisted pipeline that uses LLaMA 7B solely to automate semantic preprocessing such as column renaming, datatype inference, and cleaning recommendations while the predictive task is performed by classical machine-learning models. Applied to the Medical Appointment No-Shows dataset, the pipeline spans dataset analysis, feature transformation, classification, SHAP-based explainability, and system profiling. Using LLM-guided preprocessing, the downstream XGBoost classifier achieved an overall accuracy of 80%, with an F1-score of 0.89 for the majority Show class and 0.03 for the minority No-show class, reflecting the strong class imbalance in the dataset. The AUC-ROC reached 0.65 and the precision-recall AUC was 0.87, driven primarily by majority-class performance. SHAP analysis identified waiting days, age, and SMS notifications as the most influential predictors. Overall, the results demonstrate that local large language models can enhance preprocessing and interpretability within an efficient, deployable workflow for tabular prediction tasks, while classical supervised models remain responsible for final prediction.
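The semantic preprocessing the abstract attributes to LLaMA 7B (column renaming and datatype inference) can be sketched as follows. This is a minimal, illustrative stand-in: the paper does not specify the model's prompts or interface, so `suggest_column_name` here uses a deterministic normalizer in place of the actual LLM suggestion, and `infer_dtype` is a deliberately simple heuristic.

```python
import re

def suggest_column_name(raw: str) -> str:
    """Normalize a raw column header to snake_case.

    Hypothetical stand-in for the LLM's semantic renaming step."""
    # Replace any run of non-alphanumeric characters with an underscore
    name = re.sub(r"[^0-9a-zA-Z]+", "_", raw).strip("_")
    # Split camelCase boundaries (e.g. "PatientId" -> "Patient_Id")
    name = re.sub(r"(?<=[a-z0-9])(?=[A-Z])", "_", name)
    return name.lower()

def infer_dtype(values):
    """Crude datatype inference over string-valued cells."""
    if all(v.lstrip("-").isdigit() for v in values):
        return "int"
    try:
        for v in values:
            float(v)
        return "float"
    except ValueError:
        return "str"

# Example headers resembling the Medical Appointment No-Shows dataset
raw_cols = ["PatientId", "ScheduledDay", "SMS_received", "No-show"]
print([suggest_column_name(c) for c in raw_cols])
# ['patient_id', 'scheduled_day', 'sms_received', 'no_show']
```

In the pipeline described above, such cleaned column names and inferred datatypes would then feed the classical XGBoost classifier, which remains responsible for the actual prediction.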
Related works
"Why Should I Trust You?"
2016 · 14,294 citations
A Comprehensive Survey on Graph Neural Networks
2020 · 8,666 citations
Stop explaining black box machine learning models for high stakes decisions and use interpretable models instead
2019 · 8,189 citations
High-performance medicine: the convergence of human and artificial intelligence
2018 · 7,588 citations
Artificial intelligence in healthcare: past, present and future
2017 · 4,405 citations