This is an overview page with metadata for this scientific work. The full article is available from the publisher.
XynAI: A Conversational Multi-Modal AI for Integrated Cognitive, Behavioral, and Clinical Health Assessment.
Citations: 0
Authors: 5
Year: 2025
Abstract
Conventional healthcare frameworks typically gather patient information in isolated and time-bound episodes, which delays diagnosis and limits the continuity of clinical interpretation. This research presents XynAI, an integrated cognitive and behavioral health assessment platform that unifies conversational AI, adaptive cognitive training, multi-modal Parkinson’s diagnostics, and AI-driven medical document extraction. The system analyzes text using ONNX-based sentiment and mental-state models, conducts motor and speech assessments using tapping-latency, drawing-trajectory, and vowel-phonation metrics, and extracts clinical entities from unstructured medical PDFs using Docling OCR and LLM-based semantic parsing. Data from multiple modalities are standardized and integrated through a secure Supabase PostgreSQL architecture, enabling the generation of continuous, longitudinal clinical insights. The platform demonstrates the feasibility of delivering continuous, personalized, and explainable digital health assessment across cognitive, neurological, and behavioral dimensions.

Key Words: multi-modal health analysis, conversational AI, cognitive impairment, Parkinson’s screening, document intelligence, explainable AI.
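The abstract mentions tapping-latency metrics for motor assessment but this overview page does not specify how they are computed. A minimal sketch of how inter-tap latency statistics might be derived from recorded tap timestamps; the function name and metric definitions are assumptions for illustration, not the paper's actual method:

```python
from statistics import mean, stdev

def tapping_latency_metrics(tap_times_ms):
    """Compute simple inter-tap latency statistics from a list of
    tap timestamps in milliseconds. Hypothetical metric definitions;
    the paper's actual formulas are not given on this page."""
    # Inter-tap intervals: difference between consecutive timestamps
    intervals = [b - a for a, b in zip(tap_times_ms, tap_times_ms[1:])]
    return {
        "mean_latency_ms": mean(intervals),
        "latency_sd_ms": stdev(intervals) if len(intervals) > 1 else 0.0,
        "tap_count": len(tap_times_ms),
    }

# Example: taps recorded at roughly 300 ms intervals
metrics = tapping_latency_metrics([0, 310, 590, 905, 1200])
print(metrics["mean_latency_ms"])  # → 300.0
```

In a system like the one described, such per-session statistics would presumably be written to the backend store for longitudinal comparison across sessions.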
Related Works
"Why Should I Trust You?"
2016 · 14.210 Zit.
A Comprehensive Survey on Graph Neural Networks
2020 · 8.586 Zit.
Stop explaining black box machine learning models for high stakes decisions and use interpretable models instead
2019 · 8.102 Zit.
High-performance medicine: the convergence of human and artificial intelligence
2018 · 7.468 Zit.
Artificial intelligence in healthcare: past, present and future
2017 · 4.383 Zit.