OpenAlex · Updated hourly · Last updated: 06.05.2026, 04:24

This is an overview page with metadata for this scholarly work. The full article is available from the publisher.

Assessing the suitability of ChatGPT in responding to public inquiries about dental crown restorations

2025 · 0 citations · BMC Oral Health · Open Access
Open full text at the publisher

Citations: 0
Authors: 6
Year: 2025

Abstract

AIM: This study aimed to assess the effectiveness of ChatGPT as an informational tool for patients considering dental crown restorations.

METHODS: Common patient inquiries about dental crown restorations were identified through online tools: Also Asked, Answer the Public, and Google's People Also Ask. The questions were categorized into three groups: Group 1 - functionality and appearance, Group 2 - material differences and cost, and Group 3 - longevity, shortcomings, and maintenance. ChatGPT 3.5 was used to generate responses to these questions. Responses were evaluated for usefulness; readability (FKG and SMOG indices); quality (Global Quality Scale); reliability (CLEAR tool); and understandability and actionability (PEMAT). Statistical analyses included ANOVA, Kruskal-Wallis, Mann-Whitney U, and Spearman correlation tests, performed using SPSS v20.

RESULTS: A total of 126 patient questions about dental crown restorations were analyzed. Most ChatGPT responses were rated as 'very useful' (77%) or 'useful' (20.6%), with responses on longevity, shortcomings, and maintenance receiving the highest usefulness ratings. Readability analyses indicated that approximately 53% of responses were difficult to read overall, though responses on longevity and maintenance were easier to read. Quality and reliability were high across all responses, with mean GQS and CLEAR scores of 4.70/5 and 24.37/25, respectively. Understandability and actionability (PEMAT) scores varied, with responses on longevity and maintenance performing best.

CONCLUSION: Our results suggest that ChatGPT may serve as a supplementary tool in addressing patient inquiries about dental crown restorations. However, its responses require improvement in readability, actionability, and technical specificity. These findings indicate that while AI tools like ChatGPT can complement patient education, they cannot replace professional dental consultation and guidance.
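The readability measures named in the abstract, FKG (Flesch-Kincaid Grade) and SMOG, are both simple closed-form formulas over word, sentence, and syllable counts. As a rough illustration only, the sketch below applies the standard published coefficients for each index; the paper's exact counting rules and tooling are not specified here, so the functions take pre-computed counts as inputs.

```python
import math

def flesch_kincaid_grade(words: int, sentences: int, syllables: int) -> float:
    """Flesch-Kincaid Grade Level using the standard published coefficients.

    Higher values mean harder text (roughly a U.S. school grade level).
    """
    return 0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59

def smog_index(polysyllables: int, sentences: int) -> float:
    """SMOG grade (McLaughlin's formula), scaled to a 30-sentence sample.

    `polysyllables` is the count of words with three or more syllables.
    """
    return 1.0430 * math.sqrt(polysyllables * (30 / sentences)) + 3.1291

# Illustrative counts for a hypothetical response (not from the paper):
# 100 words, 5 sentences, 150 syllables, 30 polysyllabic words.
fkg = flesch_kincaid_grade(100, 5, 150)
smog = smog_index(30, 30)
```

For this illustrative input the FKG comes out around grade 9.9, which under the common convention (grade > 8 or so) would count as "difficult to read" for patient-facing material; the study reports roughly 53% of responses falling into that difficult band.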

Similar works

Authors

Institutions

Topics

Artificial Intelligence in Healthcare and Education · Clinical Reasoning and Diagnostic Skills · Radiology practices and education