OpenAlex · Updated hourly · Last updated: 06.04.2026, 11:56

This is an overview page with metadata for this scholarly work. The full article is available from the publisher.

Comparing Human- and ChatGPT-Generated Multiple-Choice Questions in Athletic Training Education

2025 · 1 citation · Journal of Athletic Training Education and Practice
Open full text at publisher

Citations: 1
Authors: 2
Year: 2025

Abstract

Context: Creating well-written multiple-choice questions (MCQs) requires time and attention to detail. Artificial intelligence tools such as ChatGPT have the potential to assist faculty members in creating exam or practice questions.

Objective: To compare human-generated athletic training–related MCQs with those generated by ChatGPT for quality, clarity, relevance, and difficulty.

Design: Cross-sectional study.

Patients or Other Participants: Ninety-three athletic training faculty teaching in Commission on Accreditation of Athletic Training Education–accredited entry-level athletic training programs completed the survey. Eleven second-year graduate-level athletic training students completed the 20-question quiz.

Main Outcome Measure(s): Faculty participants completed a 2-part survey in which they evaluated 10 pairs of MCQs for grammar, clarity, difficulty, terminology, and suitability on a 5-point Likert scale and indicated which question they preferred. Each pair included a human-generated question and a ChatGPT-generated question on a similar topic. A student quiz was developed to evaluate question quality and difficulty: second-year master's students nearing graduation were asked to complete a 20-question quiz using the same questions found in the faculty survey.

Results: The ChatGPT-generated Board of Certification–style questions used in this study were rated similarly to human-generated questions for grammar, stem quality, answer quality, question difficulty, proper use of medical terminology, and suitability of content across all 5 athletic training domains. Most ChatGPT-generated questions were easy to understand, used appropriate terminology, and had answer options similar in style and length.

Conclusions: ChatGPT is another tool that athletic training faculty may consider using to improve the quality and efficacy of exam question preparation. The data from this study suggest that faculty can use ChatGPT effectively for exam question preparation; however, faculty should understand that ChatGPT, like all tools, has its limitations.

Related works

Authors

Institutions

Topics

Artificial Intelligence in Healthcare and Education