OpenAlex · Updated hourly · Last updated: 12.03.2026, 05:41

This is an overview page with metadata for this scholarly work. The full article is available from the publisher.

Enhancing UX Evaluation Through Collaboration with Conversational AI Assistants: Effects of Proactive Dialogue and Timing

2024 · 28 citations · 4 authors · Open Access
Open full text at publisher

Abstract

Usability testing is vital for enhancing the user experience (UX) of interactive systems. However, analyzing test videos is complex and resource-intensive. Recent AI advancements have spurred exploration into human-AI collaboration for UX analysis, particularly through natural language. Unlike user-initiated dialogue, our study investigated the potential of proactive conversational assistants to aid UX evaluators through automatic suggestions at three distinct times: before, in sync with, and after potential usability problems. We conducted a hybrid Wizard-of-Oz study involving 24 UX evaluators, using ChatGPT to generate automatic problem suggestions and a human actor to respond to impromptu questions. While timing did not significantly impact analytic performance, suggestions appearing after potential problems were preferred, enhancing trust and efficiency. Participants found the automatic suggestions useful, but they collectively identified more than twice as many problems, underscoring the irreplaceable role of human expertise. Our findings also offer insights into future human-AI collaborative tools for UX evaluation.

Topics

AI in Service Interactions · Social Robot Interaction and HRI · Artificial Intelligence in Healthcare and Education