OpenAlex · Updated hourly · Last updated: 2026-05-10, 05:36

This is an overview page with metadata for this scholarly work. The full article is available from the publisher.

Response Quality in Human-Chatbot Collaborative Systems

2020 · 44 citations · Open Access
Open full text at the publisher

Citations: 44

Authors: 2

Year: 2020

Abstract

We report the results of a crowdsourcing user study evaluating the effectiveness of human-chatbot collaborative conversation systems, which aim to extend a human user's ability to answer another person's requests in a conversation using a chatbot. We examine the quality of responses from two collaborative systems and compare them with human-only and chatbot-only settings. Both systems allow users to formulate responses based on a chatbot's top-ranked results as suggestions, but they encourage the synthesis of human and AI outputs to different extents. Experimental results show that both systems significantly improved the informativeness of messages and reduced user effort compared with a human-only baseline, while sacrificing the fluency and humanlikeness of the responses. Compared with a chatbot-only baseline, the collaborative systems provided comparably informative but more fluent and human-like messages.

Related works

Authors

Institutions

Topics

Mobile Crowdsensing and Crowdsourcing · AI in Service Interactions · Artificial Intelligence in Healthcare and Education