OpenAlex · Updated hourly · Last updated: 05.04.2026, 17:36

This is an overview page with metadata about this scholarly work. The full article is available from the publisher.

A Novel Trust State-Chart Model for Requirements Engineering of Trustful AI-Empowered Software

2023 · 16 citations
Open full text at the publisher

Citations: 16
Authors: 2
Year: 2023

Abstract

Human-AI interactions are becoming increasingly common in various applications, and the level of trust in AI-enhanced software systems significantly influences user experience. The approach is based on the five human states of trust, overtrust, mistrust, distrust, and even AI-phobia, which are critical to the success of such interactions. Using this approach, the human-AI interface can be designed and developed to build trust, mitigate overtrust, address mistrust, and reduce distrust or AI-phobia, thereby improving the overall user experience. The paper discusses the challenges of requirements engineering for trustworthy human-AI interfaces and provides a detailed explanation of a novel Trust State-Chart Model. Moreover, it employs a use-case model based on the Trust-Flow Model, followed by a Hierarchical Task Analysis, to derive user experience requirements that improve trust. It also presents case studies and examples to demonstrate the effectiveness of the approach. By continuously monitoring user feedback and adapting the AI system's behaviour, the approach aims to enhance user experience and reduce the risk of AI-phobia. The proposed approach can be applied in various human-AI interaction scenarios, including healthcare, finance, and education, to improve the quality of user experience in terms of trust.
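The state-chart idea described in the abstract can be sketched as a small finite-state machine over the five trust states. Note that the paper's actual states' transition edges and event names are not given on this page; the events below (`repeated_success`, `unexplained_failure`, etc.) are hypothetical placeholders chosen only to illustrate how such a chart could be encoded.

```python
from enum import Enum

class TrustState(Enum):
    """The five human trust states named in the abstract."""
    TRUST = "trust"
    OVERTRUST = "overtrust"
    MISTRUST = "mistrust"
    DISTRUST = "distrust"
    AI_PHOBIA = "ai_phobia"

# Hypothetical transition table: these edges are illustrative only and do
# not reproduce the paper's Trust State-Chart Model.
TRANSITIONS = {
    (TrustState.TRUST, "repeated_success"): TrustState.OVERTRUST,
    (TrustState.TRUST, "unexplained_failure"): TrustState.MISTRUST,
    (TrustState.MISTRUST, "repeated_failure"): TrustState.DISTRUST,
    (TrustState.DISTRUST, "severe_harm"): TrustState.AI_PHOBIA,
    (TrustState.OVERTRUST, "calibration_feedback"): TrustState.TRUST,
    (TrustState.MISTRUST, "transparent_explanation"): TrustState.TRUST,
}

def next_state(state: TrustState, event: str) -> TrustState:
    """Return the next trust state; stay in place if no edge matches."""
    return TRANSITIONS.get((state, event), state)
```

A design like this would let the interface react per the abstract's goal: monitoring interaction events and steering the user back toward calibrated trust.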

Topics

Ethics and Social Impacts of AI · Human-Automation Interaction and Safety · Artificial Intelligence in Healthcare and Education