This is an overview page with metadata for this scholarly article. The full article is available from the publisher.
Non-user physician perspectives about an oncology clinical decision-support system: A qualitative study.
Citations: 5
Authors: 13
Year: 2020
Abstract
e14061 Background: Advances in artificial intelligence (AI) continue to expand capabilities within the healthcare domain, particularly in the discipline of oncology. Watson for Oncology (WfO) is an AI-enabled clinical decision support system that presents potential therapeutic options to cancer-treating physicians. The objectives of this study were to identify non-user physicians' expectations and the perceived challenges and benefits of WfO use in Brazil. Methods: The study took place at Instituto do Câncer do Ceará (ICC), a Brazilian oncology hospital that implemented WfO in December 2017; not all physicians there adopted the tool. Physicians who had not used WfO (n = 5) were recruited through purposive sampling with the assistance of local research personnel. Semi-structured interviews were conducted in Portuguese and later de-identified and transcribed into English. Two members of the research team with extensive experience in qualitative data analysis conducted a thematic analysis of the interview data based on grounded theory. Results: Non-user physicians had positive perceptions of WfO, along with several concerns and uncertainties. They expected WfO to be easy to learn, useful, and helpful. Physicians perceived that WfO would provide a more standardized approach to treatment than care without it. They also believed that WfO would play a supportive rather than a substitute role in care, especially for complex cases in which physicians had more in-depth knowledge of a patient and an established patient-provider relationship. Physicians did expect WfO use to negatively impact productivity, specifically through longer office times per patient because of the need to enter data and review recommendations. Physicians questioned whether the use of WfO would negatively impact their autonomy and role in providing care.
Finally, physicians questioned whether the treatments suggested by WfO would fit the social context of a lower-middle-income country such as Brazil, with limited technological and economic resources. Conclusions: The implementation of US-developed AI technologies such as WfO should be further explored in different social and economic contexts. Physician concerns about productivity and autonomy need to be assessed and addressed during AI implementation; one strategy is to leverage lessons learned from electronic health record (EHR) implementations. This study is a critical step in understanding potential user perspectives on adopting a new AI tool in different social contexts.
Related Works
Explainable Artificial Intelligence (XAI): Concepts, taxonomies, opportunities and challenges toward responsible AI
2019 · 8,231 citations
Stop explaining black box machine learning models for high stakes decisions and use interpretable models instead
2019 · 8,084 citations
High-performance medicine: the convergence of human and artificial intelligence
2018 · 7,444 citations
Proceedings of the 19th International Joint Conference on Artificial Intelligence
2005 · 5,776 citations
Peeking Inside the Black-Box: A Survey on Explainable Artificial Intelligence (XAI)
2018 · 5,423 citations