OpenAlex · Updated hourly · Last updated: 14.03.2026, 06:18

This is an overview page with metadata for this scholarly work. The full article is available from the publisher.

Liability Risks of Ambient Clinical Workflows With Artificial Intelligence for Clinicians, Hospitals, and Manufacturers

2025 · 4 citations · JCO Oncology Practice · Open Access
Open full text at publisher

Citations: 4 · Authors: 3 · Year: 2025

Abstract

In August 2024, the nation's largest nonprofit integrated health care provider, Kaiser Permanente, announced that clinicians would have access to an ambient clinical documentation scribe: an assisted clinical documentation tool that uses artificial intelligence (AI) to securely summarize relevant medical information from spoken, natural conversations (also called ambient clinical documentation or AI scribes). After automatically summarizing the encounter, the AI scribe sends the summary to the clinician for review. Ambient clinical documentation scribes are now offered by some of the fastest-growing AI companies in health care, with significant venture capital funding and an impressive roster of health system customers.

Technologies such as ambient clinical documentation and other generative AI tools may improve care and lessen clinician burnout by reducing documentation burdens. But they also raise the question of who is responsible when AI-generated patient information is inaccurate, especially when those errors cause injury to a patient. This question is particularly acute in cancer care, where there is a unique set of terminology for each of the more than 400 types of cancer, leading to an increased chance of documentation error, and where decisions on the basis of the assumption of information accuracy can be life-altering.

AI transcription tools in their current versions are not considered regulated medical devices under the US Federal Food, Drug, and Cosmetic Act. Unless this changes, the responsibility falls to stakeholders other than the US Food and Drug Administration (FDA) to ensure the technology's safety and efficacy. In this article, we analyze the AI governance responsibilities and potential tort liability for clinicians, hospitals, and manufacturers using AI for clinical note-taking and suggest several potential ways to address them.

Topics

Artificial Intelligence in Healthcare and Education · Electronic Health Records Systems · Ethics in Clinical Research