This is an overview page with metadata for this scientific paper. The full article is available from the publisher.
Real-Time Context-Aware Detection of Unsafe Events in Robot-Assisted Surgery
Citations: 2
Authors: 2
Year: 2020
Abstract
Cyber-physical systems for robotic surgery have enabled minimally invasive procedures with increased precision and shorter hospitalization. However, with the increasing complexity and connectivity of software and the major involvement of human operators in the supervision of surgical robots, there remain significant challenges in ensuring patient safety. This paper presents a safety monitoring system that, given knowledge of the surgical task being performed by the surgeon, can detect safety-critical events in real time. Our approach integrates a surgical gesture classifier, which infers the operational context from the time-series kinematics data of the robot, with a library of erroneous-gesture classifiers that, given a surgical gesture, can detect unsafe events. Our experiments using data from two surgical platforms show that the proposed system can detect unsafe events caused by accidental or malicious faults within an average reaction-time window of 1,693 milliseconds with an F1 score of 0.88, and human errors within an average reaction-time window of 57 milliseconds with an F1 score of 0.76.
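The two-stage pipeline described in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: all class names, feature choices, gesture labels, and detection rules below are assumptions standing in for the paper's learned gesture classifier and library of erroneous-gesture classifiers.

```python
# Hypothetical sketch of a context-aware safety monitor: a gesture
# classifier infers the surgical context from kinematics, then a
# gesture-specific detector flags unsafe events. All names and rules
# here are illustrative assumptions, not the authors' API.
from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class KinematicsSample:
    """One time step of robot kinematics (illustrative features)."""
    gripper_angle: float
    tool_velocity: float


def classify_gesture(window: List[KinematicsSample]) -> str:
    """Stand-in gesture classifier; the paper uses a learned model."""
    mean_v = sum(s.tool_velocity for s in window) / len(window)
    return "needle_insertion" if mean_v < 0.5 else "needle_transfer"


# Library of per-gesture error detectors, keyed by gesture label.
ERROR_DETECTORS: Dict[str, Callable[[KinematicsSample], bool]] = {
    # Example rule: during insertion, a wide-open gripper is unsafe.
    "needle_insertion": lambda s: s.gripper_angle > 1.2,
    # Example rule: during transfer, excessive velocity is unsafe.
    "needle_transfer": lambda s: s.tool_velocity > 2.0,
}


def monitor(window: List[KinematicsSample]) -> bool:
    """Return True if the latest sample is flagged as an unsafe event."""
    gesture = classify_gesture(window)
    detect = ERROR_DETECTORS[gesture]
    return detect(window[-1])
```

The design point is that the same kinematics signal is judged differently depending on the inferred gesture, which is what makes the detection context-aware.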
Related Works
The SCARE 2020 Guideline: Updating Consensus Surgical CAse REport (SCARE) Guidelines
2020 · 5,572 citations
Virtual Reality Training Improves Operating Room Performance
2002 · 2,787 citations
An estimation of the global volume of surgery: a modelling strategy based on available data
2008 · 2,506 citations
Objective structured assessment of technical skill (OSATS) for surgical residents
1997 · 2,258 citations
Does Simulation-Based Medical Education With Deliberate Practice Yield Better Results Than Traditional Clinical Education? A Meta-Analytic Comparative Review of the Evidence
2011 · 1,705 citations