This is an overview page with metadata for this scientific paper. The full article is available from the publisher.
Real-Time Context-aware Detection of Unsafe Events in Robot-Assisted Surgery
Citations: 1
Authors: 2
Year: 2020
Abstract
Cyber-physical systems for robotic surgery have enabled minimally invasive procedures with increased precision and shorter hospitalization. However, with increasing complexity and connectivity of software and major involvement of human operators in the supervision of surgical robots, there remain significant challenges in ensuring patient safety. This paper presents a safety monitoring system that, given the knowledge of the surgical task being performed by the surgeon, can detect safety-critical events in real-time. Our approach integrates a surgical gesture classifier that infers the operational context from the time-series kinematics data of the robot with a library of erroneous gesture classifiers that, given a surgical gesture, can detect unsafe events. Our experiments using data from two surgical platforms show that the proposed system can detect unsafe events caused by accidental or malicious faults within an average reaction time window of 1,693 milliseconds and F1 score of 0.88, and human errors within an average reaction time window of 57 milliseconds and F1 score of 0.76.
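The two-stage structure described in the abstract (a gesture classifier inferring the operational context, feeding a library of per-gesture error detectors) can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the gesture names, thresholds, and classifier interfaces are all hypothetical stand-ins for the learned models the paper describes.

```python
# Hypothetical sketch of a context-aware safety monitor: a gesture classifier
# infers the operational context from a window of kinematics samples, then a
# gesture-specific error detector flags the window as safe or unsafe.
# All labels and thresholds below are illustrative, not from the paper.

from typing import Callable, Dict, List, Tuple

KinematicsWindow = List[float]  # one kinematics feature channel over time


def classify_gesture(window: KinematicsWindow) -> str:
    """Toy context inference: choose a gesture label from mean amplitude."""
    mean = sum(window) / len(window)
    return "needle_insertion" if mean > 0.5 else "needle_retraction"


# Library of per-gesture error detectors (one detector per gesture class).
ERROR_DETECTORS: Dict[str, Callable[[KinematicsWindow], bool]] = {
    "needle_insertion": lambda w: max(w) > 0.9,    # e.g. excessive velocity
    "needle_retraction": lambda w: min(w) < 0.05,  # e.g. loss of grip signal
}


def monitor(window: KinematicsWindow) -> Tuple[str, bool]:
    """Return (inferred gesture, unsafe-event flag) for one window."""
    gesture = classify_gesture(window)
    unsafe = ERROR_DETECTORS[gesture](window)
    return gesture, unsafe


print(monitor([0.6, 0.7, 0.95]))  # → ('needle_insertion', True)
```

In a real deployment this would run per window over streaming robot telemetry, which is what makes the reaction-time measurements reported in the abstract meaningful.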