This is an overview page with metadata for this scientific paper. The full article is available from the publisher.
Surgical Activity Recognition in Robot-Assisted Radical Prostatectomy using Deep Learning
0
Citations
4
Authors
2018
Year
Abstract
Adverse surgical outcomes are costly to patients and hospitals. Approaches to benchmark surgical care are often limited to gross measures across the entire procedure, despite the performance of particular tasks being largely responsible for undesirable outcomes. In order to produce metrics from tasks as opposed to the whole procedure, methods to automatically recognize individual surgical tasks are needed. In this paper, we propose several approaches to recognize surgical activities in robot-assisted minimally invasive surgery using deep learning. We collected a clinical dataset of 100 robot-assisted radical prostatectomies (RARP) with 12 tasks each and propose `RP-Net', a modified version of the InceptionV3 model, for image-based surgical activity recognition. We achieve an average precision of 80.9% and average recall of 76.7% across all tasks using RP-Net, which outperforms all other RNN- and CNN-based models explored in this paper. Our results suggest that automatic surgical activity recognition during RARP is feasible and can be the foundation for advanced analytics.
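The averaged metrics reported above are per-task precision and recall aggregated across the 12 surgical tasks. As a minimal sketch of how such macro-averaged figures are computed (the task labels and predictions below are hypothetical, not from the paper's dataset):

```python
from collections import Counter

def per_task_precision_recall(y_true, y_pred, tasks):
    """Compute per-task precision and recall from label lists."""
    tp, fp, fn = Counter(), Counter(), Counter()
    for t, p in zip(y_true, y_pred):
        if t == p:
            tp[t] += 1          # correct prediction for task t
        else:
            fp[p] += 1          # predicted p, but true task was t
            fn[t] += 1          # missed an instance of task t
    prec = {c: tp[c] / (tp[c] + fp[c]) if (tp[c] + fp[c]) else 0.0 for c in tasks}
    rec  = {c: tp[c] / (tp[c] + fn[c]) if (tp[c] + fn[c]) else 0.0 for c in tasks}
    return prec, rec

# Hypothetical 3-task example (the paper uses 12 RARP tasks)
tasks  = ["mobilization", "dissection", "anastomosis"]
y_true = ["mobilization"] * 4 + ["dissection"] * 4 + ["anastomosis"] * 2
y_pred = ["mobilization"] * 3 + ["dissection"] * 5 + ["anastomosis"] * 2

prec, rec = per_task_precision_recall(y_true, y_pred, tasks)
avg_prec = sum(prec.values()) / len(tasks)   # macro-averaged precision
avg_rec  = sum(rec.values())  / len(tasks)   # macro-averaged recall
```

Macro-averaging weights every task equally, so short tasks count as much as long ones; this matches the stated goal of benchmarking individual tasks rather than the whole procedure.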
Similar Works
The SCARE 2020 Guideline: Updating Consensus Surgical CAse REport (SCARE) Guidelines
2020 · 5,574 citations
Virtual Reality Training Improves Operating Room Performance
2002 · 2,797 citations
An estimation of the global volume of surgery: a modelling strategy based on available data
2008 · 2,510 citations
Objective structured assessment of technical skill (OSATS) for surgical residents
1997 · 2,260 citations
Does Simulation-Based Medical Education With Deliberate Practice Yield Better Results Than Traditional Clinical Education? A Meta-Analytic Comparative Review of the Evidence
2011 · 1,718 citations