OpenAlex · Updated hourly · Last updated: 06.04.2026, 02:46

This is an overview page with metadata for this scholarly work. The full article is available from the publisher.

Surgical Activity Recognition in Robot-Assisted Radical Prostatectomy using Deep Learning

2018 · 0 citations · arXiv (Cornell University) · Open Access

Citations: 0 · Authors: 4 · Year: 2018

Abstract

Adverse surgical outcomes are costly to patients and hospitals. Approaches to benchmark surgical care are often limited to gross measures across the entire procedure despite the performance of particular tasks being largely responsible for undesirable outcomes. In order to produce metrics from tasks as opposed to the whole procedure, methods to automatically recognize individual surgical tasks are needed. In this paper, we propose several approaches to recognize surgical activities in robot-assisted minimally invasive surgery using deep learning. We collected a clinical dataset of 100 robot-assisted radical prostatectomies (RARP) with 12 tasks each and propose `RP-Net', a modified version of the InceptionV3 model, for image-based surgical activity recognition. We achieve an average precision of 80.9% and average recall of 76.7% across all tasks using RP-Net, which outperforms all other RNN and CNN based models explored in this paper. Our results suggest that automatic surgical activity recognition during RARP is feasible and can be the foundation for advanced analytics.
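The abstract reports average precision and recall across the 12 tasks. A minimal sketch of how such per-task metrics could be macro-averaged follows; the task names and label data here are illustrative only, not from the paper's dataset:

```python
def macro_precision_recall(y_true, y_pred, labels):
    """Macro-averaged precision and recall over activity labels.

    For each label, precision = TP / (TP + FP) and recall = TP / (TP + FN);
    the per-label values are then averaged with equal weight per task.
    """
    precisions, recalls = [], []
    for label in labels:
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == label and p == label)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != label and p == label)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == label and p != label)
        precisions.append(tp / (tp + fp) if tp + fp else 0.0)
        recalls.append(tp / (tp + fn) if tp + fn else 0.0)
    n = len(labels)
    return sum(precisions) / n, sum(recalls) / n

# Toy example with three hypothetical surgical activities
y_true = ["dissect", "suture", "dissect", "retract", "suture"]
y_pred = ["dissect", "suture", "suture", "retract", "dissect"]
p, r = macro_precision_recall(y_true, y_pred, ["dissect", "suture", "retract"])
# here both macro precision and macro recall come out to 2/3
```

Macro averaging weights each task equally regardless of how many frames it covers, which matches the goal of benchmarking individual tasks rather than the whole procedure.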



Topics

Surgical Simulation and Training · Cardiac, Anesthesia and Surgical Outcomes · Artificial Intelligence in Healthcare and Education