OpenAlex · Updated hourly · Last updated: 14.03.2026, 11:48

This is an overview page with metadata for this scholarly work. The full article is available from the publisher.

PD46-03 IS CROWDSOURCING SURGICAL SKILL ASSESSMENT RELIABLE? AN ANALYSIS OF ROBOTIC PROSTATECTOMIES

2017 · 1 citation · The Journal of Urology
Open full text at the publisher

Citations: 1 · Authors: 6 · Year: 2017

Abstract

The Journal of Urology · Surgical Technology & Simulation: Training & Skills Assessment III · 1 Apr 2017
Thomas S. Lendvay, Khurshid R. Ghani, James O. Peabody, Susan Linsell, David C. Miller, and Bryan Comstock
https://doi.org/10.1016/j.juro.2017.02.2375

INTRODUCTION AND OBJECTIVES: Crowdsourcing has demonstrated the ability to provide accurate surgical skills assessments that correlate with expert surgeon reviewers. We studied whether crowdsourced skills assessment of robotic prostatectomy performances would yield reliable scoring across a range of days and times of day. We also sought to characterize the agreement among peer robotic surgeons reviewing the same urethrovesical anastomosis videos.

METHODS: We used five urethrovesical anastomosis videos previously assessed by faculty-level surgeons within the Michigan Urological Surgery Improvement Collaborative (MUSIC) using the Global Evaluative Assessment of Robotic Skills (GEARS; highest score 25). The bottom- and top-scoring videos and three videos evenly distributed across the range of peer rater scores were selected from a larger pool of videos. Each video was assessed through the C-SATS platform (C-SATS, Inc., Seattle, WA) by n=32 random crowdworkers at one of ten different days/times over one week. A C-SATS GEARS average score was generated for each video; reliability was assessed using the intraclass correlation coefficient (ICC). We then evaluated the five anastomosis videos with n=23 faculty experts as a comparative example of expert rating performance. Experts were not subjected to the same repeated trials of reviews: each expert reviewer saw each performance once and responded within 14 days of the review process. Ten different groups of crowdworkers reviewed each video over the course of the week.

RESULTS: A total of 342 unique crowdworkers provided 1,640 ratings, with a median completion time of 1 hour and 22 minutes for each of the ten review sessions. The C-SATS ICC was 0.92 (95% CI: 0.79 to 0.99), indicating a very high level of reliability of the C-SATS rating process (Figure 1a). Reliability was lower for the 23 faculty experts (ICC=0.68; 95% CI: 0.42 to 0.95), with the range of expert scores spanning more than half of the GEARS scale on each video (Figure 1b).

CONCLUSIONS: We demonstrated that crowdsourced assessment of technical skills is repeatable and reliable across multiple times of the week, providing evidence that such a method could be used to assess the skill of surgeons. Furthermore, expert video review could potentially be enhanced through repeated trials, with workshops to build consensus on scoring standardization.

© 2017 · Volume 197, Issue 4S, April 2017, Pages e890–e891
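The reliability statistic reported above is the intraclass correlation coefficient. As a minimal sketch of how such a coefficient is computed, the following uses a one-way random-effects ICC(1) on a small matrix of hypothetical GEARS-style scores; the abstract does not specify which ICC model the authors used, and the `scores` data here are invented for illustration only.

```python
import numpy as np

def icc_oneway(ratings):
    """One-way random-effects ICC(1).

    `ratings` is an n x k array: n rated targets (videos) in rows,
    k raters per target in columns. Illustrative sketch only; the
    study's exact ICC model is not stated in the abstract.
    """
    ratings = np.asarray(ratings, dtype=float)
    n, k = ratings.shape
    grand_mean = ratings.mean()
    row_means = ratings.mean(axis=1)
    # Between-target mean square (variance of video means, scaled by k)
    ms_between = k * ((row_means - grand_mean) ** 2).sum() / (n - 1)
    # Within-target mean square (rater disagreement within each video)
    ms_within = ((ratings - row_means[:, None]) ** 2).sum() / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# Hypothetical scores: 5 videos (rows) rated by 4 raters (columns)
scores = [
    [10, 11, 10, 12],
    [15, 14, 16, 15],
    [20, 21, 19, 20],
    [13, 12, 13, 14],
    [23, 24, 23, 22],
]
print(round(icc_oneway(scores), 2))
```

Because the raters here agree closely relative to the spread between videos, the ICC comes out near 1; larger within-video disagreement (as seen among the faculty experts) drives the coefficient down.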
