OpenAlex · Updated hourly · Last updated: April 1, 2026, 19:09

This is an overview page with metadata for this scientific work. The full article is available from the publisher.

Align procedural expertise with virtual information: the key to AR-navigated ultrasound-guided biopsy

2025 · 0 citations · Virtual Reality · Open Access

0 citations · 11 authors · Year: 2025

Abstract

Ultrasound-guided biopsy is widely used for tumor diagnosis, yet interpreting spatial information from ultrasound images remains challenging. Augmented reality (AR) enables in-situ overlay of imaging and guidance cues, potentially enhancing spatial awareness, improving image interpretation, and facilitating needle navigation. However, previous studies remain inconclusive about its impact on biopsy performance. This research investigates how various factors influence the performance of AR navigation in ultrasound-guided biopsy, aiming to identify crucial considerations for system design. A navigation system was first developed to deliver in-situ visualization of ultrasound images and the biopsy needle, along with various visual cues. Thirty-one participants, representing a range of ultrasound expertise, were recruited to perform 1860 biopsy simulations under varying conditions of visualization content and operational methods. Metrics including biopsy accuracy, task duration, success rate, and subjective feedback were recorded for analysis. In brief, experienced interventionists showed minimal benefit (4.39 mm vs. 3.14 mm, 10.64 s vs. 9.23 s) and reported increased cognitive load during familiar in-plane procedures when overloaded with virtual cues. In contrast, they demonstrated significant performance gains (8.01 mm vs. 2.73 mm, 13.32 s vs. 6.77 s) and an improved user experience in less familiar out-of-plane tasks, even outperforming their in-plane results. These findings highlight the need for user- and task-specific system design that accounts for procedural expertise.
