This is an overview page with metadata for this scientific article. The full article is available from the publisher.
Microvessel prediction in H&E Stained Pathology Images using fully convolutional neural networks
59 citations · 7 authors · Published 2018
Abstract
BACKGROUND: Pathological angiogenesis has been identified in many malignancies as a potential prognostic factor and target for therapy. In most cases, angiogenic analysis is based on the measurement of microvessel density (MVD) detected by immunostaining of CD31 or CD34. However, most retrievable public data consist of Hematoxylin and Eosin (H&E)-stained pathology images, for which it is difficult to obtain the corresponding immunohistochemistry images. The role of microvessels in H&E-stained images has not been widely studied due to their complexity and heterogeneity. Furthermore, identifying microvessels manually is a labor-intensive task for pathologists, with high inter- and intra-observer variation. Therefore, it is important to develop automated microvessel-detection algorithms for H&E-stained pathology images to enable clinical association analysis. RESULTS: In this paper, we propose a microvessel prediction method using fully convolutional neural networks. The feasibility of the proposed algorithm is demonstrated through experimental results on H&E-stained images. Furthermore, the identified microvessel features were significantly associated with patient clinical outcomes. CONCLUSIONS: This is the first study to develop an algorithm for automated microvessel detection in H&E-stained pathology images.
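The abstract's key architectural idea is that a *fully* convolutional network contains no dense layers, so it emits a per-pixel prediction map with the same spatial size as the input image. The following is a minimal NumPy sketch of that idea, not the authors' actual architecture; the layer shapes, parameter names, and two-layer depth are illustrative assumptions.

```python
import numpy as np

def conv2d(x, w, b):
    """2D convolution with 'same' zero padding.
    x: (H, W, C_in), w: (k, k, C_in, C_out), b: (C_out,)."""
    k = w.shape[0]
    pad = k // 2
    H, W, _ = x.shape
    C_out = w.shape[3]
    xp = np.pad(x, ((pad, pad), (pad, pad), (0, 0)))
    out = np.zeros((H, W, C_out))
    for i in range(H):
        for j in range(W):
            patch = xp[i:i + k, j:j + k, :]  # (k, k, C_in) receptive field
            out[i, j] = np.tensordot(patch, w, axes=([0, 1, 2], [0, 1, 2])) + b
    return out

def fcn_forward(img, params):
    """Tiny fully convolutional forward pass: every layer is a convolution,
    so the output probability map matches the input's spatial dimensions.
    `params` is a hypothetical dict of weights, e.g. from training."""
    h = np.maximum(conv2d(img, params['w1'], params['b1']), 0)  # ReLU feature map
    logits = conv2d(h, params['w2'], params['b2'])              # 1-channel score map
    return 1.0 / (1.0 + np.exp(-logits))                        # sigmoid: microvessel probability per pixel
```

Because no layer flattens the spatial grid, the same weights can be applied to H&E tiles of any size, which is what makes dense per-pixel microvessel maps practical.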
Related works
A survey on deep learning in medical image analysis
2017 · 13,879 citations
pROC: an open-source package for R and S+ to analyze and compare ROC curves
2011 · 13,750 citations
Dermatologist-level classification of skin cancer with deep neural networks
2017 · 13,439 citations
A survey on Image Data Augmentation for Deep Learning
2019 · 12,032 citations
QuPath: Open source software for digital pathology image analysis
2017 · 8,378 citations