This is an overview page with metadata for this scientific work. The full article is available from the publisher.
Reproducibility and explainability in digital pathology: The need to make black-box artificial intelligence systems more transparent
Citations: 1
Authors: 3
Year: 2024
Abstract
Artificial intelligence (AI), and more specifically machine learning (ML) and deep learning (DL), has permeated the digital pathology field in recent years, with many algorithms successfully applied as advanced tools for analyzing pathological tissue. The introduction of high-resolution scanners in histopathology services has been a true revolution for pathologists, allowing digital whole-slide images (WSI) to be analyzed on a screen without a microscope at hand. However, it also means a transition from microscope to algorithms without specific training for most pathologists in clinical practice. The WSI approach represents a major transformation from a computational point of view as well. The many ML and DL tools developed specifically for WSI analysis may enhance the diagnostic process in many fields of human pathology. AI-driven models achieve more consistent results, providing valid support for detecting, from H&E-stained sections, multiple biomarkers, including microsatellite instability, that expert pathologists miss.
Similar works
A survey on deep learning in medical image analysis
2017 · 13,563 citations
Dermatologist-level classification of skin cancer with deep neural networks
2017 · 13,184 citations
A survey on Image Data Augmentation for Deep Learning
2019 · 11,792 citations
QuPath: Open source software for digital pathology image analysis
2017 · 8,171 citations
Radiomics: Images Are More than Pictures, They Are Data
2015 · 8,010 citations