This is an overview page with metadata for this scientific paper. The full article is available from the publisher.
Test-time augmentation for deep learning-based cell segmentation on microscopy images
212
Citations
5
Authors
2020
Year
Abstract
Recent advancements in deep learning have revolutionized the way microscopy images of cells are processed. Deep learning network architectures have a large number of parameters; thus, in order to reach high accuracy, they require a massive amount of annotated data. A common way of improving accuracy builds on artificially enlarging the training set with different augmentation techniques. A less common way relies on test-time augmentation (TTA), which yields transformed versions of the image for prediction, the results of which are then merged. In this paper we describe how we have incorporated the test-time augmentation prediction method into two major segmentation approaches utilized in the single-cell analysis of microscopy images: semantic segmentation based on the U-Net model, and instance segmentation based on the Mask R-CNN model. Our findings show that even if only simple test-time augmentations (such as rotation or flipping) and proper merging methods are applied, TTA can significantly improve prediction accuracy. We have utilized images of tissue and cell cultures from the Data Science Bowl (DSB) 2018 nuclei segmentation competition and other sources. Additionally, by boosting the highest-scoring method of the DSB with TTA, we could further improve prediction accuracy, and our method reached an all-time best score at the DSB.
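The TTA procedure described in the abstract (predict on rotated/flipped copies, undo each transform, and merge the results) can be sketched as follows. This is a minimal illustration, not the authors' implementation: `predict` stands in for any model that maps a 2D image to a per-pixel probability map, and averaging is assumed as the merging method.

```python
import numpy as np

def tta_predict(predict, image):
    """Test-time augmentation sketch: run `predict` on transformed
    copies of `image`, invert each transform on the output, and
    average the resulting probability maps."""
    # (forward, inverse) pairs: identity, three rotations, two flips
    transforms = [
        (lambda x: x,              lambda x: x),
        (lambda x: np.rot90(x, 1), lambda x: np.rot90(x, -1)),
        (lambda x: np.rot90(x, 2), lambda x: np.rot90(x, -2)),
        (lambda x: np.rot90(x, 3), lambda x: np.rot90(x, -3)),
        (np.flipud,                np.flipud),
        (np.fliplr,                np.fliplr),
    ]
    # predict on each augmented copy, map the output back to the
    # original orientation, then merge by pixel-wise averaging
    maps = [inv(predict(fwd(image))) for fwd, inv in transforms]
    return np.mean(maps, axis=0)
```

For a segmentation network, the averaged map would then be thresholded (or fed to the instance post-processing step) exactly as a single prediction would be.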
Related works
U-Net: Convolutional Networks for Biomedical Image Segmentation
2015 · 87,652 citations
Fiji: an open-source platform for biological-image analysis
2012 · 69,753 citations
NIH Image to ImageJ: 25 years of image analysis
2012 · 64,303 citations
phyloseq: An R Package for Reproducible Interactive Analysis and Graphics of Microbiome Census Data
2013 · 22,106 citations
Comprehensive Integration of Single-Cell Data
2019 · 16,568 citations