This is an overview page with metadata for this scientific work. The full article is available from the publisher.
Learning More with Less
56
Citations
8
Authors
2019
Year
Abstract
Accurate Computer-Assisted Diagnosis, combined with proper data wrangling, can reduce the risk of overlooked diagnoses in a clinical environment. Towards this, as a Data Augmentation (DA) technique, Generative Adversarial Networks (GANs) can synthesize additional training data to handle small/fragmented medical imaging datasets collected from various scanners; those images are realistic but completely different from the original ones, filling gaps in the real image distribution. However, we cannot easily use them to locate disease areas, considering expert physicians' expensive annotation cost. Therefore, this paper proposes Conditional Progressive Growing of GANs (CPGGANs), incorporating highly rough bounding box conditions incrementally into PGGANs to place brain metastases at desired positions/sizes on 256 × 256 Magnetic Resonance (MR) images, for Convolutional Neural Network-based tumor detection; this first GAN-based medical DA using automatic bounding box annotation improves training robustness. The results show that CPGGAN-based DA can boost sensitivity in diagnosis by 10% with clinically acceptable additional False Positives. Surprisingly, further tumor realism, achieved by adding normal brain MR images to CPGGAN training, does not improve detection performance, even though three physicians cannot accurately distinguish the synthetic images from real ones in a Visual Turing Test.
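The core conditioning idea described in the abstract (feeding the generator a bounding box that specifies where the synthetic tumor should appear) can be sketched as follows. This is a minimal illustration, not the authors' actual CPGGAN architecture; the function names and the exact channel layout (noise plus a binary mask channel) are assumptions.

```python
import numpy as np

def bbox_condition(size, x, y, w, h):
    """Binary mask channel marking the desired tumor position/size (assumed encoding)."""
    mask = np.zeros((size, size), dtype=np.float32)
    mask[y:y + h, x:x + w] = 1.0
    return mask

def generator_input(size=256, bbox=(96, 112, 32, 24), seed=0):
    """Stack a noise channel with the bounding-box condition channel.

    A conditional generator would consume this tensor and be trained to
    synthesize a tumor inside the masked region.
    """
    rng = np.random.default_rng(seed)
    noise = rng.standard_normal((size, size)).astype(np.float32)
    cond = bbox_condition(size, *bbox)
    return np.stack([noise, cond])  # shape: (2, size, size)

inp = generator_input()
print(inp.shape)  # (2, 256, 256)
```

Because the condition is just an extra input channel, the same bounding boxes used for conditioning double as automatic detection annotations for the downstream CNN, which is the labeling-cost advantage the abstract highlights.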
Related Works
Deep Residual Learning for Image Recognition
2016 · 218,745 citations
U-Net: Convolutional Networks for Biomedical Image Segmentation
2015 · 87,249 citations
ImageNet classification with deep convolutional neural networks
2017 · 75,672 citations
Very Deep Convolutional Networks for Large-Scale Image Recognition
2014 · 75,502 citations
Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks
2016 · 53,338 citations