This is an overview page with metadata for this scientific work. The full article is available from the publisher.
Bone feature quantization and systematized attention gate UNet-based deep learning framework for bone fracture classification
15 citations
5 authors
Year: 2024
Abstract
Bones collaborate with muscles and joints to sustain and maintain our freedom of mobility. Proper musculoskeletal activity of bone protects and strengthens brain, heart, and lung function. When a bone is subjected to a force greater than its structural capacity, it fractures. Bone fractures should be detected and correctly typed, and treated early, to avoid acute neurovascular complications. Manual detection of bone fractures may lead to severely delayed complications such as malunion, joint stiffness, contractures, myositis ossificans, and avascular necrosis. A proper classification system must be integrated with deep learning technology to classify bone fractures accurately. This motivates us to propose a Systematized Attention Gate UNet (SAG-UNet) that classifies the type of bone fracture with high accuracy. The main contribution of this research is two-fold. The first contribution focuses on dataset preprocessing through feature extraction using unsupervised learning, adapting the Growing Neural Gas (GNG) method. The second contribution refines the supervised Attention UNet model that classifies the ten types of bone fracture: the attention gate of the Attention UNet model is refined and applied to the upsampling decoding layer. The Kaggle Bone Break Classification dataset was processed with GNG extraction to retain only the essential features. The quantized significant-feature RGB X-ray images were split roughly 80:20 into 900 training and 230 testing images. The training images were fitted to existing CNN models such as DenseNet, VGG, AlexNet, MobileNet, EfficientNet, Inception, Xception, UNet, and Attention UNet to choose the best CNN model. Experimental results show that Attention UNet classifies bone fractures with an accuracy of 89% when testing bone-break images.
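The GNG preprocessing step can be illustrated with a compact sketch. The NumPy code below is a simplified Growing Neural Gas quantizer, not the paper's implementation: all parameter values are illustrative assumptions, and the pruning of isolated units (part of the full GNG algorithm) is omitted for brevity. It grows a small codebook of units that adapts to the input distribution, which is the mechanism the abstract relies on to retain only the essential image features.

```python
import numpy as np

def growing_neural_gas(data, max_units=10, n_iters=3000, eps_b=0.2, eps_n=0.006,
                       age_max=50, lam=100, alpha=0.5, decay=0.995, seed=0):
    """Simplified GNG vector quantizer (isolated-unit pruning omitted)."""
    rng = np.random.default_rng(seed)
    units = [data[rng.integers(len(data))].astype(float) for _ in range(2)]
    errors = [0.0, 0.0]
    edges = {}                                     # frozenset({i, j}) -> age
    for t in range(1, n_iters + 1):
        x = data[rng.integers(len(data))].astype(float)
        dists = [float(np.sum((x - u) ** 2)) for u in units]
        s1, s2 = np.argsort(dists)[:2]             # winner and runner-up
        s1, s2 = int(s1), int(s2)
        errors[s1] += dists[s1]
        units[s1] += eps_b * (x - units[s1])       # move winner toward sample
        for e in list(edges):
            if s1 in e:
                edges[e] += 1                      # age the winner's edges
                j = next(iter(e - {s1}))
                units[j] += eps_n * (x - units[j]) # neighbors follow slightly
        edges[frozenset((s1, s2))] = 0             # refresh winner/runner-up edge
        edges = {e: a for e, a in edges.items() if a <= age_max}
        if t % lam == 0 and len(units) < max_units:
            q = int(np.argmax(errors))             # unit with highest error
            nbrs = [next(iter(e - {q})) for e in edges if q in e]
            if nbrs:
                f = max(nbrs, key=lambda j: errors[j])
                units.append(0.5 * (units[q] + units[f]))  # insert between q, f
                errors[q] *= alpha
                errors[f] *= alpha
                errors.append(errors[q])
                r = len(units) - 1
                edges.pop(frozenset((q, f)), None)
                edges[frozenset((q, r))] = 0
                edges[frozenset((f, r))] = 0
        errors = [e * decay for e in errors]       # decay accumulated errors
    return np.array(units)
```

In this sketch each codebook unit plays the role of a quantized feature: pixels (or feature vectors) of an X-ray image would be mapped to their nearest unit, keeping only the dominant feature values.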
Attention UNet was therefore chosen, and the attention gate of the decoding upsampling layer that follows the encoding layers was refined. The attention gate of the proposed SAG-UNet forms the gating coefficient from the input feature map and the gate signal. The gating coefficient is processed with batch normalization, which centers the aligned features in the active region and leaves the focus on the unaligned weights of the feature maps. The ReLU activation function then introduces nonlinearity into the aligned features, allowing complex representations in the feature vector to be learned. Dropout is then applied to exclude error noise in the aligned weights of the feature map. Next, a 1 × 1 linear convolution transformation forms the vector-concatenation-based attention feature map. This vector is passed through a sigmoid activation to create the attention coefficient feature map, with weights approaching '1' for the aligned features. The attention coefficient feature map is grid-resampled using trilinear interpolation to form the spatial attention weight map, which is passed to the skip connection of the next decoding layer. Implementation results reveal that the proposed SAG-UNet deep learning model classifies bone fracture types with a high accuracy of 98.78%, compared to the existing deep learning models.
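The gate sequence described above (gating coefficient → batch normalization → ReLU → dropout → 1 × 1 convolution → sigmoid) can be sketched end to end. The NumPy snippet below is a minimal illustration under simplifying assumptions: the weight shapes are hypothetical, and the trilinear grid resampling of the attention map is omitted, so this is not the paper's exact implementation.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def batch_norm(x, eps=1e-5):
    # per-channel normalization over batch and spatial axes
    mean = x.mean(axis=(0, 2, 3), keepdims=True)
    var = x.var(axis=(0, 2, 3), keepdims=True)
    return (x - mean) / np.sqrt(var + eps)

def conv1x1(x, w):
    # 1x1 convolution as a per-pixel linear map: (N,C,H,W) x (O,C) -> (N,O,H,W)
    return np.einsum('nchw,oc->nohw', x, w)

def attention_gate(x, g, w_x, w_g, psi, drop_mask=None):
    """Sketch of the described gate: gating coefficient -> batch norm ->
    ReLU -> dropout -> 1x1 conv -> sigmoid attention coefficients.
    (Trilinear grid resampling of the gate signal is omitted here.)"""
    q = conv1x1(x, w_x) + conv1x1(g, w_g)    # gating coefficient
    q = batch_norm(q)                        # center the aligned features
    q = relu(q)                              # introduce nonlinearity
    if drop_mask is not None:
        q = q * drop_mask                    # dropout on the aligned weights
    alpha = sigmoid(conv1x1(q, psi))         # attention coefficients in (0, 1)
    return x * alpha                         # reweighted skip-connection features
```

Because the sigmoid output lies strictly in (0, 1), the gate can only attenuate features; weights near 1 pass aligned features through to the skip connection of the next decoding layer.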
Related works
A survey on deep learning in medical image analysis
2017 · 13,540 citations
nnU-Net: a self-configuring method for deep learning-based biomedical image segmentation
2020 · 7,670 citations
Calculation of average PSNR differences between RD-curves
2001 · 4,088 citations
Magnetic Resonance Classification of Lumbar Intervertebral Disc Degeneration
2001 · 3,888 citations
Vertebral fracture assessment using a semiquantitative technique
1993 · 3,605 citations