OpenAlex · Updated hourly · Last updated: 19.03.2026, 15:36

This is an overview page with metadata for this scholarly work. The full article is available from the publisher.

MTPret: Improving X-Ray Image Analytics With Multitask Pretraining

2024 · 3 citations · IEEE Transactions on Artificial Intelligence
Open full text at the publisher

3 citations · 9 authors · published 2024

Abstract

While deep neural networks (DNNs) have been widely used in various X-ray image analytics tasks such as classification, segmentation, and detection, one frequently needs to collect and annotate a huge amount of training data to train a model for every single task. In this work, we propose a multi-task self-supervised pre-training strategy, MTPret, to improve the performance of DNNs in various X-ray analytics tasks. MTPret first trains the backbone to learn visual representations from multiple datasets of different tasks through contrastive learning; it then leverages multi-task continual learning to learn discriminative features from various downstream tasks. To evaluate the performance of MTPret, we collected eleven X-ray image datasets covering different body parts, such as the head, chest, lungs, and bones, for various tasks to pre-train backbones, and fine-tuned the networks on seven of the tasks. The evaluation results on the seven tasks showed that MTPret outperformed a large number of baseline methods, including other initialization strategies, pre-trained models, and task-specific algorithms from recent studies. In addition, we performed experiments on two external tasks whose datasets had not been used in pre-training. The excellent performance of MTPret further confirmed the generalizability and superiority of the proposed multi-task self-supervised pre-training.
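The abstract describes contrastive pre-training of a shared backbone across multiple datasets but gives no implementation details. As a purely illustrative sketch (not the authors' code), the contrastive objective commonly used in such self-supervised pre-training is the NT-Xent loss popularized by SimCLR, where two augmented views of the same image form a positive pair and all other images in the batch act as negatives. A minimal NumPy version, with all names and the temperature value chosen for illustration:

```python
import numpy as np

def ntxent_loss(z, temperature=0.5):
    """NT-Xent (normalized temperature-scaled cross-entropy) loss.

    z: array of shape (2N, d). Rows i and i+N are embeddings of two
    augmented views of the same image (a positive pair); every other
    row in the batch serves as a negative.
    """
    # L2-normalize so the dot product is cosine similarity.
    z = z / np.linalg.norm(z, axis=1, keepdims=True)
    n = z.shape[0] // 2
    sim = (z @ z.T) / temperature
    # Exclude each row's similarity with itself from the softmax.
    np.fill_diagonal(sim, -np.inf)
    # The positive for row i is row i+n (and vice versa).
    pos = np.concatenate([np.arange(n, 2 * n), np.arange(0, n)])
    # Cross-entropy of the positive against all other pairs.
    logsumexp = np.log(np.exp(sim).sum(axis=1))
    loss = logsumexp - sim[np.arange(2 * n), pos]
    return loss.mean()
```

In a multi-dataset setting like the one the abstract describes, one would compute this loss per dataset batch and update the shared backbone on each, before the continual-learning fine-tuning stage; those orchestration details are not specified in the abstract.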

Related Works

Authors

Institutions

Topics

COVID-19 diagnosis using AI · Radiomics and Machine Learning in Medical Imaging · Artificial Intelligence in Healthcare and Education