OpenAlex · Updated hourly · Last updated: 2026-05-07, 09:45

This is an overview page with metadata for this scholarly work. The full article is available from the publisher.

Heterogeneous Multi-Task Learning With Expert Diversity

2022 · 24 citations · IEEE/ACM Transactions on Computational Biology and Bioinformatics · Open Access
Open full text at publisher

24

Citations

3

Authors

2022

Year

Abstract

Predicting multiple heterogeneous biological and medical targets is a challenge for traditional deep learning models. In contrast to single-task learning, in which a separate model is trained for each target, multi-task learning (MTL) optimizes a single model to predict multiple related targets simultaneously. To address this challenge, we propose the Multi-gate Mixture-of-Experts with Exclusivity (MMoEEx). Our work aims to tackle the heterogeneous MTL setting, in which the same model optimizes multiple tasks with different characteristics. Such a scenario can overwhelm current MTL approaches due to the challenges in balancing shared and task-specific representations and the need to optimize tasks with competing optimization paths. Our method makes two key contributions: first, we introduce an approach to induce more diversity among experts, thus creating representations more suitable for highly imbalanced and heterogeneous MTL learning; second, we adopt a two-step optimization approach (Finn et al., 2017; Lee et al., 2020) to balance the tasks at the gradient level. We validate our method on three MTL benchmark datasets: the UCI Census-Income dataset, the Medical Information Mart for Intensive Care (MIMIC-III), and PubChem BioAssay (PCBA).
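The core architecture named in the abstract, a multi-gate mixture-of-experts, can be sketched as follows. This is a minimal illustrative forward pass, not the authors' implementation: the layer sizes, the single-linear-layer experts, and the specific exclusivity mask (zeroing one expert for one task's gate, one way to induce expert diversity) are all assumptions for the sketch.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over the last axis."""
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

rng = np.random.default_rng(0)
n_experts, n_tasks, d_in, d_hidden = 4, 2, 8, 16

# Experts: one linear map each (real experts would be small networks).
W_experts = rng.normal(size=(n_experts, d_in, d_hidden))

# One gating network per task -- the "multi-gate" part of MMoE.
W_gates = rng.normal(size=(n_tasks, d_in, n_experts))

# Hypothetical exclusivity mask: expert 0 is hidden from task 1's gate,
# so it can specialize for task 0 (illustrating the diversity idea).
exclusivity = np.ones((n_tasks, n_experts))
exclusivity[1, 0] = 0.0

def mmoeex_forward(x):
    """Return one gated mixture of expert outputs per task."""
    expert_out = np.einsum('bi,eih->beh', x, W_experts)  # (batch, experts, hidden)
    outputs = []
    for t in range(n_tasks):
        logits = x @ W_gates[t]                          # (batch, experts)
        # Masked experts get -inf logits, i.e. exactly zero gate weight.
        logits = np.where(exclusivity[t] > 0, logits, -np.inf)
        gates = softmax(logits)
        outputs.append(np.einsum('be,beh->bh', gates, expert_out))
    return outputs

x = rng.normal(size=(5, d_in))
task_outputs = mmoeex_forward(x)  # list of (5, 16) arrays, one per task
```

Each task head would then consume its own mixture; the paper's second contribution, the two-step (MAML-style) optimization that balances task gradients, sits in the training loop rather than in this forward pass.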

Related works

Authors

Institutions

Topics

Domain Adaptation and Few-Shot Learning · Machine Learning in Healthcare · COVID-19 diagnosis using AI