OpenAlex · Updated hourly · Last updated: 23.03.2026, 14:58

This is an overview page with metadata for this scholarly work. The full article is available from the publisher.

In-BoXBART: Get Instructions into Biomedical Multi-Task Learning

2022 · 18 citations · Findings of the Association for Computational Linguistics: NAACL 2022 · Open Access

18 citations · 6 authors · Year: 2022

Abstract

Single-task models have proven pivotal in solving specific tasks; however, they have limitations in real-world applications where multitasking is necessary and domain shifts occur. Recently, instructional prompts have shown significant improvement towards multi-task generalization; however, the effect of instructional prompts and Multi-Task Learning (MTL) has not been systematically studied in the biomedical domain. Motivated by this gap, this paper explores the impact of instructional prompts for biomedical MTL. We introduce the BoX, a collection of 32 instruction tasks for Biomedical NLP across (X) various categories. Using this meta-dataset, we propose a unified model termed In-BoXBART that can jointly learn all tasks of the BoX without any task-specific modules. To the best of our knowledge, this is the first attempt to propose a unified model in the biomedical domain and to use instructions to achieve generalization across several biomedical tasks. Experimental results indicate that the proposed model: 1) outperforms the single-task baseline by 3% and the multi-task (without instruction) baseline by 18% on average, and 2) shows a 23% improvement over the single-task baseline in few-shot learning (i.e., 32 instances per task) on average. Our analysis indicates that there is significant room for improvement across tasks in the BoX, implying scope for future research directions.
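The core idea described in the abstract, prepending a natural-language task instruction to each input so that one seq2seq model can handle many tasks without task-specific modules, can be sketched as follows. This is a minimal illustrative example; the field names, instruction wording, and formatting are assumptions for demonstration, not the paper's exact schema.

```python
# Sketch of instruction-based multi-task input formatting, in the spirit of
# In-BoXBART. All instructions and examples below are hypothetical.

def to_instructed_example(instruction: str, x: str, y: str) -> dict:
    """Prepend the task instruction to the input so a single model can
    tell tasks apart purely from the text it reads."""
    return {"source": f"{instruction} Input: {x}", "target": y}

# Two different biomedical tasks share one model; only the instruction differs.
ner = to_instructed_example(
    "Definition: list all disease mentions in the sentence.",
    "Patients with type 2 diabetes were enrolled.",
    "type 2 diabetes",
)
qa = to_instructed_example(
    "Definition: answer the question using the given abstract.",
    "Question: Is aspirin an NSAID? Abstract: ...",
    "yes",
)
```

Each formatted example would then be fed to a standard encoder-decoder model (e.g., BART) as a plain text-to-text pair, which is what lets the same weights serve every task in the collection.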
