OpenAlex · Updated hourly · Last updated: 2026-03-31, 19:18

This is an overview page with metadata for this scholarly work. The full article is available from the publisher.

Rethinking Model Selection for Zero-Shot Biomedical STS: Architecture, Scale, and Similarity Metrics

2025 · 0 citations
Open full text at the publisher

Citations: 0 · Authors: 4 · Year: 2025

Abstract

Semantic Textual Similarity (STS) is a critical task in biomedical natural language processing, supporting applications such as clinical information retrieval, summarization, and decision support. Although pretrained transformer models are now widely used, the influence of architectural type, model scale, and pretraining domain on STS performance in biomedical contexts remains insufficiently understood. This paper presents a systematic zero-shot evaluation of nine transformer-based models on the BIOSSES benchmark. The models span encoder-only, decoder-only, and encoder-decoder architectures. We compute sentence similarity using both cosine and normalized Euclidean distances, and assess model performance using Pearson and Spearman correlations with expert-annotated gold scores. Our findings show that small, general-domain encoder-only models, such as MiniLM, consistently outperform larger biomedical-specific models like BioBERT. We also observe that the effectiveness of similarity functions varies by model, and that differences between Pearson and Spearman correlations indicate a disconnect between score calibration and ranking precision. These results challenge prevailing assumptions about the superiority of large-scale or domain-specialized models and highlight the importance of architecture-aware model selection. We conclude by outlining directions for future research, including adaptive similarity metrics and evaluation on more diverse biomedical datasets.
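The zero-shot pipeline the abstract describes (embed each sentence pair, score similarity with cosine or a normalized Euclidean measure, then correlate predicted scores with gold annotations via Pearson and Spearman) can be sketched as below. The toy embeddings, the gold scores, and the particular Euclidean-to-similarity normalization are illustrative assumptions, not values from the paper:

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity between two sentence-embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def euclidean_similarity(a, b):
    # Euclidean distance mapped into (0, 1]; one common normalization
    # (assumed here; the paper's exact variant is not specified on this page).
    dist = math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return 1.0 / (1.0 + dist)

def pearson(x, y):
    # Pearson correlation: linear agreement between scores (calibration).
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def spearman(x, y):
    # Spearman correlation: Pearson on ranks (ranking precision).
    # Simplified: no tie handling, which BIOSSES-style real scores rarely need.
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0.0] * len(v)
        for rank, i in enumerate(order):
            r[i] = float(rank)
        return r
    return pearson(ranks(x), ranks(y))

# Toy embeddings standing in for transformer outputs (hypothetical values).
emb = {"s1": [0.2, 0.8, 0.1], "s2": [0.25, 0.75, 0.15], "s3": [0.9, 0.1, 0.4]}
pairs = [("s1", "s2"), ("s1", "s3"), ("s2", "s3")]
gold = [3.8, 1.2, 1.5]  # hypothetical expert scores on BIOSSES's 0-4 scale

preds = [cosine_similarity(emb[a], emb[b]) for a, b in pairs]
p = pearson(preds, gold)
s = spearman(preds, gold)
```

A high Spearman with a lower Pearson on the same model would illustrate the calibration-vs-ranking gap the abstract notes: the model orders pairs correctly but its raw scores are not linearly aligned with the gold scale.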

Topics

Topic Modeling · Machine Learning in Healthcare · Artificial Intelligence in Healthcare and Education