This is an overview page with metadata for this scientific paper. The full article is available from the publisher.
Self-training improves Recurrent Neural Networks performance for Temporal Relation Extraction
2018 · 30 citations · 6 authors · Open Access
Abstract
Neural network models are oftentimes restricted by limited labeled instances and resort to advanced architectures and features for cutting-edge performance. We propose to build a recurrent neural network with multiple semantically heterogeneous embeddings within a self-training framework. Our framework makes use of labeled, unlabeled, and social media data, operates on basic features, and is scalable and generalizable. With this method, we establish the state-of-the-art result for both in- and cross-domain settings for a clinical temporal relation extraction task.
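The self-training framework described in the abstract can be sketched generically: train a model on the labeled data, pseudo-label the unlabeled data, keep only confident predictions, and retrain. A minimal sketch follows, assuming a toy nearest-centroid classifier in place of the paper's recurrent neural network with heterogeneous embeddings; the confidence measure and threshold are hypothetical stand-ins, not the authors' actual settings.

```python
def train(labeled):
    # Fit a toy model: one centroid per class from (feature_vector, label) pairs.
    sums, counts = {}, {}
    for x, y in labeled:
        sums[y] = [a + b for a, b in zip(sums.get(y, [0.0] * len(x)), x)]
        counts[y] = counts.get(y, 0) + 1
    return {y: [v / counts[y] for v in s] for y, s in sums.items()}

def predict(model, x):
    # Return (label, confidence); confidence is the relative margin between
    # the nearest and second-nearest class centroids (an illustrative choice).
    dists = sorted((sum((a - b) ** 2 for a, b in zip(c, x)) ** 0.5, y)
                   for y, c in model.items())
    best, second = dists[0], dists[1]
    conf = (second[0] - best[0]) / (second[0] + 1e-9)
    return best[1], conf

def self_train(labeled, unlabeled, rounds=3, threshold=0.5):
    # Generic self-training loop: iteratively absorb confident pseudo-labels.
    labeled = list(labeled)
    for _ in range(rounds):
        model = train(labeled)
        remaining = []
        for x in unlabeled:
            y, conf = predict(model, x)
            if conf >= threshold:
                labeled.append((x, y))  # promote confident pseudo-label
            else:
                remaining.append(x)     # keep for a later round
        unlabeled = remaining
    return train(labeled), labeled
```

The same loop structure applies regardless of the underlying classifier; in the paper's setting the base learner would be the recurrent network and the extra unlabeled pool would include the clinical and social media text.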
Topics: Topic Modeling · Machine Learning in Healthcare · Natural Language Processing Techniques