This is an overview page with metadata for this scientific paper. The full article is available from the publisher.
GloVe: Global Vectors for Word Representation
Citations: 33,284
Authors: 3
Year: 2014
Abstract
Recent methods for learning vector space representations of words have succeeded in capturing fine-grained semantic and syntactic regularities using vector arithmetic, but the origin of these regularities has remained opaque. We analyze and make explicit the model properties needed for such regularities to emerge in word vectors. The result is a new global log-bilinear regression model that combines the advantages of the two major model families in the literature: global matrix factorization and local context window methods. Our model efficiently leverages statistical information by training only on the nonzero elements in a word-word co-occurrence matrix, rather than on the entire sparse matrix or on individual context windows in a large corpus. The model produces a vector space with meaningful substructure, as evidenced by its performance of 75% on a recent word analogy task. It also outperforms related models on similarity tasks and named entity recognition.
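The training scheme described in the abstract can be illustrated with a minimal sketch of the GloVe weighted least-squares objective: the loss is summed only over nonzero entries of the co-occurrence matrix, with a weighting function that down-weights rare pairs and caps very frequent ones. The toy matrix, dimensions, and variable names below are illustrative, not from the paper.

```python
import numpy as np

# Toy word-word co-occurrence counts (in practice this matrix is
# large and mostly zero; only nonzero entries enter the loss).
X = np.array([[0.0, 3.0, 1.0],
              [3.0, 0.0, 2.0],
              [1.0, 2.0, 0.0]])

rng = np.random.default_rng(0)
d = 4                                    # embedding dimension (illustrative)
W = rng.normal(scale=0.1, size=(3, d))   # word vectors
Wc = rng.normal(scale=0.1, size=(3, d))  # context word vectors
b = np.zeros(3)                          # word biases
bc = np.zeros(3)                         # context biases

def weight(x, x_max=100.0, alpha=0.75):
    # Weighting function f(X_ij) from the paper: scales up with
    # frequency below x_max, then saturates at 1.
    return (x / x_max) ** alpha if x < x_max else 1.0

def glove_loss(W, Wc, b, bc, X):
    # Weighted least-squares objective, summed over nonzero X_ij only:
    # f(X_ij) * (w_i . w~_j + b_i + b~_j - log X_ij)^2
    loss = 0.0
    for i, j in zip(*np.nonzero(X)):
        err = W[i] @ Wc[j] + b[i] + bc[j] - np.log(X[i, j])
        loss += weight(X[i, j]) * err ** 2
    return loss

print(glove_loss(W, Wc, b, bc, X))
```

In a full implementation this loss would be minimized with a stochastic optimizer such as AdaGrad over the nonzero entries; the sketch above only evaluates the objective.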
Related Papers
MizAR 60 for Mizar 50
2023 · 73,965 citations
AI-Assisted Pipeline for Dynamic Generation of Trustworthy Health Supplement Content at Scale
2018 · 45,361 citations
2019 · 31,363 citations
Latent Dirichlet Allocation
2003 · 26,930 citations
Learning Phrase Representations using RNN Encoder–Decoder for Statistical Machine Translation
2014 · 23,811 citations