Improving Word meaning representations using Wikipedia categories
The result's identifiers
Result code in IS VaVaI
<a href="https://www.isvavai.cz/riv?ss=detail&h=RIV%2F49777513%3A23520%2F18%3A43955049" target="_blank" >RIV/49777513:23520/18:43955049 - isvavai.cz</a>
Result on the web
<a href="http://hdl.handle.net/11025/34807" target="_blank" >http://hdl.handle.net/11025/34807</a>
DOI - Digital Object Identifier
<a href="http://dx.doi.org/10.14311/NNW.2018.28.029" target="_blank" >10.14311/NNW.2018.28.029</a>
Alternative languages
Result language
English
Original language name
Improving Word meaning representations using Wikipedia categories
Original language description
In this paper, we extend the Skip-Gram and Continuous Bag-of-Words distributional word representation models with global context information. We use a corpus extracted from Wikipedia, where articles are organized in a hierarchy of categories. These categories provide useful topical information about each article. We present four new approaches to enriching word meaning representations with this information. We experiment with the English Wikipedia and evaluate our models on standard word similarity and word analogy datasets. The proposed models significantly outperform other word representation methods when trained on data of similar size, and they perform comparably to methods trained on much larger datasets. Our new approach shows that increasing the amount of unlabelled data does not necessarily improve word embeddings as much as introducing global or sub-word information, especially when training time is taken into consideration.
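The description does not spell out the four enrichment approaches, but the general idea of adding Wikipedia-category "global context" to Skip-Gram training can be illustrated with a minimal sketch. The snippet below is an assumption-laden illustration, not the authors' method: it simply appends an article's category labels as extra tokens to each of its sentences before training a standard gensim Skip-Gram model, so that words co-occur with topical markers.

```python
# Illustrative sketch only: one plausible way to inject Wikipedia-category
# information as global context into Skip-Gram training. The toy data and the
# enrichment strategy are hypothetical and not taken from the paper.
from gensim.models import Word2Vec

# Toy corpus: each article carries its sentences and its Wikipedia categories.
articles = [
    {
        "sentences": [["the", "cat", "sat", "on", "the", "mat"],
                      ["cats", "are", "small", "felines"]],
        "categories": ["CAT:Felines", "CAT:Domesticated_animals"],
    },
    {
        "sentences": [["python", "is", "a", "programming", "language"]],
        "categories": ["CAT:Programming_languages"],
    },
]

def enriched_sentences(articles):
    # Append the article's category tokens to every sentence, so each word
    # also co-occurs with topical "global context" markers during training.
    for art in articles:
        for sent in art["sentences"]:
            yield sent + art["categories"]

model = Word2Vec(
    sentences=list(enriched_sentences(articles)),
    vector_size=100,   # embedding dimensionality
    sg=1,              # 1 = Skip-Gram (0 would be CBOW)
    window=5,
    min_count=1,
    epochs=10,
)

print(model.wv.most_similar("cat", topn=3))
```

With this setup the category tokens themselves also receive embeddings, which is one simple way topical information can shape the word vectors; the paper's actual models may combine article and category context quite differently.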
Czech name
—
Czech description
—
Classification
Type
J<sub>SC</sub> - Article in a specialist periodical, which is included in the SCOPUS database
CEP classification
—
OECD FORD branch
10201 - Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)
Result continuities
Project
—
Continuities
S - Specific research at universities
Others
Publication year
2018
Confidentiality
S - Complete and true data on the project are not subject to protection under special legal regulations
Data specific for result type
Name of the periodical
Neural Network World
ISSN
1210-0552
e-ISSN
—
Volume of the periodical
28
Issue of the periodical within the volume
6
Country of publishing house
CZ - CZECH REPUBLIC
Number of pages
12
Pages from-to
523-534
UT code for WoS article
—
EID of the result in the Scopus database
2-s2.0-85061489302