When FastText Pays Attention: Efficient Estimation of Word Representations using Constrained Positional Weighting
Result identifiers
Result code in IS VaVaI
<a href="https://www.isvavai.cz/riv?ss=detail&h=RIV%2F00216224%3A14330%2F22%3A00124923" target="_blank" >RIV/00216224:14330/22:00124923 - isvavai.cz</a>
Result on the web
<a href="https://doi.org/10.3897/jucs.69619" target="_blank" >https://doi.org/10.3897/jucs.69619</a>
DOI - Digital Object Identifier
<a href="http://dx.doi.org/10.3897/jucs.69619" target="_blank" >10.3897/jucs.69619</a>
Alternative languages
Result language
English
Title in the original language
When FastText Pays Attention: Efficient Estimation of Word Representations using Constrained Positional Weighting
Description in the original language
In 2018, Mikolov et al. introduced the positional language model, which has characteristics of attention-based neural machine translation models and which achieved state-of-the-art performance on the intrinsic word analogy task. However, the positional model is impractically slow, and it has never been evaluated on qualitative criteria or extrinsic tasks. We propose a constrained positional model, which adapts the sparse attention mechanism from neural machine translation to improve the speed of the positional model. We evaluate the positional and constrained positional models on three novel qualitative criteria and on language modeling. We show that the positional and constrained positional models contain interpretable information about the grammatical properties of words and outperform other shallow models on language modeling. We also show that our constrained model outperforms the positional model on language modeling and trains twice as fast.
Title in English
When FastText Pays Attention: Efficient Estimation of Word Representations using Constrained Positional Weighting
Description in English
In 2018, Mikolov et al. introduced the positional language model, which has characteristics of attention-based neural machine translation models and which achieved state-of-the-art performance on the intrinsic word analogy task. However, the positional model is impractically slow, and it has never been evaluated on qualitative criteria or extrinsic tasks. We propose a constrained positional model, which adapts the sparse attention mechanism from neural machine translation to improve the speed of the positional model. We evaluate the positional and constrained positional models on three novel qualitative criteria and on language modeling. We show that the positional and constrained positional models contain interpretable information about the grammatical properties of words and outperform other shallow models on language modeling. We also show that our constrained model outperforms the positional model on language modeling and trains twice as fast.
Classification
Type
J<sub>imp</sub> - Article in a periodical indexed in the Web of Science database
CEP classification
—
OECD FORD field
10201 - Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)
Result continuities
Project
—
Continuities
S - Specific research at universities
Others
Year of implementation
2022
Data confidentiality code
S - Complete and true data on the project are not subject to protection under special legal regulations
Data specific to the result type
Name of the periodical
Journal of Universal Computer Science
ISSN
0948-695X
e-ISSN
0948-6968
Volume of the periodical
28
Issue of the periodical within the volume
2
Country of the publisher
CZ - Czech Republic
Number of pages of the result
21
Pages from-to
181-201
UT WoS code of the article
000767374300005
EID of the result in the Scopus database
2-s2.0-85127775769