Label Attention Network for Structured Prediction
Result identifiers
Result code in IS VaVaI
<a href="https://www.isvavai.cz/riv?ss=detail&h=RIV%2F00216208%3A11320%2F22%3APGQ5CQQE" target="_blank" >RIV/00216208:11320/22:PGQ5CQQE - isvavai.cz</a>
Result on the web
<a href="https://doi.org/10.1109/TASLP.2022.3145311" target="_blank" >https://doi.org/10.1109/TASLP.2022.3145311</a>
DOI - Digital Object Identifier
<a href="http://dx.doi.org/10.1109/TASLP.2022.3145311" target="_blank" >10.1109/TASLP.2022.3145311</a>
Alternative languages
Result language
English
Title in the original language
Label Attention Network for Structured Prediction
Description in the original language
Sequence labeling assigns a label to each token in a sequence and is a fundamental problem in natural language processing (NLP). Many NLP tasks, including part-of-speech tagging and named entity recognition, can be solved in the form of a sequence labeling problem. Other tasks, such as constituency parsing and non-autoregressive machine translation, can also be transformed into sequence labeling tasks. Neural models have been shown to be powerful for sequence labeling by employing a multi-layer sequence encoding network. The conditional random field (CRF) has been proposed to enrich information over label sequences, yet it suffers from high computational complexity and over-reliance on the Markov assumption. To this end, we propose the label attention network (LAN), which hierarchically refines representations of marginal label distributions bottom-up, enabling higher layers to learn a more informed label sequence distribution based on information from lower layers. We demonstrate the effectiveness of LAN through extensive experiments on various NLP tasks, including POS tagging, NER, CCG supertagging, constituency parsing, and non-autoregressive machine translation. Empirical results show that LAN not only improves overall tagging accuracy with a similar number of parameters, but also significantly speeds up training and testing compared to CRF.
Title in English
Label Attention Network for Structured Prediction
Description in English
Sequence labeling assigns a label to each token in a sequence and is a fundamental problem in natural language processing (NLP). Many NLP tasks, including part-of-speech tagging and named entity recognition, can be solved in the form of a sequence labeling problem. Other tasks, such as constituency parsing and non-autoregressive machine translation, can also be transformed into sequence labeling tasks. Neural models have been shown to be powerful for sequence labeling by employing a multi-layer sequence encoding network. The conditional random field (CRF) has been proposed to enrich information over label sequences, yet it suffers from high computational complexity and over-reliance on the Markov assumption. To this end, we propose the label attention network (LAN), which hierarchically refines representations of marginal label distributions bottom-up, enabling higher layers to learn a more informed label sequence distribution based on information from lower layers. We demonstrate the effectiveness of LAN through extensive experiments on various NLP tasks, including POS tagging, NER, CCG supertagging, constituency parsing, and non-autoregressive machine translation. Empirical results show that LAN not only improves overall tagging accuracy with a similar number of parameters, but also significantly speeds up training and testing compared to CRF.
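The mechanism the abstract describes — each token attending over a set of label embeddings, with the attention weights serving as that token's marginal label distribution and the attended label summary feeding the next layer — can be sketched minimally in NumPy. This is an illustrative single-head, dot-product approximation; all names and shapes here are invented, and the published model uses multi-head attention over BiLSTM hidden states rather than this simplified form.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def label_attention_layer(H, E):
    """One hypothetical label-attention step.

    H: (seq_len, d) token representations from a sequence encoder.
    E: (num_labels, d) label embeddings.
    Each token attends over the label embeddings; the attention
    weights act as a per-token marginal label distribution, and the
    attended label vector is concatenated onto the token
    representation so a higher layer can refine it further.
    """
    scores = H @ E.T / np.sqrt(E.shape[1])   # (seq_len, num_labels)
    marginals = softmax(scores, axis=-1)     # rows sum to 1
    attended = marginals @ E                 # (seq_len, d) label summary
    return np.concatenate([H, attended], axis=-1), marginals

rng = np.random.default_rng(0)
H = rng.normal(size=(5, 8))   # 5 tokens, hidden size 8
E = rng.normal(size=(3, 8))   # 3 labels
out, marg = label_attention_layer(H, E)
print(out.shape, marg.shape)  # (5, 16) (5, 3)
```

Stacking several such layers (re-encoding `out` between them) gives the bottom-up refinement of label distributions that the abstract contrasts with CRF decoding: inference is a forward pass with per-token argmax, with no Viterbi-style dynamic program over label transitions.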
Classification
Type
J<sub>imp</sub> - Article in a journal indexed in the Web of Science database
CEP field
—
OECD FORD field
10201 - Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)
Result continuities
Project
—
Continuities
—
Others
Year of implementation
2022
Data confidentiality code
S - Complete and true data on the project are not subject to protection under special legal regulations
Data specific to the result type
Journal title
IEEE/ACM Transactions on Audio, Speech, and Language Processing [online]
ISSN
2329-9304
e-ISSN
2329-9304
Journal volume
30
Issue number within the volume
2022
Publisher's country
US - United States of America
Number of pages
14
Pages from-to
1235-1248
UT WoS code of the article
000777325700002
EID of the result in the Scopus database
2-s2.0-85124210855