Transferability of Syntax-Aware Graph Neural Networks in Zero-Shot Cross-Lingual Semantic Role Labeling
Result identifiers
Result code in IS VaVaI
<a href="https://www.isvavai.cz/riv?ss=detail&h=RIV%2F00216208%3A11320%2F25%3ACKW6C8UD" target="_blank" >RIV/00216208:11320/25:CKW6C8UD - isvavai.cz</a>
Result on the web
<a href="https://www.scopus.com/inward/record.uri?eid=2-s2.0-85217623093&partnerID=40&md5=5f7ed0d162a62107b223ccca9faac10a" target="_blank" >https://www.scopus.com/inward/record.uri?eid=2-s2.0-85217623093&partnerID=40&md5=5f7ed0d162a62107b223ccca9faac10a</a>
DOI - Digital Object Identifier
—
Alternative languages
Result language
English
Title in the original language
Transferability of Syntax-Aware Graph Neural Networks in Zero-Shot Cross-Lingual Semantic Role Labeling
Description in the original language
Recent models for cross-lingual semantic role labeling (SRL) rarely analyze the suitability of their network choice. We argue that network selection matters because it affects the transferability of cross-lingual models, i.e., how well a model can extract universal features from source languages to label target languages. We therefore comprehensively compare the transferability of different graph neural network (GNN)-based models enriched with universal dependency trees. The GNN-based models include transformer-based, graph convolutional network-based, and graph attention network (GAT)-based models. We focus on a zero-shot setting, training the models on English and evaluating them on 23 target languages provided by the Universal Proposition Bank. Our experiments consistently show that syntax from universal dependency trees is essential for cross-lingual SRL models to achieve better transferability. Dependency-aware self-attention networks with relative position representations (SAN-RPRs) transfer best across languages, especially at long-range dependency distances. We also show that dependency-aware two-attention relational GATs transfer better than SAN-RPRs in languages where most arguments lie within a dependency distance of 1-2. © 2024 Association for Computational Linguistics.
Title in English
Transferability of Syntax-Aware Graph Neural Networks in Zero-Shot Cross-Lingual Semantic Role Labeling
Description in English
Recent models for cross-lingual semantic role labeling (SRL) rarely analyze the suitability of their network choice. We argue that network selection matters because it affects the transferability of cross-lingual models, i.e., how well a model can extract universal features from source languages to label target languages. We therefore comprehensively compare the transferability of different graph neural network (GNN)-based models enriched with universal dependency trees. The GNN-based models include transformer-based, graph convolutional network-based, and graph attention network (GAT)-based models. We focus on a zero-shot setting, training the models on English and evaluating them on 23 target languages provided by the Universal Proposition Bank. Our experiments consistently show that syntax from universal dependency trees is essential for cross-lingual SRL models to achieve better transferability. Dependency-aware self-attention networks with relative position representations (SAN-RPRs) transfer best across languages, especially at long-range dependency distances. We also show that dependency-aware two-attention relational GATs transfer better than SAN-RPRs in languages where most arguments lie within a dependency distance of 1-2. © 2024 Association for Computational Linguistics.
Classification
Type
D - Article in conference proceedings
CEP field
—
OECD FORD field
10201 - Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspects to be 5.8)
Result linkages
Project
—
Linkages
—
Other
Year of application
2024
Data confidentiality code
S - Complete and accurate data on the project are not subject to protection under special legal regulations
Data specific to the result type
Proceedings title
EMNLP - Conference on Empirical Methods in Natural Language Processing, Findings of EMNLP
ISBN
979-8-89176-168-1
ISSN
—
e-ISSN
—
Number of pages
23
Pages from-to
20-42
Publisher name
Association for Computational Linguistics (ACL)
Place of publication
—
Event venue
Miami
Event date
1. 1. 2025
Event type by nationality
WRD - Worldwide event
UT WoS code of the article
—