Domain-Weighted Batch Sampling for Neural Dependency Parsing
The result's identifiers
Result code in IS VaVaI
RIV/00216208:11320/25:3RG8BT5L - isvavai.cz (https://www.isvavai.cz/riv?ss=detail&h=RIV%2F00216208%3A11320%2F25%3A3RG8BT5L)
Result on the web
https://aclanthology.org/2024.mwe-1.24
DOI - Digital Object Identifier
—
Alternative languages
Result language
English
Original language name
Domain-Weighted Batch Sampling for Neural Dependency Parsing
Original language description
In neural dependency parsing, as well as in the broader field of NLP, domain adaptation remains a challenging problem. When adapting a parser to a target domain, there is a fundamental tension between the need to make use of out-of-domain data and the need to ensure that the syntactic characteristics of the target domain are learned. In this work we explore a way to balance these two competing concerns, namely domain-weighted batch sampling, which allows us to use all available training data while controlling the probability of sampling in- and out-of-domain data when constructing training batches. We conduct experiments using ten natural language domains and find that domain-weighted batch sampling yields substantial performance improvements in all ten domains compared to a baseline of conventional randomized batch sampling.
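As a rough illustration of the idea described above, a minimal Python sketch of domain-weighted batch sampling might look like the following. This is an assumption-based sketch, not the paper's implementation: the function name `sample_batch`, the `in_domain_prob` parameter, and the example weight of 0.7 are all illustrative.

```python
import random

def sample_batch(in_domain, out_of_domain, batch_size, in_domain_prob):
    """Draw one training batch, picking each example's source domain at random:
    in-domain with probability `in_domain_prob`, out-of-domain otherwise.
    Illustrative sketch only; names and interface are not from the paper."""
    batch = []
    for _ in range(batch_size):
        pool = in_domain if random.random() < in_domain_prob else out_of_domain
        batch.append(random.choice(pool))
    return batch

# Hypothetical usage: bias batches toward the target domain while still
# drawing on all available out-of-domain training data.
in_domain = [f"target_sent_{i}" for i in range(100)]
out_of_domain = [f"other_sent_{i}" for i in range(10_000)]
batch = sample_batch(in_domain, out_of_domain, batch_size=32, in_domain_prob=0.7)
```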
Czech name
—
Czech description
—
Classification
Type
D - Article in proceedings
CEP classification
—
OECD FORD branch
10201 - Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)
Result continuities
Project
—
Continuities
—
Others
Publication year
2024
Confidentiality
S - Complete and true data about the project are not subject to protection under special legal regulations
Data specific for result type
Article name in the collection
Proceedings of the Joint Workshop on Multiword Expressions and Universal Dependencies (MWE-UD) @ LREC-COLING 2024
ISBN
978-2-493-81420-3
ISSN
—
e-ISSN
—
Number of pages
9
Pages from-to
198-206
Publisher name
ELRA and ICCL
Place of publication
—
Event location
Torino, Italy
Event date
Jan 1, 2025
Type of event by nationality
WRD - Worldwide event
UT code for WoS article
—