ÚFAL at MultiLexNorm 2021: Improving Multilingual Lexical Normalization by Fine-tuning ByT5
Result identifiers
Result code in IS VaVaI
<a href="https://www.isvavai.cz/riv?ss=detail&h=RIV%2F00216208%3A11320%2F21%3A10440576" target="_blank" >RIV/00216208:11320/21:10440576 - isvavai.cz</a>
Result on the web
<a href="https://aclanthology.org/2021.wnut-1.54/" target="_blank" >https://aclanthology.org/2021.wnut-1.54/</a>
DOI - Digital Object Identifier
—
Alternative languages
Result language
English
Original language name
ÚFAL at MultiLexNorm 2021: Improving Multilingual Lexical Normalization by Fine-tuning ByT5
Original language description
We present the winning entry to the Multilingual Lexical Normalization (MultiLexNorm) shared task at W-NUT 2021 (van der Goot et al., 2021a), which evaluates lexical-normalization systems on 12 social media datasets in 11 languages. We base our solution on a pre-trained byte-level language model, ByT5 (Xue et al., 2021a), which we further pre-train on synthetic data and then fine-tune on authentic normalization data. Our system achieves the best performance by a wide margin in intrinsic evaluation, and also the best performance in extrinsic evaluation through dependency parsing. The source code is released at https://github.com/ufal/multilexnorm2021 and the fine-tuned models at https://huggingface.co/ufal.
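The fine-tuned models referenced in the description are published at https://huggingface.co/ufal. A minimal sketch of loading one of them with the Hugging Face transformers library is shown below; the model identifier is a placeholder (the exact per-language names are listed on the organisation page), and the single-string input format is an assumption for illustration, not the system's exact per-word, context-window input described in the paper and repository.

```python
# Minimal sketch: applying a fine-tuned ByT5 lexical-normalization model.
# The model identifier below is a placeholder, not a verified name; see
# https://huggingface.co/ufal for the released models.
from transformers import AutoTokenizer, T5ForConditionalGeneration

MODEL_ID = "ufal/<byt5-multilexnorm-model>"  # placeholder identifier

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = T5ForConditionalGeneration.from_pretrained(MODEL_ID)

# ByT5 works on raw UTF-8 bytes, so noisy social-media text is passed to the
# tokenizer as-is; the real system formats input per token with context.
noisy = "new pix comming tomoroe"
inputs = tokenizer(noisy, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```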
Czech name
—
Czech description
—
Classification
Type
D - Article in proceedings
CEP classification
—
OECD FORD branch
10201 - Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)
Result continuities
Project
<a href="/en/project/LM2018101" target="_blank" >LM2018101: Digital Research Infrastructure for the Language Technologies, Arts and Humanities</a><br>
Continuities
P - Research and development project financed from public sources (with a link to CEP)
Others
Publication year
2021
Confidentiality
S - Complete and true data on the project are not subject to protection under special legal regulations
Data specific for result type
Article name in the collection
Proceedings of the 7th Workshop on Noisy User-generated Text (W-NUT 2021)
ISBN
978-1-954085-90-9
ISSN
—
e-ISSN
—
Number of pages
10
Pages from-to
483-492
Publisher name
Association for Computational Linguistics
Place of publication
Stroudsburg, PA, USA
Event location
Online
Event date
Nov 11, 2021
Type of event by nationality
WRD - Worldwide event
UT code for WoS article
—