Sequence Length is a Domain: Length-based Overfitting in Transformer Models

Result identifiers

  • Result code in IS VaVaI

    RIV/00216208:11320/21:10440583 (https://www.isvavai.cz/riv?ss=detail&h=RIV%2F00216208%3A11320%2F21%3A10440583)

  • Result on the web

    https://aclanthology.org/2021.emnlp-main.650.pdf

  • DOI - Digital Object Identifier

Alternative languages

  • Result language

    English

  • Original language name

    Sequence Length is a Domain: Length-based Overfitting in Transformer Models

  • Original language description

    Transformer-based sequence-to-sequence architectures, while achieving state-of-the-art results on a large number of NLP tasks, can still suffer from overfitting during training. In practice, this is usually countered either by applying regularization methods (e.g. dropout, L2-regularization) or by providing huge amounts of training data. Additionally, Transformer and other architectures are known to struggle when generating very long sequences. For example, in machine translation, the neural-based systems perform worse on very long sequences when compared to the preceding phrase-based translation approaches (Koehn and Knowles, 2017). We present results which suggest that the issue might also be in the mismatch between the length distributions of the training and validation data combined with the aforementioned tendency of the neural networks to overfit to the training data. We demonstrate on a simple string editing task and a machine translation task that the Transformer model performance drops significantly when facing sequences of length diverging from the length distribution in the training data.

  • Czech name

  • Czech description

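The description above attributes the degradation to a mismatch between the length distributions of the training and validation data. As a minimal illustration of that diagnosis (a sketch, not code from the paper), the following plain-Python snippet bins sentences by token length and compares the two histograms; the file names train.src and valid.src are hypothetical placeholders for one-sentence-per-line corpora.

    from collections import Counter

    def length_histogram(path, bucket_size=10):
        """Count sentences per token-length bucket (0-9, 10-19, ...)."""
        counts = Counter()
        with open(path, encoding="utf-8") as f:
            for line in f:
                n_tokens = len(line.split())
                counts[(n_tokens // bucket_size) * bucket_size] += 1
        return counts

    def compare(train_path, valid_path, bucket_size=10):
        """Print the share of each length bucket in both corpora side by side."""
        train = length_histogram(train_path, bucket_size)
        valid = length_histogram(valid_path, bucket_size)
        train_total = sum(train.values())
        valid_total = sum(valid.values())
        print(f"{'bucket':>7} {'train %':>9} {'valid %':>9}")
        for bucket in sorted(set(train) | set(valid)):
            t = 100 * train.get(bucket, 0) / train_total
            v = 100 * valid.get(bucket, 0) / valid_total
            flag = "  <- mismatch" if abs(t - v) > 5 else ""
            print(f"{bucket:>6}+ {t:>9.1f} {v:>9.1f}{flag}")

    # Hypothetical file names; one sentence per line is assumed.
    compare("train.src", "valid.src")

A bucket whose share differs sharply between the two files marks exactly the situation in which the paper predicts a performance drop at evaluation time.
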
Classification

  • Type

    D - Article in proceedings

  • CEP classification

  • OECD FORD branch

    10201 - Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)

Result continuities

  • Project

    GX19-26934X: Neural Representations in Multi-modal and Multi-lingual Modeling

  • Continuities

    P - Research and development project financed from public sources (with a link to CEP)

Others

  • Publication year

    2021

  • Confidentiality

    S - Complete and true data on the project are not subject to protection under special legal regulations

Data specific for result type

  • Article name in the collection

    Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP)

  • ISBN

    978-1-955917-09-4

  • ISSN

  • e-ISSN

  • Number of pages

    12

  • Pages from-to

    8246-8257

  • Publisher name

    Association for Computational Linguistics

  • Place of publication

    Stroudsburg, PA, USA

  • Event location

    Punta Cana, Dominican Republic

  • Event date

    Nov 7, 2021

  • Type of event by nationality

    WRD - Worldwide event

  • UT code for WoS article