Promoting the Knowledge of Source Syntax in Transformer NMT Is Not Needed

The result's identifiers

  • Result code in IS VaVaI

    RIV/00216208:11320/19:10405528 (https://www.isvavai.cz/riv?ss=detail&h=RIV%2F00216208%3A11320%2F19%3A10405528)

  • Result on the web

    https://verso.is.cuni.cz/pub/verso.fpl?fname=obd_publikace_handle&handle=PazElY0WnY

  • DOI - Digital Object Identifier

    10.13053/CyS-23-3-3265 (http://dx.doi.org/10.13053/CyS-23-3-3265)

Alternative languages

  • Result language

    English

  • Original language name

    Promoting the Knowledge of Source Syntax in Transformer NMT Is Not Needed

  • Original language description

    The utility of linguistic annotation in neural machine translation seemed to have been established in past papers. The experiments were, however, limited to recurrent sequence-to-sequence architectures and relatively small data settings. We focus on the state-of-the-art Transformer model and use comparably larger corpora. Specifically, we try to promote the knowledge of source-side syntax using multi-task learning either through simple data manipulation techniques or through a dedicated model component. In particular, we train one of the Transformer attention heads to produce the source-side dependency tree. Overall, our results cast some doubt on the utility of multi-task setups with linguistic information. The data manipulation techniques recommended in previous works prove ineffective in large data settings. The treatment of self-attention as dependencies seems much more promising: it helps in translation and reveals that the Transformer model can very easily grasp the syntactic structure. An important but curi

    (A minimal illustrative sketch of the attention-as-dependency objective mentioned here follows this list.)

  • Czech name

  • Czech description
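
The description above mentions training one of the Transformer attention heads to produce the source-side dependency tree. The snippet below is a minimal, illustrative PyTorch sketch of such an attention-as-dependency auxiliary objective, not the authors' implementation; the function and argument names (dependency_attention_loss, attn_weights, dep_parents, pad_mask) and the weighting constant lambda_syntax are assumptions made for the example.

    import torch

    def dependency_attention_loss(attn_weights: torch.Tensor,
                                  dep_parents: torch.Tensor,
                                  pad_mask: torch.Tensor) -> torch.Tensor:
        """Auxiliary loss pushing one self-attention head towards the dependency tree.

        attn_weights: (batch, src_len, src_len) softmax-normalised attention of the
                      chosen encoder head; row i is token i's distribution over
                      source positions.
        dep_parents:  (batch, src_len) long tensor holding the index of each token's
                      syntactic parent (the root may simply point to itself).
        pad_mask:     (batch, src_len) float tensor, 1.0 for real tokens, 0.0 for padding.
        """
        eps = 1e-9
        # Probability mass the head places on the gold parent of every token.
        gold_prob = attn_weights.gather(-1, dep_parents.unsqueeze(-1)).squeeze(-1)
        nll = -torch.log(gold_prob + eps)  # negative log-likelihood per token
        return (nll * pad_mask).sum() / pad_mask.sum().clamp(min=1.0)

    # Hypothetical joint objective during NMT training:
    #   total_loss = translation_loss + lambda_syntax * dependency_attention_loss(...)

Under this formulation the chosen head is simply encouraged to concentrate its probability mass on each token's syntactic parent, and the auxiliary term is added to the ordinary translation loss with a small weight.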

Classification

  • Type

    Jimp - Article in a specialist periodical which is included in the Web of Science database

  • CEP classification

  • OECD FORD branch

    10201 - Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)

Result continuities

  • Project

    GX19-26934X: Neural Representations in Multi-modal and Multi-lingual Modeling (/en/project/GX19-26934X)

  • Continuities

    P - Research and development project financed from public sources (with a link to CEP)

Others

  • Publication year

    2019

  • Confidentiality

    S - Complete and true data on the project are not subject to protection under special legal regulations

Data specific for result type

  • Name of the periodical

    Computación y Sistemas

  • ISSN

    1405-5546

  • e-ISSN

  • Volume of the periodical

    23

  • Issue of the periodical within the volume

    3

  • Country of publishing house

    MX - MEXICO

  • Number of pages

    12

  • Pages from-to

    923-934

  • UT code for WoS article

    000489136900029

  • EID of the result in the Scopus database

    2-s2.0-85076684081