From Balustrades to Pierre Vinken: Looking for Syntax in Transformer Self-Attentions
The result's identifiers
Result code in IS VaVaI
<a href="https://www.isvavai.cz/riv?ss=detail&h=RIV%2F00216208%3A11320%2F19%3A10405580" target="_blank" >RIV/00216208:11320/19:10405580 - isvavai.cz</a>
Result on the web
<a href="https://www.aclweb.org/anthology/W19-4827" target="_blank" >https://www.aclweb.org/anthology/W19-4827</a>
DOI - Digital Object Identifier
—
Alternative languages
Result language
English
Original language name
From Balustrades to Pierre Vinken: Looking for Syntax in Transformer Self-Attentions
Original language description
We inspect the multi-head self-attention in Transformer NMT encoders for three source languages, looking for patterns that could have a syntactic interpretation. In many of the attention heads, we frequently find sequences of consecutive states attending to the same position, which resemble syntactic phrases. We propose a transparent deterministic method of quantifying the amount of syntactic information present in the self-attentions, based on automatically building and evaluating phrase-structure trees from the phrase-like sequences. We compare the resulting trees to existing constituency treebanks, both manually and by computing precision and recall.
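The extraction step described above can be sketched as follows. This is a minimal illustration, not the authors' released code: it assumes one self-attention head is given as an n×n matrix, takes the argmax target per position, groups maximal runs of consecutive positions sharing the same target into phrase-like spans, and scores predicted spans against gold constituent spans with precision and recall.

```python
import numpy as np

def phrase_spans(attn):
    """Group maximal runs of consecutive positions whose attention argmax
    is the same target position; runs of length >= 2 resemble phrases."""
    targets = attn.argmax(axis=1)          # most-attended position per token
    spans, start = [], 0
    for i in range(1, len(targets) + 1):
        if i == len(targets) or targets[i] != targets[start]:
            if i - start >= 2:             # keep multi-token runs only
                spans.append((start, i))   # half-open interval [start, i)
            start = i
    return spans

def precision_recall(pred, gold):
    """Span-level precision and recall against gold constituent spans."""
    pred, gold = set(pred), set(gold)
    tp = len(pred & gold)
    precision = tp / len(pred) if pred else 0.0
    recall = tp / len(gold) if gold else 0.0
    return precision, recall

# Toy 5-token example: positions 0-1 and 3-4 both attend mostly to position 2,
# position 2 attends to position 3, yielding two phrase-like spans.
attn = np.array([
    [0.1, 0.1, 0.8, 0.0, 0.0],
    [0.0, 0.2, 0.8, 0.0, 0.0],
    [0.0, 0.0, 0.1, 0.9, 0.0],
    [0.0, 0.0, 0.9, 0.05, 0.05],
    [0.1, 0.0, 0.8, 0.1, 0.0],
])
spans = phrase_spans(attn)                 # [(0, 2), (3, 5)]
```

In the paper the spans feed a deterministic phrase-structure tree builder before evaluation; here the precision/recall step is shown directly on flat spans for brevity.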
Czech name
—
Czech description
—
Classification
Type
D - Article in proceedings
CEP classification
—
OECD FORD branch
10201 - Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)
Result continuities
Project
<a href="/en/project/GA18-02196S" target="_blank" >GA18-02196S: Linguistic Structure Representation in Neural Networks</a><br>
Continuities
P - Research and development project financed from public funds (with a link to CEP)
Others
Publication year
2019
Confidentiality
S - Complete and true data on the project are not subject to protection under special legal regulations
Data specific for result type
Article name in the collection
The BlackboxNLP Workshop on Analyzing and Interpreting Neural Networks for NLP at ACL 2019
ISBN
978-1-950737-30-7
ISSN
—
e-ISSN
—
Number of pages
13
Pages from-to
263-275
Publisher name
Association for Computational Linguistics
Place of publication
Stroudsburg, PA, USA
Event location
Firenze, Italy
Event date
Aug 1, 2019
Type of event by nationality
WRD - Worldwide event
UT code for WoS article
—