Extracting Syntactic Trees from Transformer Encoder Self-Attentions
The result's identifiers
Result code in IS VaVaI
RIV/00216208:11320/18:10390185 (https://www.isvavai.cz/riv?ss=detail&h=RIV%2F00216208%3A11320%2F18%3A10390185)
Result on the web
—
DOI - Digital Object Identifier
—
Alternative languages
Result language
English
Original language name
Extracting Syntactic Trees from Transformer Encoder Self-Attentions
Original language description
Extracting Syntactic Trees from Transformer Encoder Self-Attentions.
Czech name
—
Czech description
—
Classification
Type
D - Article in proceedings
CEP classification
—
OECD FORD branch
10201 - Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)
Result continuities
Project
GA18-02196S: Linguistic Structure Representation in Neural Networks
Continuities
P - Research and development project financed from public funds (with a link to CEP)
Others
Publication year
2018
Confidentiality
S - Complete and true project data are not subject to protection under special legal regulations
Data specific for result type
Article name in the collection
Proceedings of the First Workshop on Analyzing and Interpreting Neural Networks for NLP
ISBN
978-1-948087-71-1
ISSN
—
e-ISSN
not stated
Number of pages
3
Pages from-to
347-349
Publisher name
The Association for Computational Linguistics
Place of publication
Stroudsburg, PA, USA
Event location
Brussels, Belgium
Event date
Nov 1, 2018
Type of event by nationality
WRD - Worldwide event
UT code for WoS article
—