What does Chinese BERT learn about syntactic knowledge?
The result's identifiers
Result code in IS VaVaI
<a href="https://www.isvavai.cz/riv?ss=detail&h=RIV%2F00216208%3A11320%2F23%3A35ITC5RD" target="_blank" >RIV/00216208:11320/23:35ITC5RD - isvavai.cz</a>
Result on the web
<a href="https://peerj.com/articles/cs-1478.pdf" target="_blank" >https://peerj.com/articles/cs-1478.pdf</a>
DOI - Digital Object Identifier
<a href="http://dx.doi.org/10.7717/peerj-cs.1478" target="_blank" >10.7717/peerj-cs.1478</a>
Alternative languages
Result language
English
Original language name
What does Chinese BERT learn about syntactic knowledge?
Original language description
"Pre-trained language models such as Bidirectional Encoder Representations fromnTransformers (BERT) have been applied to a wide range of natural language processingn(NLP) tasks and obtained significantly positive results. A growing body of research hasninvestigated the reason why BERT is so efficient and what language knowledge BERTnis able to learn. However, most of these works focused almost exclusively on English.nFew studies have explored the language information, particularly syntactic information,nthat BERT has learned in Chinese, which is written as sequences of characters. In thisnstudy, we adopted some probing methods for identifying syntactic knowledge stored innthe attention heads and hidden states of Chinese BERT."
Czech name
—
Czech description
—
Classification
Type
J<sub>ost</sub> - Miscellaneous article in a specialist periodical
CEP classification
—
OECD FORD branch
10201 - Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)
Result continuities
Project
—
Continuities
—
Others
Publication year
2023
Confidentiality
S - Complete and true data on the project are not subject to protection under special legal regulations
Data specific for result type
Name of the periodical
"PeerJ Computer Science"
ISSN
2376-5992
e-ISSN
—
Volume of the periodical
9
Issue of the periodical within the volume
2023
Country of publishing house
US - UNITED STATES
Number of pages
22
Pages from-to
1-22
UT code for WoS article
—
EID of the result in the Scopus database
—