Recent Advances in Natural Language Processing via Large Pre-trained Language Models: A Survey
The result's identifiers
Result code in IS VaVaI
RIV/00216208:11320/23:3DU6QJIM (https://www.isvavai.cz/riv?ss=detail&h=RIV%2F00216208%3A11320%2F23%3A3DU6QJIM)
Result on the web
https://dl.acm.org/doi/10.1145/3605943
DOI - Digital Object Identifier
10.1145/3605943 (http://dx.doi.org/10.1145/3605943)
Alternative languages
Result language
English
Original language name
Recent Advances in Natural Language Processing via Large Pre-trained Language Models: A Survey
Original language description
"Large, pre-trained language models (PLMs) such as BERT and GPT have drastically changed the Natural Language Processing (NLP) field. For numerous NLP tasks, approaches leveraging PLMs have achieved state-of-the-art performance. The key idea is to learn a generic, latent representation of language from a generic task once, then share it across disparate NLP tasks. Language modeling serves as the generic task, one with abundant self-supervised text available for extensive training. This article presents the key fundamental concepts of PLM architectures and a comprehensive view of the shift to PLM-driven NLP techniques. It surveys work applying the pre-training then fine-tuning, prompting, and text generation approaches. In addition, it discusses PLM limitations and suggested directions for future research."
Czech name
—
Czech description
—
Classification
Type
Jost - Miscellaneous article in a specialist periodical
CEP classification
—
OECD FORD branch
10201 - Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)
Result continuities
Project
—
Continuities
—
Others
Publication year
2023
Confidentiality
S - Complete and true data on the project are not subject to protection under special legal regulations
Data specific for result type
Name of the periodical
"ACM Computing Surveys"
ISSN
0360-0300
e-ISSN
—
Volume of the periodical
56
Issue of the periodical within the volume
2
Country of publishing house
US - UNITED STATES
Number of pages
40
Pages from-to
1-40
UT code for WoS article
—
EID of the result in the Scopus database
—