Strategies for Training Large Scale Neural Network Language Models
Result identifiers
Result code in IS VaVaI
RIV/00216305:26230/11:PU96169 (https://www.isvavai.cz/riv?ss=detail&h=RIV%2F00216305%3A26230%2F11%3APU96169)
Result on the web
—
DOI - Digital Object Identifier
—
Alternative languages
Result language
English
Original language name
Strategies for Training Large Scale Neural Network Language Models
Original language description
Techniques for effective training of recurrent neural network based language models are described, and new state-of-the-art results on a standard speech recognition task are reported.
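The description refers to recurrent neural network based language models. As a purely illustrative aside that is not taken from the paper or this record, a minimal sketch of such a model's forward pass and perplexity computation might look as follows; the dimensions, parameter names, and toy word sequence are assumptions chosen only for illustration.

```python
import numpy as np

# Hypothetical toy sizes; the paper's actual model and vocabulary sizes are not given in this record.
vocab_size, hidden_size = 10, 8
rng = np.random.default_rng(0)

# Parameters of a simple (Elman-style) recurrent network language model.
W_xh = rng.normal(0, 0.1, (hidden_size, vocab_size))   # one-hot word -> hidden
W_hh = rng.normal(0, 0.1, (hidden_size, hidden_size))  # previous hidden -> hidden
W_hy = rng.normal(0, 0.1, (vocab_size, hidden_size))   # hidden -> output scores

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def step(word_id, h_prev):
    """One time step: consume the current word, return the next-word distribution and new state."""
    x = np.zeros(vocab_size)
    x[word_id] = 1.0
    h = np.tanh(W_xh @ x + W_hh @ h_prev)  # recurrent hidden state
    p = softmax(W_hy @ h)                  # distribution over the next word
    return p, h

# Perplexity of an assumed toy word-id sequence under the (untrained) model.
sequence = [1, 3, 2, 5, 4]
h = np.zeros(hidden_size)
log_prob = 0.0
for prev, nxt in zip(sequence[:-1], sequence[1:]):
    p, h = step(prev, h)
    log_prob += np.log(p[nxt])
print("perplexity:", np.exp(-log_prob / (len(sequence) - 1)))
```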
Czech name
—
Czech description
—
Classification
Type
D - Article in proceedings
CEP classification
JC - Computer hardware and software
OECD FORD branch
—
Result continuities
Project
The result was created during the realization of more than one project. More information is available in the Projects tab.
Continuities
P - Research and development project financed from public funds (with a link to CEP)
Others
Publication year
2011
Confidentiality
S - Complete and accurate data on the project are not subject to protection under special legal regulations
Data specific for result type
Article name in the collection
Proceedings of ASRU 2011
ISBN
978-1-4673-0366-8
ISSN
—
e-ISSN
—
Number of pages
6
Pages from-to
196-201
Publisher name
IEEE Signal Processing Society
Place of publication
Hilton Waikoloa Village, Big Island, Hawaii
Event location
Hilton Waikoloa Village Resort, Big Island, Hawaii
Event date
Dec 11, 2011
Type of event by nationality
WRD - Worldwide event
UT code for WoS article
—