Learning from Data by Neural Networks of Limited Complexity.
The result's identifiers
Result code in IS VaVaI
<a href="https://www.isvavai.cz/riv?ss=detail&h=RIV%2F67985807%3A_____%2F03%3A06030188" target="_blank" >RIV/67985807:_____/03:06030188 - isvavai.cz</a>
Result on the web
—
DOI - Digital Object Identifier
—
Alternative languages
Result language
English
Original language name
Learning from Data by Neural Networks of Limited Complexity.
Original language description
Learning from data, formalized as minimization of a regularized empirical error, is studied in terms of approximate minimization over sets of functions computable by networks with an increasing number of hidden units. Upper bounds are derived on the speed of convergence of the infima achievable over networks with n hidden units to the global infimum. The bounds are expressed in terms of norms tailored to the type of network units and moduli of continuity of regularized empirical error functionals.
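The setting in the description can be illustrated with a minimal sketch: minimize an empirical squared error plus a weight-decay regularization term over one-hidden-layer networks, growing the number n of hidden units and warm-starting each stage from the previous approximate minimizer. The data, regularization weight, and optimization scheme below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Hedged sketch (assumed data and hyperparameters): approximate minimization
# of a regularized empirical error over networks f(x) = sum_j a_j tanh(w_j x + b_j)
# with a growing number n of hidden units.

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, 40)      # sample inputs
y = np.sin(3.0 * X)                 # target values
LAM = 1e-4                          # assumed regularization weight

def loss_and_grads(a, w, b):
    """Regularized empirical error and its gradients."""
    H = np.tanh(np.outer(X, w) + b)         # (m, n) hidden activations
    r = H @ a - y                           # residuals
    m = len(X)
    loss = r @ r / m + LAM * (a @ a + w @ w + b @ b)
    d = (1.0 - H**2) * a                    # chain-rule term, (m, n)
    ga = 2.0 / m * (H.T @ r) + 2.0 * LAM * a
    gw = 2.0 / m * ((d * X[:, None]).T @ r) + 2.0 * LAM * w
    gb = 2.0 / m * (d.T @ r) + 2.0 * LAM * b
    return loss, ga, gw, gb

def train(a, w, b, steps=4000, lr=0.05):
    """Plain gradient descent on the regularized empirical error."""
    for _ in range(steps):
        _, ga, gw, gb = loss_and_grads(a, w, b)
        a, w, b = a - lr * ga, w - lr * gw, b - lr * gb
    return a, w, b

# Grow the network one unit at a time, warm-starting from the previous
# approximate minimizer; the new unit starts with zero output weight.
losses = []
a, w, b = np.zeros(0), np.zeros(0), np.zeros(0)
for n in range(1, 7):
    a = np.append(a, 0.0)
    w = np.append(w, 0.1 * rng.standard_normal())
    b = np.append(b, 0.1 * rng.standard_normal())
    a, w, b = train(a, w, b)
    losses.append(loss_and_grads(a, w, b)[0])
```

The list `losses` approximates the infima achievable over networks with n = 1, …, 6 hidden units; the bounds in the paper quantify how fast such a sequence of infima can approach the global infimum.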
Czech name
—
Czech description
—
Classification
Type
D - Article in proceedings
CEP classification
BA - General mathematics
OECD FORD branch
—
Result continuities
Project
<a href="/en/project/GA201%2F02%2F0428" target="_blank" >GA201/02/0428: Nonlinear approximation with variable basis and neural networks</a><br>
Continuities
P - Research and development project financed from public funds (with a link to CEP)<br>Z - Research plan (with a link to CEZ)
Others
Publication year
2003
Confidentiality
S - Complete and true data on the project are not subject to protection under special legal regulations
Data specific for result type
Article name in the collection
Artificial Neural Networks in Pattern Recognition.
ISBN
—
ISSN
—
e-ISSN
—
Number of pages
6
Pages from-to
146-151
Publisher name
University of Florence
Place of publication
Florence
Event location
Florence [IT]
Event date
Sep 12, 2003
Type of event by nationality
WRD - Worldwide event
UT code for WoS article
—