Neural Networks Learning as Approximate Optimization.
The result's identifiers
Result code in IS VaVaI
RIV/67985807:_____/03:06030183 - isvavai.cz (https://www.isvavai.cz/riv?ss=detail&h=RIV%2F67985807%3A_____%2F03%3A06030183)
Result on the web
—
DOI - Digital Object Identifier
—
Alternative languages
Result language
English
Original language name
Neural Networks Learning as Approximate Optimization.
Original language description
Learning from data is studied in the framework of approximate minimization of regularized empirical error functionals. Estimates are derived of the speed of convergence of infima achievable over approximations of an admissible set to the global infimum. The results are applied to empirical error functionals regularized using stabilizers defined as squares of norms in reproducing kernel Hilbert spaces.
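For the square loss, minimizing an empirical error functional stabilized by the square of an RKHS norm reduces to kernel ridge regression via the representer theorem. The Python sketch below is only an illustration of that setup, not material from the paper; the Gaussian kernel, the regularization weight gamma, and all function names are assumptions chosen for the example.

import numpy as np

def gaussian_kernel(X1, X2, width=1.0):
    # Gaussian (RBF) kernel matrix; this kernel choice is an assumption for illustration.
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * width ** 2))

def fit_regularized(X, y, gamma=1e-2, width=1.0):
    # Minimize (1/m) * sum_i (f(x_i) - y_i)^2 + gamma * ||f||_K^2 over the RKHS
    # induced by the kernel. By the representer theorem the minimizer has the form
    # f(x) = sum_i c_i K(x, x_i), and the coefficients solve (K + gamma*m*I) c = y.
    m = len(y)
    K = gaussian_kernel(X, X, width)
    return np.linalg.solve(K + gamma * m * np.eye(m), y)

def predict(X_train, c, X_new, width=1.0):
    # Evaluate the learned function at new inputs.
    return gaussian_kernel(X_new, X_train, width) @ c

# Toy usage: noisy samples of sin(x) on [0, 2*pi].
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 2.0 * np.pi, size=(30, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(30)
c = fit_regularized(X, y)
print(predict(X, c, np.array([[np.pi / 2]])))  # close to sin(pi/2) = 1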
Czech name
—
Czech description
—
Classification
Type
D - Article in proceedings
CEP classification
BA - General mathematics
OECD FORD branch
—
Result continuities
Project
GA201/02/0428: Nonlinear approximation with variable basis and neural networks (/en/project/GA201%2F02%2F0428)
Continuities
P - Research and development project financed from public sources (with a link to CEP)
Z - Research plan (with a link to CEZ)
Others
Publication year
2003
Confidentiality
S - Complete and true data on the project are not subject to protection under special legal regulations
Data specific for result type
Article name in the collection
Artificial Neural Nets and Genetic Algorithms.
ISBN
3-211-00743-1
ISSN
—
e-ISSN
—
Number of pages
5
Pages from-to
53-57
Publisher name
Springer-Verlag
Place of publication
Wien
Event location
Roanne [FR]
Event date
Apr 23, 2003
Type of event by nationality
WRD - Worldwide event
UT code for WoS article
—