Tightness of Upper Bounds on Rates of Neural-Network Approximation.
The result's identifiers
Result code in IS VaVaI
RIV/67985807:_____/01:06010054 - isvavai.cz (https://www.isvavai.cz/riv?ss=detail&h=RIV%2F67985807%3A_____%2F01%3A06010054)
Result on the web
—
DOI - Digital Object Identifier
—
Alternative languages
Result language
English
Original language name
Tightness of Upper Bounds on Rates of Neural-Network Approximation.
Original language description
Tightness of upper bounds on rates of neural-network approximation is investigated in the framework of variable-basis approximation. Conditions on a variable basis are given that rule out the possibility of improving such bounds beyond O(n^(-(1/2+1/d))), where n is the number of basis functions and d is the number of variables of the functions to be approximated. These conditions are satisfied by Lipschitz sigmoidal perceptrons.
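As an illustrative addition (not part of the original record), the kind of statement described above can be sketched in LaTeX under the standard variable-basis setting, where f is a function of d variables in a normed space, G is a basis (e.g. a set of perceptron units), and span_n(G) denotes linear combinations of at most n elements of G; these symbols and the constant c are assumptions of the sketch, not notation taken from the record:

% Typical upper bound on the rate of variable-basis approximation (sketch):
\[
  \inf_{g \in \operatorname{span}_n(G)} \| f - g \| \;\le\; c\, n^{-1/2}.
\]
% The tightness result described above states that, under the given conditions
% on the basis G (satisfied by Lipschitz sigmoidal perceptrons), such bounds
% cannot be improved beyond
\[
  O\!\left(n^{-\left(\tfrac{1}{2} + \tfrac{1}{d}\right)}\right),
\]
% where d is the number of variables of the functions to be approximated.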
Czech name
—
Czech description
—
Classification
Type
D - Article in proceedings
CEP classification
BA - General mathematics
OECD FORD branch
—
Result continuities
Project
GA201/00/1489: Soft Computing: Theoretical foundations and experiments (/en/project/GA201%2F00%2F1489)
Continuities
Z - Research plan (with a link to CEZ)
Others
Publication year
2001
Confidentiality
S - Complete and true data on the project are not subject to protection under special legal regulations
Data specific for result type
Article name in the collection
Artificial Neural Nets and Genetic Algorithms. Proceedings of the International Conference.
ISBN
3-211-83651-9
ISSN
—
e-ISSN
—
Number of pages
4
Pages from-to
35-38
Publisher name
Springer
Place of publication
Wien
Event location
Praha [CZ]
Event date
Apr 22, 2001
Type of event by nationality
WRD - Worldwide event
UT code for WoS article
—