Limitations of Shallow Networks Representing Finite Mappings
Result identifiers
Result code in IS VaVaI
<a href="https://www.isvavai.cz/riv?ss=detail&h=RIV%2F67985807%3A_____%2F19%3A00485613" target="_blank" >RIV/67985807:_____/19:00485613 - isvavai.cz</a>
Result on the web
<a href="http://dx.doi.org/10.1007/s00521-018-3680-1" target="_blank" >http://dx.doi.org/10.1007/s00521-018-3680-1</a>
DOI - Digital Object Identifier
<a href="http://dx.doi.org/10.1007/s00521-018-3680-1" target="_blank" >10.1007/s00521-018-3680-1</a>
Alternative languages
Result language
English
Title in original language
Limitations of Shallow Networks Representing Finite Mappings
Description in original language
Limitations of the capabilities of shallow networks to efficiently compute real-valued functions on finite domains are investigated. Efficiency is studied in terms of network sparsity and its approximate measures. It is shown that when a dictionary of computational units is not sufficiently large, the computation of almost any uniformly randomly chosen function either represents a well-conditioned task performed by a large network or an ill-conditioned task performed by a network of moderate size. The probabilistic results are complemented by a concrete example of a class of functions that cannot be efficiently computed by shallow perceptron networks. The class is constructed using pseudo-noise sequences, which have many features of random sequences but can be generated using special polynomials. Connections to the No Free Lunch Theorem and to the central paradox of coding theory are discussed.
Title in English
Limitations of Shallow Networks Representing Finite Mappings
Description in English
Limitations of the capabilities of shallow networks to efficiently compute real-valued functions on finite domains are investigated. Efficiency is studied in terms of network sparsity and its approximate measures. It is shown that when a dictionary of computational units is not sufficiently large, the computation of almost any uniformly randomly chosen function either represents a well-conditioned task performed by a large network or an ill-conditioned task performed by a network of moderate size. The probabilistic results are complemented by a concrete example of a class of functions that cannot be efficiently computed by shallow perceptron networks. The class is constructed using pseudo-noise sequences, which have many features of random sequences but can be generated using special polynomials. Connections to the No Free Lunch Theorem and to the central paradox of coding theory are discussed.
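The description mentions that the hard class is built from pseudo-noise sequences that behave like random sequences yet are generated by special polynomials. In the standard construction these are maximum-length (m-)sequences produced by a linear-feedback shift register whose feedback polynomial is primitive over GF(2); treating that as the intended construction is an assumption here, and the function name, polynomial, and seed below are illustrative choices, not taken from the paper. A minimal sketch in Python:

```python
# Minimal sketch: bits of a pseudo-noise (maximum-length) sequence from a
# linear-feedback shift register (LFSR). The degree-4 feedback polynomial
# x^4 + x + 1 is a standard primitive polynomial over GF(2), used here only
# as an assumed example.

def pn_sequence(poly, seed, length):
    """Return `length` output bits of the LFSR defined by `poly`.

    poly   -- exponents of the nonzero coefficients, highest first;
              (4, 1, 0) encodes x^4 + x + 1
    seed   -- nonzero initial register fill of n = poly[0] bits, oldest first
    length -- number of bits to emit
    """
    n = poly[0]
    state = list(seed)
    assert len(state) == n and any(state), "seed must be a nonzero n-bit fill"
    bits = []
    for _ in range(length):
        bits.append(state[0])               # emit the oldest bit
        feedback = 0
        for e in poly[1:]:                  # lower-order terms give the recurrence
            feedback ^= state[e]            # s_t = XOR of the tapped past bits
        state = state[1:] + [feedback]      # drop the oldest bit, append feedback
    return bits

# For a primitive polynomial of degree n the period is 2**n - 1 and every
# nonzero n-bit window occurs exactly once per period, which is why the
# sequence passes many randomness tests despite its simple generator.
print(pn_sequence((4, 1, 0), (1, 0, 0, 0), 15))
# -> [1, 0, 0, 0, 1, 0, 0, 1, 1, 0, 1, 0, 1, 1, 1]  (one full period)
```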
Classification
Type
J<sub>imp</sub> - Article in a journal indexed in the Web of Science database
CEP field
—
OECD FORD field
10201 - Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)
Result linkages
Project
The result was created during the implementation of multiple projects. More information is available in the Projects tab.
Linkages
I - Institutional support for the long-term conceptual development of a research organization
Other
Year of implementation
2019
Data confidentiality code
S - Complete and true data on the project are not subject to protection under special legal regulations
Data specific to the result type
Name of the periodical
Neural Computing & Applications
ISSN
0941-0643
e-ISSN
—
Volume of the periodical
31
Issue of the periodical within the volume
6
Country of the publisher of the periodical
US - United States of America
Number of pages of the result
10
Pages from-to
1783-1792
UT WoS code of the article
000470746700008
EID of the result in the Scopus database
2-s2.0-85052492938