Sparsity and Complexity of Networks Computing Highly-Varying Functions
Result description
Approximative measures of network sparsity, expressed in terms of norms tailored to dictionaries of computational units, are investigated. Lower bounds on these norms of real-valued functions on finite domains are derived. The bounds are proven by combining the concentration of measure property of high-dimensional spaces with a characterization of dictionaries of computational units in terms of their capacities and coherence, measured by their covering numbers. The results are applied to dictionaries used in neurocomputing, which have power-type covering numbers. The probabilistic results are illustrated by a concrete construction of a class of functions whose computation by perceptron networks either requires a large number of units or is unstable due to large output weights.
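The norms mentioned above can be made concrete by the standard notion of variation with respect to a dictionary (a sketch based on common usage in this literature; the record itself does not give the definition). For a dictionary G of computational units, the G-variation of a function f is

```latex
\|f\|_{G} \;=\; \inf \Bigl\{\, c > 0 \;:\; \tfrac{f}{c} \in \operatorname{cl\,conv}\bigl(G \cup (-G)\bigr) \Bigr\},
```

so that for any network output $f = \sum_{i=1}^{n} w_i g_i$ with units $g_i \in G$ one has $\|f\|_{G} \le \sum_{i=1}^{n} |w_i|$. A lower bound on $\|f\|_{G}$ therefore forces every shallow network computing f to have either many units or large output weights, which is the trade-off described in the abstract.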
Keywords
Shallow and deep networks; Model complexity; Sparsity; Highly-varying functions; Covering numbers; Dictionaries of computational units; Perceptrons
The result's identifiers
Result code in IS VaVaI
Result on the web
DOI - Digital Object Identifier
Alternative languages
Result language
English
Original language name
Sparsity and Complexity of Networks Computing Highly-Varying Functions
Original language description
Approximative measures of network sparsity, expressed in terms of norms tailored to dictionaries of computational units, are investigated. Lower bounds on these norms of real-valued functions on finite domains are derived. The bounds are proven by combining the concentration of measure property of high-dimensional spaces with a characterization of dictionaries of computational units in terms of their capacities and coherence, measured by their covering numbers. The results are applied to dictionaries used in neurocomputing, which have power-type covering numbers. The probabilistic results are illustrated by a concrete construction of a class of functions whose computation by perceptron networks either requires a large number of units or is unstable due to large output weights.
Czech name
—
Czech description
—
Classification
Type
D - Article in proceedings
CEP classification
—
OECD FORD branch
10201 - Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)
Result continuities
Project
GA18-23827S: Capabilities and limitations of shallow and deep networks
Continuities
P - Research and development project financed from public resources (with a link to CEP)
Others
Publication year
2018
Confidentiality
S - Complete and true data on the project are not subject to protection under special legal regulations
Data specific for result type
Article name in the collection
Artificial Neural Networks and Machine Learning – ICANN 2018. Proceedings, Part III
ISBN
978-3-030-01423-0
ISSN
0302-9743
e-ISSN
—
Number of pages
10
Pages from-to
534-543
Publisher name
Springer
Place of publication
Cham
Event location
Rhodes
Event date
Oct 4, 2018
Type of event by nationality
WRD - Worldwide event
UT code for WoS article
—
Basic information
Result type
D - Article in proceedings
OECD FORD
Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)
Year of implementation
2018