Limitations of Shallow Networks Representing Finite Mappings

The result's identifiers

  • Result code in IS VaVaI

    RIV/67985807:_____/19:00485613 - isvavai.cz (https://www.isvavai.cz/riv?ss=detail&h=RIV%2F67985807%3A_____%2F19%3A00485613)

  • Result on the web

    http://dx.doi.org/10.1007/s00521-018-3680-1

  • DOI - Digital Object Identifier

    10.1007/s00521-018-3680-1

Alternative languages

  • Result language

    English

  • Original language name

    Limitations of Shallow Networks Representing Finite Mappings

  • Original language description

    Limitations of capabilities of shallow networks to efficiently compute real-valued functions on finite domains are investigated. Efficiency is studied in terms of network sparsity and its approximate measures. It is shown that when a dictionary of computational units is not sufficiently large, computation of almost any uniformly randomly chosen function either represents a well-conditioned task performed by a large network or an ill-conditioned task performed by a network of a moderate size. The probabilistic results are complemented by a concrete example of a class of functions which cannot be efficiently computed by shallow perceptron networks. The class is constructed using pseudo-noise sequences which have many features of random sequences but can be generated using special polynomials. Connections to the No Free Lunch Theorem and the central paradox of coding theory are discussed.

  • Czech name

  • Czech description
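
The original language description above notes that the hard-to-compute class of functions is constructed from pseudo-noise sequences, which can be generated using special (primitive) polynomials. The following is a minimal Python sketch of the standard way such sequences are produced with a linear-feedback shift register (LFSR); it is illustrative only and not taken from the paper, and the tap positions, initial state, and the name lfsr_pn_sequence are assumptions.

    def lfsr_pn_sequence(taps, state):
        """Return one full period of the pseudo-noise (m-)sequence of an LFSR.

        taps  -- feedback tap positions, counted 1..n from the output end,
                 corresponding to the nonzero terms of a primitive polynomial
        state -- nonzero initial register contents as a list of n bits
        """
        n = len(state)
        period = 2 ** n - 1
        state = list(state)
        out = []
        for _ in range(period):
            out.append(state[-1])            # the output bit is the oldest bit
            fb = 0
            for t in taps:                   # XOR of the tapped bits
                fb ^= state[-t]
            state = [fb] + state[:-1]        # shift and feed the new bit in
        return out

    # Illustrative degree-4 primitive feedback polynomial, taps at positions 4 and 1:
    # the register cycles through all 15 nonzero states before repeating.
    seq = lfsr_pn_sequence(taps=[4, 1], state=[1, 0, 0, 0])
    print(seq)                               # period 2**4 - 1 = 15
    print(sum(seq), len(seq) - sum(seq))     # balance property: 8 ones, 7 zeros

For a degree-n primitive feedback polynomial the register visits every nonzero state once per period, so the output contains 2^(n-1) ones and 2^(n-1) - 1 zeros; this balance, like the near-flat autocorrelation of m-sequences, is one of the random-like features the abstract alludes to.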

Classification

  • Type

    Jimp - Article in a specialist periodical, which is included in the Web of Science database

  • CEP classification

  • OECD FORD branch

    10201 - Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)

Result continuities

  • Project

    The result was created during the realisation of more than one project. More information is available in the Projects tab.

  • Continuities

    I - Institutional support for the long-term conceptual development of a research organisation

Others

  • Publication year

    2019

  • Confidentiality

    S - Complete and true data on the project are not subject to protection under special legal regulations

Data specific for result type

  • Name of the periodical

    Neural Computing & Applications

  • ISSN

    0941-0643

  • e-ISSN

  • Volume of the periodical

    31

  • Issue of the periodical within the volume

    6

  • Country of publishing house

    US - UNITED STATES

  • Number of pages

    10

  • Pages from-to

    1783-1792

  • UT code for WoS article

    000470746700008

  • EID of the result in the Scopus database

    2-s2.0-85052492938