Analog Neuron Hierarchy

The result's identifiers

  • Result code in IS VaVaI

    <a href="https://www.isvavai.cz/riv?ss=detail&h=RIV%2F67985807%3A_____%2F20%3A00507515" target="_blank" >RIV/67985807:_____/20:00507515 - isvavai.cz</a>

  • Result on the web

    <a href="http://dx.doi.org/10.1016/j.neunet.2020.05.006" target="_blank" >http://dx.doi.org/10.1016/j.neunet.2020.05.006</a>

  • DOI - Digital Object Identifier

    <a href="http://dx.doi.org/10.1016/j.neunet.2020.05.006" target="_blank" >10.1016/j.neunet.2020.05.006</a>

Alternative languages

  • Result language

    English

  • Original language name

    Analog Neuron Hierarchy

  • Original language description

    In order to refine the analysis of the computational power of discrete-time recurrent neural networks (NNs) between the binary-state NNs, which are equivalent to finite automata (level 3 in the Chomsky hierarchy), and the analog-state NNs with rational weights, which are Turing complete (Chomsky level 0), we study an intermediate model αANN of a binary-state NN that is extended with α ≥ 0 extra analog-state neurons. For rational weights, we establish an analog neuron hierarchy 0ANNs ⊂ 1ANNs ⊂ 2ANNs ⊆ 3ANNs and separate its first two levels. In particular, 0ANNs coincide with the binary-state NNs (Chomsky level 3), which form a proper subset of 1ANNs; the 1ANNs accept at most context-sensitive languages (Chomsky level 1), including some non-context-free ones (above Chomsky level 2). We prove that the deterministic (context-free) language L_# = { 0ⁿ1ⁿ | n ≥ 1 } cannot be recognized by any 1ANN, even with real weights. In contrast, we show that deterministic pushdown automata accepting deterministic languages can be simulated by 2ANNs with rational weights, which thus constitute a proper superset of 1ANNs. Finally, we prove that the analog neuron hierarchy collapses to 3ANNs by showing that any Turing machine can be simulated by a 3ANN with rational weights, with linear-time overhead.

  • Czech name

  • Czech description
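
    The language L_# = { 0ⁿ1ⁿ | n ≥ 1 } from the description is a classic deterministic context-free language: a deterministic pushdown automaton recognizes it, and because its stack only ever holds copies of one symbol, a counter suffices. A minimal sketch (the function name `accepts_L_hash` is my own, not from the paper):

    ```python
    def accepts_L_hash(w: str) -> bool:
        """One-pass deterministic recognizer for L_# = { 0^n 1^n | n >= 1 },
        simulating a DPDA whose stack holds only one kind of symbol."""
        stack = 0          # number of unmatched 0s pushed so far
        seen_one = False   # once a 1 appears, no further 0 is allowed
        for c in w:
            if c == "0":
                if seen_one:       # a 0 after a 1 -> reject deterministically
                    return False
                stack += 1         # push
            elif c == "1":
                if stack == 0:     # pop from an empty stack -> reject
                    return False
                seen_one = True
                stack -= 1         # pop
            else:
                return False       # symbol outside the alphabet {0, 1}
        return seen_one and stack == 0   # all 0s matched, and n >= 1
    ```

    Every transition is determined by the current input symbol and the counter state, which is what makes the language deterministic context-free; the paper's result is that even this simple language lies beyond the power of 1ANNs.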

Classification

  • Type

    J<sub>imp</sub> - Article in a specialist periodical included in the Web of Science database

  • CEP classification

  • OECD FORD branch

    10201 - Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)

Result continuities

  • Project

    <a href="/en/project/GA19-05704S" target="_blank" >GA19-05704S: FoNeCo: Analytical Foundations of Neurocomputing</a>

  • Continuities

    I - Institutional support for the long-term conceptual development of a research organization

Others

  • Publication year

    2020

  • Confidentiality

    S - Complete and true data on the project are not subject to protection under special legal regulations

Data specific for result type

  • Name of the periodical

    Neural Networks

  • ISSN

    0893-6080

  • e-ISSN

  • Volume of the periodical

    128

  • Issue of the periodical within the volume

    August 2020

  • Country of publishing house

    GB - UNITED KINGDOM

  • Number of pages

    17

  • Pages from-to

    199-215

  • UT code for WoS article

    000567812200017

  • EID of the result in the Scopus database

    2-s2.0-85084938909