Energy-Time Tradeoff in Recurrent Neural Nets

Result identifiers

  • Result code in IS VaVaI

    <a href="https://www.isvavai.cz/riv?ss=detail&h=RIV%2F67985807%3A_____%2F15%3A00472477" target="_blank" >RIV/67985807:_____/15:00472477 - isvavai.cz</a>

  • Result on the web

    <a href="http://dx.doi.org/10.1007/978-3-319-09903-3_3" target="_blank" >http://dx.doi.org/10.1007/978-3-319-09903-3_3</a>

  • DOI - Digital Object Identifier

    <a href="http://dx.doi.org/10.1007/978-3-319-09903-3_3" target="_blank" >10.1007/978-3-319-09903-3_3</a>

Alternative languages

  • Result language

    English

  • Original language name

    Energy-Time Tradeoff in Recurrent Neural Nets

  • Original language description

    In this chapter, we deal with the energy complexity of perceptron networks, a measure inspired by the fact that neural activity in the brain is quite sparse (only about 1% of neurons fire at any time). This complexity measure has recently been introduced for feedforward architectures (i.e., threshold circuits). We briefly survey tradeoff results that relate the energy to other complexity measures such as the size and depth of threshold circuits. We generalize the energy complexity to recurrent architectures, where it counts the number of simultaneously active neurons at any time instant of a computation. We present an energy-time tradeoff result for recurrent neural nets, which are known to be computationally as powerful as finite automata. In particular, we show the main ideas of simulating any deterministic finite automaton by a low-energy, optimal-size neural network. In addition, we present a lower bound on the energy of such a simulation (within a certain range of time overhead), which implies that the energy demands in a fixed-size network increase exponentially with the frequency of presenting the input bits. (A minimal sketch illustrating this energy measure is given after this list.)

  • Czech name

  • Czech description
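
A minimal sketch of the energy measure described above, written in Python. This is not the chapter's construction: the network, weights, thresholds, and input stream are invented for illustration. It only shows how the energy of one computation of a discrete-time recurrent threshold (perceptron) network can be counted as the maximum number of neurons that fire at the same time instant.

    # Hypothetical example; weights, thresholds, and inputs are invented for illustration.
    import numpy as np

    def run_threshold_net(W, b, x0, inputs):
        """Iterate a discrete-time recurrent threshold (perceptron) network.

        W is the recurrent weight matrix, b the threshold vector, x0 the initial
        binary state, and inputs a sequence of external input vectors.
        Returns the list of binary states visited during the computation.
        """
        states = [np.asarray(x0, dtype=int)]
        for u in inputs:
            pre = W @ states[-1] + np.asarray(u) - b   # pre-activation minus threshold
            states.append((pre >= 0).astype(int))      # Heaviside step activation
        return states

    def energy(states):
        """Energy of one computation: maximum number of simultaneously active neurons."""
        return max(int(s.sum()) for s in states)

    # Toy example: two mutually inhibiting neurons driven by a short input bit stream.
    W = np.array([[0, -1],
                  [-1, 0]])
    b = np.array([0.5, 0.5])
    trace = run_threshold_net(W, b, x0=[1, 0], inputs=[[1, 0], [0, 1], [0, 0]])
    print("states:", [s.tolist() for s in trace])      # [[1, 0], [1, 0], [0, 0], [0, 0]]
    print("energy:", energy(trace))                    # 1 for this trace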

Classification

  • Type

    D - Article in proceedings

  • CEP classification

    IN - Informatics

  • OECD FORD branch

Result continuities

  • Project

    <a href="/en/project/GBP202%2F12%2FG061" target="_blank" >GBP202/12/G061: Center of excellence - Institute for theoretical computer science (CE-ITI)</a><br>

  • Continuities

    I - Institutional support for the long-term conceptual development of a research organization

Others

  • Publication year

    2015

  • Confidentiality

    S - Complete and accurate data on the project are not subject to protection under special legal regulations

Data specific for result type

  • Article name in the collection

    Artificial Neural Networks. Methods and Applications in Bio-/Neuroinformatics

  • ISBN

    978-3-319-09902-6

  • ISSN

    2193-9349

  • e-ISSN

  • Number of pages

    12

  • Pages from-to

    51-62

  • Publisher name

    Springer

  • Place of publication

    Cham

  • Event location

    Sofia

  • Event date

    Sep 10, 2013

  • Type of event by nationality

    WRD - Worldwide event

  • UT code for WoS article

    000380528700003