Energy-Time Tradeoff in Recurrent Neural Nets
Result identifiers
Result code in IS VaVaI
RIV/67985807:_____/15:00472477 (https://www.isvavai.cz/riv?ss=detail&h=RIV%2F67985807%3A_____%2F15%3A00472477)
Result on the web
http://dx.doi.org/10.1007/978-3-319-09903-3_3
DOI - Digital Object Identifier
10.1007/978-3-319-09903-3_3
Alternative languages
Result language
English
Title in the original language
Energy-Time Tradeoff in Recurrent Neural Nets
Result description in the original language
In this chapter, we deal with the energy complexity of perceptron networks, which has been inspired by the fact that the activity of neurons in the brain is quite sparse (only about 1% of neurons fire at a time). This complexity measure has recently been introduced for feedforward architectures (i.e., threshold circuits). We briefly survey the tradeoff results that relate energy to other complexity measures such as the size and depth of threshold circuits. We generalize energy complexity to recurrent architectures, where it counts the number of simultaneously active neurons at any time instant of a computation. We present our energy-time tradeoff result for recurrent neural nets, which are known to be computationally as powerful as finite automata. In particular, we show the main ideas of simulating any deterministic finite automaton by a low-energy, optimal-size neural network. In addition, we present a lower bound on the energy of such a simulation (within a certain range of time overhead), which implies that the energy demands of a fixed-size network increase exponentially with the frequency of presenting the input bits.
Title in English
Energy-Time Tradeoff in Recurrent Neural Nets
Result description in English
In this chapter, we deal with the energy complexity of perceptron networks, which has been inspired by the fact that the activity of neurons in the brain is quite sparse (only about 1% of neurons fire at a time). This complexity measure has recently been introduced for feedforward architectures (i.e., threshold circuits). We briefly survey the tradeoff results that relate energy to other complexity measures such as the size and depth of threshold circuits. We generalize energy complexity to recurrent architectures, where it counts the number of simultaneously active neurons at any time instant of a computation. We present our energy-time tradeoff result for recurrent neural nets, which are known to be computationally as powerful as finite automata. In particular, we show the main ideas of simulating any deterministic finite automaton by a low-energy, optimal-size neural network. In addition, we present a lower bound on the energy of such a simulation (within a certain range of time overhead), which implies that the energy demands of a fixed-size network increase exponentially with the frequency of presenting the input bits.
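As an illustrative formalization, not taken from the chapter itself, the energy measure described above can be written as the maximum number of simultaneously firing units over a computation; the symbols s (network size), T (computation length) and y_j^{(t)} (binary output of unit j at time t) are notation introduced only for this sketch:

% Hedged sketch of the recurrent energy measure described in the abstract;
% s, T and y_j^{(t)} are assumed notation, not the chapter's own symbols.
E \;=\; \max_{0 \le t \le T} \sum_{j=1}^{s} y_j^{(t)}, \qquad y_j^{(t)} \in \{0,1\}.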
Classification
Type
D - Paper in conference proceedings
CEP field
IN - Informatics
OECD FORD field
—
Result continuities
Project
GBP202/12/G061: Center of Excellence - Institute for Theoretical Computer Science (CE-ITI)
Continuities
I - Institutional support for the long-term conceptual development of a research organisation
Others
Year of publication
2015
Data confidentiality code
S - Complete and truthful data on the project are not subject to protection under special legal regulations
Data specific to the result type
Proceedings title
Artificial Neural Networks. Methods and Applications in Bio-/Neuroinformatics
ISBN
978-3-319-09902-6
ISSN
2193-9349
e-ISSN
—
Number of pages
12
Pages from-to
51-62
Publisher name
Springer
Place of publication
Cham
Event location
Sofia
Event date
10. 9. 2013
Event type by nationality
WRD - Worldwide event
UT WoS article code
000380528700003