Energy Complexity Model for Convolutional Neural Networks
Result identifiers
Result code in IS VaVaI
RIV/67985807:_____/23:00573373 (https://www.isvavai.cz/riv?ss=detail&h=RIV%2F67985807%3A_____%2F23%3A00573373)
Alternative codes found
RIV/00216305:26230/23:PU149415
Result on the web
https://dx.doi.org/10.1007/978-3-031-44204-9_16
DOI - Digital Object Identifier
<a href="http://dx.doi.org/10.1007/978-3-031-44204-9_16" target="_blank" >10.1007/978-3-031-44204-9_16</a>
Alternativní jazyky
Jazyk výsledku
angličtina
Název v původním jazyce
Energy Complexity Model for Convolutional Neural Networks
Description in original language
The energy efficiency of hardware implementations of convolutional neural networks (CNNs) is critical to their widespread deployment in low-power mobile devices. Recently, a plethora of methods have been proposed that provide energy-optimal mappings of CNNs onto diverse hardware accelerators. Their estimated power consumption depends on specific implementation details and hardware parameters, which does not allow for machine-independent exploration of CNN energy measures. In this paper, we introduce a simplified theoretical energy complexity model for CNNs, based only on a two-level memory hierarchy, that asymptotically captures all important sources of power consumption in different CNN hardware implementations. We calculate the energy complexity in this model for two common dataflows; according to statistical tests, it asymptotically fits very well the power consumption estimated by the Timeloop/Accelergy program for convolutional layers on the Simba and Eyeriss hardware platforms. The model opens the possibility of proving principal limits on the energy efficiency of CNN hardware accelerators.
Title in English
Energy Complexity Model for Convolutional Neural Networks
Description in English
The energy efficiency of hardware implementations of convolutional neural networks (CNNs) is critical to their widespread deployment in low-power mobile devices. Recently, a plethora of methods have been proposed that provide energy-optimal mappings of CNNs onto diverse hardware accelerators. Their estimated power consumption depends on specific implementation details and hardware parameters, which does not allow for machine-independent exploration of CNN energy measures. In this paper, we introduce a simplified theoretical energy complexity model for CNNs, based only on a two-level memory hierarchy, that asymptotically captures all important sources of power consumption in different CNN hardware implementations. We calculate the energy complexity in this model for two common dataflows; according to statistical tests, it asymptotically fits very well the power consumption estimated by the Timeloop/Accelergy program for convolutional layers on the Simba and Eyeriss hardware platforms. The model opens the possibility of proving principal limits on the energy efficiency of CNN hardware accelerators.
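As a rough illustration of the two-level memory model described in the abstract, the following Python sketch counts the two asymptotic energy terms for a single convolutional layer: the number of multiply-accumulate operations (computation energy) and a trivial lower bound on DRAM-buffer data transfers (memory energy). The parameter names, the stride-1 layer geometry, and the simple additive lower bound are illustrative assumptions, not the formulas or dataflows analyzed in the paper.

# Illustrative sketch only: assumed parameter names and a simplified additive
# transfer bound, not the paper's exact energy-complexity formulas or dataflows.

def conv_layer_energy_terms(h_out, w_out, c_in, c_out, k):
    """Return (computation term, memory-transfer lower bound) for one
    convolutional layer with stride 1 in a two-level memory model."""
    # Computation energy ~ number of multiply-accumulate operations.
    macs = h_out * w_out * c_out * c_in * k * k
    # Memory energy ~ DRAM<->buffer transfers; each input, weight, and output
    # value must cross the memory boundary at least once (trivial lower bound).
    inputs = (h_out + k - 1) * (w_out + k - 1) * c_in
    weights = c_in * c_out * k * k
    outputs = h_out * w_out * c_out
    return macs, inputs + weights + outputs


if __name__ == "__main__":
    macs, dram_accesses = conv_layer_energy_terms(h_out=56, w_out=56,
                                                  c_in=64, c_out=128, k=3)
    print(f"MACs: {macs:,}  DRAM-access lower bound: {dram_accesses:,}")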
Classification
Type
D - Article in proceedings
CEP field
—
OECD FORD field
10201 - Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)
Result continuities
Project
GA22-02067S: AppNeCo: Approximate Neurocomputing (/cs/project/GA22-02067S)
Continuities
I - Institutional support for the long-term conceptual development of a research organisation
Others
Year of implementation
2023
Data confidentiality code
S - Complete and true data on the project are not subject to protection under special legal regulations
Data specific to the result type
Name of the proceedings
Artificial Neural Networks and Machine Learning – ICANN 2023. Proceedings, Part X
ISBN
978-3-031-44203-2
ISSN
0302-9743
e-ISSN
—
Number of pages
13
Pages from-to
186-198
Publisher name
Springer
Place of publication
Cham
Event venue
Heraklion
Event date
26 September 2023
Event type by nationality
WRD - Worldwide event
UT WoS article code
001157311300016