Prediction of Inference Energy on CNN Accelerators Supporting Approximate Circuits
Result identifiers
Result code in IS VaVaI
<a href="https://www.isvavai.cz/riv?ss=detail&h=RIV%2F00216305%3A26230%2F23%3APU148180" target="_blank" >RIV/00216305:26230/23:PU148180 - isvavai.cz</a>
Result on the web
<a href="https://ieeexplore.ieee.org/document/10139724" target="_blank" >https://ieeexplore.ieee.org/document/10139724</a>
DOI - Digital Object Identifier
<a href="http://dx.doi.org/10.1109/DDECS57882.2023.10139724" target="_blank" >10.1109/DDECS57882.2023.10139724</a>
Alternative languages
Result language
English
Title in original language
Prediction of Inference Energy on CNN Accelerators Supporting Approximate Circuits
Result description in original language
Design methodologies developed for optimizing hardware implementations of convolutional neural networks (CNNs) or searching for new hardware-aware neural architectures rely on fast and reliable estimation of key hardware parameters, such as the energy needed for one inference. Utilizing approximate circuits in hardware accelerators of CNNs confronts designers with new problems during simulation: commonly used tools (TimeLoop, Accelergy, Maestro) do not support approximate arithmetic operations. This work addresses the fast and efficient prediction of the energy consumed by hardware accelerators of CNNs that utilize approximate circuits such as approximate multipliers. First, we extend the state-of-the-art software frameworks TimeLoop and Accelergy to predict the inference energy when exact multipliers are replaced with various approximate implementations. The energies obtained using the modified tools are then considered the ground-truth (reference) values. Then, using two accelerators (Eyeriss and Simba) and two types of networks (CNNs generated by EvoApproxNAS and standard ResNet CNNs), we propose and evaluate two predictors of inference energy. We conclude that a simple predictor based on summing the energies needed for all multiplications correlates highly with the reference values if the CNN's architecture is fixed. For complex CNNs with variable architectures, typically generated by neural architecture search algorithms, a more sophisticated predictor based on a machine learning model has to be employed. The proposed predictors are 420-533× faster than the reference solutions.
Title in English
Prediction of Inference Energy on CNN Accelerators Supporting Approximate Circuits
Result description in English
Design methodologies developed for optimizing hardware implementations of convolutional neural networks (CNNs) or searching for new hardware-aware neural architectures rely on fast and reliable estimation of key hardware parameters, such as the energy needed for one inference. Utilizing approximate circuits in hardware accelerators of CNNs confronts designers with new problems during simulation: commonly used tools (TimeLoop, Accelergy, Maestro) do not support approximate arithmetic operations. This work addresses the fast and efficient prediction of the energy consumed by hardware accelerators of CNNs that utilize approximate circuits such as approximate multipliers. First, we extend the state-of-the-art software frameworks TimeLoop and Accelergy to predict the inference energy when exact multipliers are replaced with various approximate implementations. The energies obtained using the modified tools are then considered the ground-truth (reference) values. Then, using two accelerators (Eyeriss and Simba) and two types of networks (CNNs generated by EvoApproxNAS and standard ResNet CNNs), we propose and evaluate two predictors of inference energy. We conclude that a simple predictor based on summing the energies needed for all multiplications correlates highly with the reference values if the CNN's architecture is fixed. For complex CNNs with variable architectures, typically generated by neural architecture search algorithms, a more sophisticated predictor based on a machine learning model has to be employed. The proposed predictors are 420-533× faster than the reference solutions.
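The simple predictor mentioned in the abstract (summing the energies of all multiplications) can be sketched as follows. This is an illustrative reconstruction, not the paper's implementation: the layer shapes, the per-multiplication energy of 0.5 pJ, and all function names are assumptions.

```python
def conv_mult_count(out_h, out_w, out_ch, in_ch, k_h, k_w):
    """Number of multiplications in one convolutional layer (one per MAC)."""
    return out_h * out_w * out_ch * in_ch * k_h * k_w

def predict_energy_pj(layers, mult_energy_pj):
    """Simple predictor: total multiplication count times the energy of one
    (approximate) multiplication, in picojoules. Assumed form, not the
    paper's exact model."""
    return sum(conv_mult_count(*layer) for layer in layers) * mult_energy_pj

# Toy two-layer CNN; 0.5 pJ per approximate 8-bit multiplication is assumed.
layers = [
    (32, 32, 16, 3, 3, 3),   # (out_h, out_w, out_ch, in_ch, k_h, k_w)
    (16, 16, 32, 16, 3, 3),
]
energy = predict_energy_pj(layers, mult_energy_pj=0.5)
```

Per the abstract, such a sum tracks the reference energy well only when the CNN architecture is fixed; for variable architectures a learned predictor is needed.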
Classification
Type
D - Article in proceedings
CEP classification
—
OECD FORD branch
10201 - Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)
Result continuities
Project
<a href="/cs/project/GA22-02067S" target="_blank" >GA22-02067S: AppNeCo: Approximate Neurocomputing</a><br>
Continuities
P - Research and development project financed from public sources (with a link to CEP)
Others
Publication year
2023
Data confidentiality code
S - Complete and true data on the project are not subject to protection under special legal regulations
Data specific for result type
Proceedings name
2023 26th International Symposium on Design and Diagnostics of Electronic Circuits and Systems
ISBN
979-8-3503-3277-3
ISSN
—
e-ISSN
—
Number of result pages
6
Pages from-to
45-50
Publisher name
Institute of Electrical and Electronics Engineers
Place of publication
Tallinn
Event location
Tallinn
Event date
3 May 2023
Event type by nationality
WRD - Worldwide event
UT WoS article code
001012062000008