Evolutionary Approximation and Neural Architecture Search
Result identifiers
Result code in IS VaVaI
RIV/00216305:26230/22:PU145799 (https://www.isvavai.cz/riv?ss=detail&h=RIV%2F00216305%3A26230%2F22%3APU145799)
Result on the web
https://link.springer.com/article/10.1007/s10710-022-09441-z
DOI - Digital Object Identifier
<a href="http://dx.doi.org/10.1007/s10710-022-09441-z" target="_blank" >10.1007/s10710-022-09441-z</a>
Alternative languages
Result language
English
Title in original language
Evolutionary Approximation and Neural Architecture Search
Description in original language
Automated neural architecture search (NAS) methods are now employed to routinely deliver high-quality neural network architectures for various challenging data sets and to reduce the designer's effort. NAS methods utilizing multi-objective evolutionary algorithms are especially useful when the objective is not only to minimize the network error but also to reduce the number of parameters (weights) or the power consumption of the inference phase. We propose a multi-objective NAS method based on Cartesian genetic programming for evolving convolutional neural networks (CNNs). The method allows approximate operations to be used in CNNs to reduce the power consumption of a target hardware implementation. During the NAS process, a suitable CNN architecture is evolved together with a selection of approximate multipliers to deliver the best trade-offs between accuracy, network size, and power consumption. The most suitable 8 × N-bit approximate multipliers are automatically selected from a library of approximate multipliers. The evolved CNNs are compared with CNNs developed by other NAS methods on the CIFAR-10 and SVHN benchmark problems.
Title in English
Evolutionary Approximation and Neural Architecture Search
Description in English
Automated neural architecture search (NAS) methods are now employed to routinely deliver high-quality neural network architectures for various challenging data sets and to reduce the designer's effort. NAS methods utilizing multi-objective evolutionary algorithms are especially useful when the objective is not only to minimize the network error but also to reduce the number of parameters (weights) or the power consumption of the inference phase. We propose a multi-objective NAS method based on Cartesian genetic programming for evolving convolutional neural networks (CNNs). The method allows approximate operations to be used in CNNs to reduce the power consumption of a target hardware implementation. During the NAS process, a suitable CNN architecture is evolved together with a selection of approximate multipliers to deliver the best trade-offs between accuracy, network size, and power consumption. The most suitable 8 × N-bit approximate multipliers are automatically selected from a library of approximate multipliers. The evolved CNNs are compared with CNNs developed by other NAS methods on the CIFAR-10 and SVHN benchmark problems.
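The description above outlines a multi-objective search that jointly evolves a CNN architecture and picks an approximate multiplier from a library, keeping the best trade-offs between error, size, and power. As a rough illustration of that general idea only (not the paper's CGP-based method), the following Python sketch uses an assumed toy architecture encoding, a hypothetical multiplier library, and stand-in objective functions; every name and value in it is illustrative.

import random

# Hypothetical library entries: (name, relative power factor, added error penalty).
MULTIPLIER_LIBRARY = [
    ("exact_8x8", 1.00, 0.000),
    ("approx_8x7", 0.80, 0.005),
    ("approx_8x6", 0.65, 0.015),
    ("approx_8x5", 0.50, 0.040),
]

def random_candidate():
    # A candidate = per-layer channel counts plus an index of the chosen multiplier.
    layers = [random.choice([16, 32, 64, 128]) for _ in range(random.randint(3, 6))]
    return {"layers": layers, "mult": random.randrange(len(MULTIPLIER_LIBRARY))}

def mutate(cand):
    # Perturb either one layer width or the selected approximate multiplier.
    child = {"layers": list(cand["layers"]), "mult": cand["mult"]}
    if random.random() < 0.5:
        i = random.randrange(len(child["layers"]))
        child["layers"][i] = random.choice([16, 32, 64, 128])
    else:
        child["mult"] = random.randrange(len(MULTIPLIER_LIBRARY))
    return child

def evaluate(cand):
    # Stand-in objectives (error, #parameters, power); a real system would train or
    # estimate the CNN's accuracy with the selected approximate multiplier.
    _, power_factor, err_penalty = MULTIPLIER_LIBRARY[cand["mult"]]
    params = sum(a * b * 9 for a, b in zip([3] + cand["layers"], cand["layers"]))
    error = 1.0 / sum(cand["layers"]) + err_penalty   # toy proxy, not a trained network
    power = params * power_factor
    return (error, params, power)

def dominates(a, b):
    # a dominates b if it is no worse in all objectives and strictly better in one.
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(population):
    scored = [(c, evaluate(c)) for c in population]
    return [c for c, f in scored
            if not any(dominates(g, f) for _, g in scored if g is not f)]

pop = [random_candidate() for _ in range(20)]
for _ in range(30):                                   # simple (mu + lambda)-style loop
    offspring = [mutate(random.choice(pop)) for _ in range(20)]
    pop = pareto_front(pop + offspring)

for c in pop:
    print(MULTIPLIER_LIBRARY[c["mult"]][0], c["layers"], evaluate(c))

In the actual method, the architecture is encoded as a Cartesian genetic programming genotype rather than a flat list of layer widths, and the objective values come from training and hardware power estimation; the sketch only shows how a Pareto front over accuracy, size, and power can drive the joint selection of architecture and multiplier.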
Classification
Type
Jimp - Article in a journal indexed in the Web of Science database
CEP field
—
OECD FORD field
10201 - Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)
Result linkages
Project
<a href="/cs/project/GA21-13001S" target="_blank" >GA21-13001S: Automatizovaný návrh hardwarových akcelerátorů pro strojového učení zohledňující výpočetní zdroje</a><br>
Linkages
P - Research and development project financed from public sources (with a link to CEP)
Others
Year of publication
2022
Data confidentiality code
S - Complete and true data on the project are not subject to protection under special legal regulations
Data specific to the result type
Journal title
Genetic Programming and Evolvable Machines
ISSN
1389-2576
e-ISSN
1573-7632
Journal volume
23
Issue within the volume
3
Country of the journal publisher
US - United States of America
Number of pages
24
Pages from-to
351-374
UT WoS code of the article
000810226500001
EID of the result in the Scopus database
2-s2.0-85131746167