Hyperparameter Tuning in Echo State Networks
Result identifiers
Result code in IS VaVaI
RIV/00216208:11320/22:10455492 (https://www.isvavai.cz/riv?ss=detail&h=RIV%2F00216208%3A11320%2F22%3A10455492)
Result on the web
https://doi.org/10.1145/3512290.3528721
DOI - Digital Object Identifier
10.1145/3512290.3528721 (http://dx.doi.org/10.1145/3512290.3528721)
Alternative languages
Result language
English
Original language title
Hyperparameter Tuning in Echo State Networks
Original language description
Echo State Networks are a type of recurrent neural network with a large, randomly generated reservoir and a small number of readout connections trained via linear regression. The most common reservoir topology is a fully connected network of up to thousands of neurons. Over the years, researchers have introduced a variety of alternative reservoir topologies, such as a circular network or a linear path of connections. When comparing the performance of different topologies or other architectural changes, it is necessary to tune the hyperparameters for each topology separately, since their properties may differ significantly. Hyperparameter tuning is usually carried out manually by selecting the best-performing set of parameters from a sparse grid of predefined combinations. Unfortunately, this approach may lead to underperforming configurations, especially for sensitive topologies. We propose an alternative approach to hyperparameter tuning based on the Covariance Matrix Adaptation Evolution Strategy (CMA-ES). Using this approach, we have improved multiple topology comparison results by orders of magnitude, suggesting that topology alone does not play as important a role as properly tuned hyperparameters.
English title
Hyperparameter Tuning in Echo State Networks
English description
Echo State Networks are a type of recurrent neural network with a large, randomly generated reservoir and a small number of readout connections trained via linear regression. The most common reservoir topology is a fully connected network of up to thousands of neurons. Over the years, researchers have introduced a variety of alternative reservoir topologies, such as a circular network or a linear path of connections. When comparing the performance of different topologies or other architectural changes, it is necessary to tune the hyperparameters for each topology separately, since their properties may differ significantly. Hyperparameter tuning is usually carried out manually by selecting the best-performing set of parameters from a sparse grid of predefined combinations. Unfortunately, this approach may lead to underperforming configurations, especially for sensitive topologies. We propose an alternative approach to hyperparameter tuning based on the Covariance Matrix Adaptation Evolution Strategy (CMA-ES). Using this approach, we have improved multiple topology comparison results by orders of magnitude, suggesting that topology alone does not play as important a role as properly tuned hyperparameters.
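As an illustration of the approach described in the abstract, the following sketch (not the authors' code) tunes the main ESN hyperparameters (spectral radius, input scaling, leak rate, ridge regularization) with CMA-ES via the third-party Python package `cma`, on a toy one-step-ahead prediction task. The reservoir size, search ranges, toy signal, and the helper `run_esn` are illustrative assumptions, not the paper's experimental setup.

```python
# Minimal sketch, assuming the `cma` package (pip install cma) and a plain
# fully connected reservoir; the paper evaluates several topologies and tasks.
import numpy as np
import cma

rng = np.random.default_rng(0)

# Toy target signal: noisy sine wave; the task is one-step-ahead prediction.
t = np.linspace(0, 60 * np.pi, 6000)
signal = np.sin(t) + 0.1 * rng.standard_normal(t.size)
u_train, y_train = signal[:3999], signal[1:4000]
u_valid, y_valid = signal[4000:5999], signal[4001:6000]

N = 200        # reservoir size
washout = 100  # initial states discarded before fitting the readout

def run_esn(spectral_radius, input_scaling, leak_rate, ridge):
    """Build a random fully connected reservoir, train the linear readout
    by ridge regression, and return the validation NRMSE."""
    W_in = rng.uniform(-1, 1, N) * input_scaling
    W = rng.uniform(-0.5, 0.5, (N, N))
    W *= spectral_radius / max(abs(np.linalg.eigvals(W)))

    def collect_states(u):
        x = np.zeros(N)
        states = []
        for u_t in u:
            pre = np.tanh(W_in * u_t + W @ x)
            x = (1 - leak_rate) * x + leak_rate * pre  # leaky integration
            states.append(x.copy())
        return np.array(states)

    X, Y = collect_states(u_train)[washout:], y_train[washout:]
    W_out = np.linalg.solve(X.T @ X + ridge * np.eye(N), X.T @ Y)  # ridge readout

    X_v = collect_states(u_valid)[washout:]
    err = X_v @ W_out - y_valid[washout:]
    return np.sqrt(np.mean(err ** 2)) / np.std(y_valid[washout:])

def objective(z):
    # Search space: log10(spectral radius), log10(input scaling),
    # leak rate (clipped to (0, 1]), log10(ridge coefficient).
    # Note: each call samples a fresh random reservoir, so the objective is noisy;
    # averaging over a few reservoirs per candidate would reduce this noise.
    return run_esn(10 ** z[0], 10 ** z[1], float(np.clip(z[2], 0.05, 1.0)), 10 ** z[3])

# Start near a common default configuration and let CMA-ES adapt.
x0 = [0.0, 0.0, 0.5, -4.0]  # spectral radius 1, input scaling 1, leak 0.5, ridge 1e-4
es = cma.CMAEvolutionStrategy(x0, 0.3, {"maxiter": 30, "verbose": -9})
while not es.stop():
    candidates = es.ask()
    es.tell(candidates, [objective(c) for c in candidates])
print("best validation NRMSE:", es.result.fbest)
print("best point in search space:", es.result.xbest)
```

In practice, each CMA-ES candidate would be evaluated on the benchmark tasks and reservoir topologies compared in the paper rather than on a synthetic sine wave, with the same budget given to every topology so that the comparison remains fair.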
Classification
Type
D - Proceedings paper
CEP field
—
OECD FORD field
10201 - Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)
Result continuities
Project
—
Continuities
S - Specific university research
Others
Year of implementation
2022
Data confidentiality code
S - Complete and true data on the project are not subject to protection under special legal regulations
Data specific for result type
Proceedings title
GECCO '22: Proceedings of the Genetic and Evolutionary Computation Conference
ISBN
978-1-4503-9237-2
ISSN
—
e-ISSN
—
Number of pages
9
Pages from-to
404-412
Publisher name
Association for Computing Machinery
Place of publication
New York
Event location
Boston
Event date
Jul 9, 2022
Event type by nationality
WRD - Worldwide event
UT WoS code of the article
000847380200048