Refined Max-Pooling and Unpooling Layers for Deep Convolutional Neural Networks
Result identifiers
Result code in IS VaVaI
RIV/00216275:25530/16:39902263 (https://www.isvavai.cz/riv?ss=detail&h=RIV%2F00216275%3A25530%2F16%3A39902263)
Result on the web
—
DOI - Digital Object Identifier
—
Alternative languages
Result language
English
Title in original language
Refined Max-Pooling and Unpooling Layers for Deep Convolutional Neural Networks
Description in original language
The main goal of this paper is to introduce new pooling and unpooling layers suited for deep convolutional neural networks. To this end, a new approximation of max-pooling inversion has been designed; the idea behind this approximation is also presented in the paper and demonstrated on pools of size 2 x 2 with a stride of 2. The widely used technique of switches is combined with interpolation to form the new approximation. For that purpose, an unconventional expression of the switches has been used. Such an expression allows the correct placement of the maxima in a reconstruction of the original data, as well as interpolation of all unknown values in the reconstruction from the known maxima. The introduced inversion has been implemented in the refined pooling and unpooling layers. Since the layers are intended for deep convolutional networks, their behavior in both the feed-forward and backpropagation passes had to be defined. In this context, the introduced concept of switches has been further developed: feed-forward and backpropagation switches are distinguished in the refined layers. One version of the feed-forward switches and three versions of the backpropagation switches are introduced in this paper. The refined pooling and unpooling layers have been tested on a simple convolutional auto-encoder in order to verify the functionality of the concept.
Title in English
Refined Max-Pooling and Unpooling Layers for Deep Convolutional Neural Networks
Description in English
The main goal of this paper is to introduce new pooling and unpooling layers suited for deep convolutional neural networks. To this end, a new approximation of max-pooling inversion has been designed; the idea behind this approximation is also presented in the paper and demonstrated on pools of size 2 x 2 with a stride of 2. The widely used technique of switches is combined with interpolation to form the new approximation. For that purpose, an unconventional expression of the switches has been used. Such an expression allows the correct placement of the maxima in a reconstruction of the original data, as well as interpolation of all unknown values in the reconstruction from the known maxima. The introduced inversion has been implemented in the refined pooling and unpooling layers. Since the layers are intended for deep convolutional networks, their behavior in both the feed-forward and backpropagation passes had to be defined. In this context, the introduced concept of switches has been further developed: feed-forward and backpropagation switches are distinguished in the refined layers. One version of the feed-forward switches and three versions of the backpropagation switches are introduced in this paper. The refined pooling and unpooling layers have been tested on a simple convolutional auto-encoder in order to verify the functionality of the concept.
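To make the described approximation concrete, the following is a minimal NumPy sketch of 2 x 2, stride-2 max-pooling with switches and an approximate unpooling step. All function names are illustrative, and the fill of the unknown values uses a simple nearest-known-maximum rule as a stand-in for the interpolation scheme described in the paper; it is not the authors' implementation.

import numpy as np


def max_pool_2x2(x):
    # 2 x 2 max-pooling with a stride of 2; also records switch positions,
    # i.e. the coordinates of each pool maximum in the input array.
    h, w = x.shape
    pooled = np.empty((h // 2, w // 2), dtype=x.dtype)
    switches = np.empty((h // 2, w // 2, 2), dtype=np.int64)
    for i in range(h // 2):
        for j in range(w // 2):
            block = x[2 * i:2 * i + 2, 2 * j:2 * j + 2]
            di, dj = divmod(int(np.argmax(block)), 2)
            pooled[i, j] = block[di, dj]
            switches[i, j] = (2 * i + di, 2 * j + dj)
    return pooled, switches


def unpool_2x2(pooled, switches, shape):
    # Approximate inversion: place the maxima at the switch positions and
    # fill every unknown entry from the nearest known maximum (a stand-in
    # for the interpolation described in the paper).
    out = np.full(shape, np.nan)
    out[switches[..., 0].ravel(), switches[..., 1].ravel()] = pooled.ravel()
    kr, kc = np.nonzero(~np.isnan(out))
    for r in range(shape[0]):
        for c in range(shape[1]):
            if np.isnan(out[r, c]):
                nearest = np.argmin((kr - r) ** 2 + (kc - c) ** 2)
                out[r, c] = out[kr[nearest], kc[nearest]]
    return out


x = np.arange(16, dtype=float).reshape(4, 4)
pooled, switches = max_pool_2x2(x)
print(unpool_2x2(pooled, switches, x.shape))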
Classification
Type
D - Article in proceedings
CEP field
BD - Information theory
OECD FORD field
—
Result linkages
Project
—
Linkages
I - Institutional support for the long-term conceptual development of a research organisation
Others
Year of application
2016
Data confidentiality code
S - Complete and true data on the project are not subject to protection under special legal regulations
Data specific to the result type
Proceedings title
Mendel 2016 : 22nd International Conference on Soft Computing
ISBN
978-80-214-5365-4
ISSN
1803-3814
e-ISSN
—
Number of pages
12
Pages from-to
131-142
Publisher name
Vysoké učení technické v Brně
Place of publication
Brno
Event venue
Brno
Event date
8. 6. 2016
Event type by nationality
WRD - Worldwide event
Article UT WoS code
—