Denoise pre-training for segmentation neural networks
The result's identifiers
Result code in IS VaVaI
RIV/00216305:26220/19:PU132911 (https://www.isvavai.cz/riv?ss=detail&h=RIV%2F00216305%3A26220%2F19%3APU132911)
Result on the web
—
DOI - Digital Object Identifier
—
Alternative languages
Result language
English
Original language name
Denoise pre-training for segmentation neural networks
Original language description
This paper proposes a method for pre-training segmentation neural networks on small datasets using unlabelled training data with added noise. The pre-training step provides the network with better initial weights and also augments the training data when only a small labelled dataset is available, as is common in medical imaging. An experiment comparing pre-trained and non-pre-trained networks on an MRI brain segmentation task showed that denoise pre-training leads to faster training convergence without overfitting and to better results in all compared metrics, even on very small datasets.
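The record gives no implementation details, so the following is only a minimal sketch of the denoise pre-training idea as described above, assuming PyTorch, Gaussian noise, a toy encoder-decoder backbone, and hypothetical names (EncoderDecoder, denoise_pretrain, unlabelled_loader) that do not come from the paper. Stage 1 trains the network to reconstruct clean images from noisy copies of unlabelled data; stage 2 reuses those weights to initialise the segmentation model.

```python
# Hypothetical sketch of denoise pre-training (not the authors' code).
import torch
import torch.nn as nn

class EncoderDecoder(nn.Module):
    """Tiny encoder-decoder backbone shared by both stages (illustrative only)."""
    def __init__(self, out_channels):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.Conv2d(32, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, out_channels, 3, padding=1),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

def denoise_pretrain(model, unlabelled_loader, epochs=10, noise_std=0.1, lr=1e-3):
    """Stage 1: train the model to remove Gaussian noise from unlabelled images."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        for clean in unlabelled_loader:                      # clean: (B, 1, H, W)
            noisy = clean + noise_std * torch.randn_like(clean)
            opt.zero_grad()
            loss = loss_fn(model(noisy), clean)              # reconstruct the clean image
            loss.backward()
            opt.step()
    return model

# Stage 2 (sketch): transfer the pre-trained encoder weights to the segmentation net.
pretrain_net = EncoderDecoder(out_channels=1)
# denoise_pretrain(pretrain_net, unlabelled_loader)          # unlabelled MRI slices
seg_net = EncoderDecoder(out_channels=2)                     # e.g. background / brain
seg_net.encoder.load_state_dict(pretrain_net.encoder.state_dict())
# ...then fine-tune seg_net on the small labelled dataset with a segmentation loss.
```

The noise type, architecture, and hyperparameters here are assumptions; the record only states that unlabelled data with added noise is used for pre-training before supervised segmentation training.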
Czech name
—
Czech description
—
Classification
Type
D - Article in proceedings
CEP classification
—
OECD FORD branch
10201 - Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)
Result continuities
Project
—
Continuities
S - Specific research at universities
Others
Publication year
2019
Confidentiality
S - Complete and true data about the project are not subject to protection under special legal regulations
Data specific for result type
Article name in the collection
Proceedings of the 25th Conference STUDENT EEICT 2019
ISBN
978-80-214-5735-5
ISSN
—
e-ISSN
—
Number of pages
5
Pages from-to
739-744
Publisher name
Vysoké učení technické v Brně, Fakulta elektrotechniky a komunikačních technologií
Place of publication
Brno
Event location
Brno
Event date
Apr 25, 2019
Type of event by nationality
CST - National event
UT code for WoS article
—