Optimized High Resolution 3D Dense-U-Net Network for Brain and Spine Segmentation
Result identifiers
Result code in IS VaVaI
RIV/00216305:26220/19:PU130875 (https://www.isvavai.cz/riv?ss=detail&h=RIV%2F00216305%3A26220%2F19%3APU130875)
Result on the web
https://www.mdpi.com/2076-3417/9/3/404
DOI - Digital Object Identifier
10.3390/app9030404 (http://dx.doi.org/10.3390/app9030404)
Alternative languages
Result language
English
Title in original language
Optimized High Resolution 3D Dense-U-Net Network for Brain and Spine Segmentation
Description in original language
3D image segmentation is the process of partitioning a digital 3D volume into multiple segments. This paper presents a fully automatic method for high-resolution 3D volumetric segmentation of medical image data using a modern supervised deep learning approach. We introduce the 3D Dense-U-Net neural network architecture, which implements densely connected layers. It has been optimized for graphics processing unit (GPU) accelerated high-resolution image processing on currently available hardware (Nvidia GTX 1080 Ti). The method has been evaluated on an MRI brain 3D volumetric dataset and a CT thoracic scan dataset for spine segmentation. In contrast with many previous methods, our approach can precisely segment the input image data at the original resolution, without any pre-processing of the input image. It processes image data in 3D and achieved an accuracy of 99.72% on the MRI brain dataset, outperforming the results achieved by a human expert. On the lumbar and thoracic vertebrae CT dataset it achieved an accuracy of 99.80%. The architecture proposed in this paper can also be easily applied to any task that already uses a U-Net network as a segmentation algorithm, to enhance its results. The complete source code was released online under an open-source license.
Title in English
Optimized High Resolution 3D Dense-U-Net Network for Brain and Spine Segmentation
Description in English
3D image segmentation is the process of partitioning a digital 3D volume into multiple segments. This paper presents a fully automatic method for high-resolution 3D volumetric segmentation of medical image data using a modern supervised deep learning approach. We introduce the 3D Dense-U-Net neural network architecture, which implements densely connected layers. It has been optimized for graphics processing unit (GPU) accelerated high-resolution image processing on currently available hardware (Nvidia GTX 1080 Ti). The method has been evaluated on an MRI brain 3D volumetric dataset and a CT thoracic scan dataset for spine segmentation. In contrast with many previous methods, our approach can precisely segment the input image data at the original resolution, without any pre-processing of the input image. It processes image data in 3D and achieved an accuracy of 99.72% on the MRI brain dataset, outperforming the results achieved by a human expert. On the lumbar and thoracic vertebrae CT dataset it achieved an accuracy of 99.80%. The architecture proposed in this paper can also be easily applied to any task that already uses a U-Net network as a segmentation algorithm, to enhance its results. The complete source code was released online under an open-source license.
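The abstract describes a 3D Dense-U-Net: a U-Net style 3D encoder-decoder in which the convolutions inside each block are densely connected (each layer receives the concatenation of all preceding feature maps) and encoder features are passed to the decoder through the usual U-Net skip connections. The sketch below is a minimal Keras/TensorFlow illustration of that idea only; the layer counts, filter sizes, growth rate and 64x64x64 input shape are assumptions and do not reproduce the authors' released implementation.

# Minimal sketch of a densely connected 3D U-Net in Keras/TensorFlow.
# Illustration only, not the authors' released code: layer counts, filter
# sizes, growth rate and the 64^3 input shape are assumptions.
from tensorflow.keras import layers, Model

def dense_block_3d(x, growth_rate=16, n_layers=2):
    # Each 3D convolution sees the concatenation of all previous feature maps.
    features = [x]
    for _ in range(n_layers):
        inp = features[0] if len(features) == 1 else layers.Concatenate()(features)
        out = layers.Conv3D(growth_rate, 3, padding="same", activation="relu")(inp)
        features.append(out)
    return layers.Concatenate()(features)

def dense_unet_3d(input_shape=(64, 64, 64, 1)):
    inp = layers.Input(input_shape)
    # Encoder: dense blocks followed by downsampling.
    e1 = dense_block_3d(inp)
    e2 = dense_block_3d(layers.MaxPooling3D(2)(e1))
    # Bottleneck.
    b = dense_block_3d(layers.MaxPooling3D(2)(e2))
    # Decoder: upsample, concatenate the matching encoder features
    # (U-Net skip connections), then apply another dense block.
    u2 = layers.Conv3DTranspose(32, 2, strides=2, padding="same")(b)
    d2 = dense_block_3d(layers.Concatenate()([u2, e2]))
    u1 = layers.Conv3DTranspose(16, 2, strides=2, padding="same")(d2)
    d1 = dense_block_3d(layers.Concatenate()([u1, e1]))
    # Voxel-wise sigmoid output for a binary segmentation mask.
    out = layers.Conv3D(1, 1, activation="sigmoid")(d1)
    return Model(inp, out)

model = dense_unet_3d()
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

Trained voxel-wise on volumes such as the MRI brain and CT spine data mentioned above, a network of this shape outputs a per-voxel probability map that is thresholded to obtain the final segmentation mask; the dense concatenations are what distinguish it from a plain 3D U-Net.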
Classification
Type
Jimp - Article in a periodical in the Web of Science database
CEP field
—
OECD FORD field
20601 - Medical engineering
Result continuities
Project
The result was created during the implementation of multiple projects. More information is available in the Projects tab.
Continuities
P - Research and development project financed from public funds (with a link to CEP)
S - Specific research at universities
Others
Year of implementation
2019
Data confidentiality code
S - Complete and truthful data about the project are not subject to protection under special legal regulations
Data specific to the result type
Journal title
Applied Sciences - Basel
ISSN
2076-3417
e-ISSN
—
Journal volume
9
Issue number within the volume
3
Publisher's country
CH - Swiss Confederation
Number of pages
17
Pages from-to
1-17
UT WoS code of the article
000459976200044
Result EID in the Scopus database
2-s2.0-85060607520