
Multipatch-GLCM for Texture Feature Extraction on Classification of the Colon Histopathology Images using Deep Neural Network with GPU Acceleration

The result's identifiers

  • Result code in IS VaVaI

    <a href="https://www.isvavai.cz/riv?ss=detail&h=RIV%2F00216275%3A25530%2F20%3A39917094" target="_blank" >RIV/00216275:25530/20:39917094 - isvavai.cz</a>

  • Result on the web

    <a href="https://thescipub.com/pdf/jcssp.2020.280.294.pdf" target="_blank" >https://thescipub.com/pdf/jcssp.2020.280.294.pdf</a>

  • DOI - Digital Object Identifier

    <a href="http://dx.doi.org/10.3844/jcssp.2020.280.294" target="_blank" >10.3844/jcssp.2020.280.294</a>

Alternative languages

  • Result language

    English

  • Original language name

    Multipatch-GLCM for Texture Feature Extraction on Classification of the Colon Histopathology Images using Deep Neural Network with GPU Acceleration

  • Original language description

    Cancer is one of the leading causes of death in the world, which makes research in this field challenging not only for pathologists but also for computer scientists. Hematoxylin and Eosin (H&E) images are the most common modality used by pathologists for cancer detection. Cancer status in histopathology images can be classified based on the shape, morphology, intensity, and texture of the image. Extracting all information from full high-resolution histopathology images takes a long time due to the huge amount of data. This study proposes advanced texture extraction by a multi-patch image pixel method with sliding windows that minimizes the loss of information in each pixel patch. We use Gray Level Co-occurrence Matrix (GLCM) texture features with a mean-shift filter for image pre-processing. The mean-shift filter is a low-pass filtering technique that takes the surrounding pixels of the image into account. The proposed GLCM method is then trained using a Deep Neural Network (DNN) and compared to other classification techniques for benchmarking. For training, we use two GPUs: an NVIDIA GTX-980 and a Tesla K40c. According to the study, the Deep Neural Network outperforms the other classifiers with the highest accuracy, 96.72±0.48 (mean ± standard deviation) over four cross-validation folds. In addition, training with the Theano framework is faster than with TensorFlow on both the GTX-980 and the Tesla K40c.

  • Czech name

  • Czech description
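
The abstract above combines multi-patch sliding windows with GLCM texture features. As a rough illustration only (this is not the paper's code; the patch size, stride, quantization levels, pixel offset, and the particular feature set are all assumptions for the sketch), a minimal GLCM-over-patches pipeline might look like:

```python
import numpy as np

def glcm(patch, levels=8, dx=1, dy=0):
    """Normalized, symmetric Gray Level Co-occurrence Matrix for one pixel offset.

    The 0-255 grayscale patch is quantized to `levels` gray levels, then
    co-occurrences of gray-level pairs at offset (dx, dy) are counted.
    """
    q = (patch.astype(np.int64) * levels) // 256  # quantize to [0, levels-1]
    m = np.zeros((levels, levels))
    h, w = q.shape
    for y in range(h - dy):
        for x in range(w - dx):
            m[q[y, x], q[y + dy, x + dx]] += 1
    m += m.T                # make the matrix symmetric
    return m / m.sum()      # normalize to a joint probability

def glcm_features(m):
    """Classic Haralick-style statistics computed from a normalized GLCM."""
    i, j = np.indices(m.shape)
    return {
        "contrast":    float((m * (i - j) ** 2).sum()),
        "energy":      float((m ** 2).sum()),
        "homogeneity": float((m / (1.0 + np.abs(i - j))).sum()),
    }

def multipatch_features(img, patch=32, stride=16):
    """Slide a window over the image; one GLCM feature dict per patch.

    An overlapping stride (stride < patch) is one way to reduce the loss of
    information at patch borders that the abstract mentions.
    """
    feats = []
    for y in range(0, img.shape[0] - patch + 1, stride):
        for x in range(0, img.shape[1] - patch + 1, stride):
            feats.append(glcm_features(glcm(img[y:y + patch, x:x + patch])))
    return feats

# Example: a synthetic 64x64 grayscale "image" stands in for a histopathology tile.
rng = np.random.default_rng(0)
img = rng.integers(0, 256, (64, 64))
features = multipatch_features(img)   # 9 overlapping 32x32 patches
```

In a full pipeline along the lines the abstract describes, the per-patch feature vectors (typically over several offsets and angles) would be concatenated and fed to the DNN classifier.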

Classification

  • Type

    J<sub>SC</sub> - Article in a specialist periodical, which is included in the SCOPUS database

  • CEP classification

  • OECD FORD branch

    20202 - Communication engineering and systems

Result continuities

  • Project

    <a href="/en/project/EF17_049%2F0008394" target="_blank" >EF17_049/0008394: Cooperation in Applied Research between the University of Pardubice and companies, in the Field of Positioning, Detection and Simulation Technology for Transport Systems (PosiTrans)</a>

  • Continuities

    P - Research and development project financed from public sources (with a link to CEP)

    I - Institutional support for the long-term conceptual development of a research organisation

Others

  • Publication year

    2020

  • Confidentiality

    S - Complete and true project data are not subject to protection under special legal regulations

Data specific for result type

  • Name of the periodical

    Journal of Computer Science

  • ISSN

    1549-3636

  • e-ISSN

  • Volume of the periodical

    Volume 16

  • Issue of the periodical within the volume

    No. 3

  • Country of publishing house

    AE - UNITED ARAB EMIRATES

  • Number of pages

    15

  • Pages from-to

    280-294

  • UT code for WoS article

  • EID of the result in the Scopus database

    2-s2.0-85086875982