
Image recognition based on deep learning in Haemonchus contortus motility assays

The result's identifiers

  • Result code in IS VaVaI

    <a href="https://www.isvavai.cz/riv?ss=detail&h=RIV%2F00216208%3A11160%2F22%3A10450813" target="_blank" >RIV/00216208:11160/22:10450813 - isvavai.cz</a>

  • Result on the web

    <a href="https://verso.is.cuni.cz/pub/verso.fpl?fname=obd_publikace_handle&handle=kDJ8L4IKjP" target="_blank" >https://verso.is.cuni.cz/pub/verso.fpl?fname=obd_publikace_handle&handle=kDJ8L4IKjP</a>

  • DOI - Digital Object Identifier

    <a href="http://dx.doi.org/10.1016/j.csbj.2022.05.014" target="_blank" >10.1016/j.csbj.2022.05.014</a>

Alternative languages

  • Result language

    English

  • Original language name

    Image recognition based on deep learning in Haemonchus contortus motility assays

  • Original language description

    Poor efficacy of some anthelmintics and rising concerns about widespread drug resistance have highlighted the need for new drug discovery. The parasitic nematode Haemonchus contortus is an important model organism widely used for studies of drug resistance and drug screening, with the current gold standard being the motility assay. We applied a deep learning approach, Mask R-CNN, for analysing motility videos containing varying rates of motile worms and compared it to other commonly used algorithms with different levels of complexity, namely the Wiggle Index and the Wide Field-of-View Nematode Tracking Platform. Mask R-CNN consistently outperformed the other algorithms in terms of the detection of worms as well as the precision of motility forecasts, having a mean absolute percentage error of 7.6% and a mean absolute error of 5.6% for the detection and motility forecasts, respectively. Using Mask R-CNN for motility assays confirmed the common problem with algorithms that use non-maximum suppression in detecting overlapping objects, which negatively impacts the overall precision. The use of intersect over union as a measure of the classification of motile/non-motile instances had an overall accuracy of 89%, indicating that it is a viable alternative to previously used methods based on movement characteristics, such as body bends. In comparison to the existing methods evaluated here, Mask R-CNN performed better and we anticipate that this method will broaden the number of possible approaches to video analysis of worm motility.

  • Czech name

  • Czech description
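
The description above classifies worms as motile or non-motile using intersection over union (IoU) between detections in successive frames: a worm whose mask barely overlaps its earlier position has moved. A minimal sketch of that idea, assuming binary segmentation masks from a detector such as Mask R-CNN; the function names and the threshold value are illustrative, not taken from the paper:

```python
import numpy as np

def iou(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
    """Intersection over union of two boolean masks of equal shape."""
    intersection = np.logical_and(mask_a, mask_b).sum()
    union = np.logical_or(mask_a, mask_b).sum()
    return float(intersection) / float(union) if union else 0.0

def is_motile(mask_t0: np.ndarray, mask_t1: np.ndarray,
              threshold: float = 0.5) -> bool:
    """Call a worm motile when its masks at two time points overlap
    less than the (hypothetical) IoU threshold."""
    return iou(mask_t0, mask_t1) < threshold
```

A stationary worm yields an IoU near 1 between frames, while a moving worm's mask shifts and the IoU drops below the threshold.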

Classification

  • Type

    J<sub>imp</sub> - Article in a specialist periodical, which is included in the Web of Science database

  • CEP classification

  • OECD FORD branch

    30104 - Pharmacology and pharmacy

Result continuities

  • Project

    <a href="/en/project/EF16_019%2F0000841" target="_blank" >EF16_019/0000841: Efficiency and safety improvement of current drugs and nutraceuticals: advanced methods - new challenges</a>

  • Continuities

    P - R&D project financed from public funds (with a link to CEP)

    S - Specific research at universities

    I - Institutional support for the long-term conceptual development of a research organization

Others

  • Publication year

    2022

  • Confidentiality

    S - Complete and true data about the project are not subject to protection under special legal regulations

Data specific for result type

  • Name of the periodical

    Computational and Structural Biotechnology Journal

  • ISSN

    2001-0370

  • e-ISSN

  • Volume of the periodical

    20

  • Issue of the periodical within the volume

    May

  • Country of publishing house

    SE - SWEDEN

  • Number of pages

    9

  • Pages from-to

    2372-2380

  • UT code for WoS article

    000805642000003

  • EID of the result in the Scopus database