
MuTr: Multi-Stage Transformer for Hand Pose Estimation from Full-Scene Depth Image

The result's identifiers

  • Result code in IS VaVaI

    <a href="https://www.isvavai.cz/riv?ss=detail&h=RIV%2F49777513%3A23520%2F23%3A43969579" target="_blank" >RIV/49777513:23520/23:43969579 - isvavai.cz</a>

  • Result on the web

    <a href="https://www.mdpi.com/1424-8220/23/12/5509" target="_blank" >https://www.mdpi.com/1424-8220/23/12/5509</a>

  • DOI - Digital Object Identifier

    <a href="http://dx.doi.org/10.3390/s23125509" target="_blank" >10.3390/s23125509</a>

Alternative languages

  • Result language

    English

  • Original language name

    MuTr: Multi-Stage Transformer for Hand Pose Estimation from Full-Scene Depth Image

  • Original language description

    This work presents a novel transformer-based method for hand pose estimation—DePOTR. We test DePOTR on four benchmark datasets, where it outperforms other transformer-based methods while achieving results on par with other state-of-the-art methods. To further demonstrate the strength of DePOTR, we propose a novel multi-stage approach operating on the full-scene depth image—MuTr. MuTr removes the need for two different models in the hand pose estimation pipeline—one for hand localization and one for pose estimation—while maintaining promising results. To the best of our knowledge, this is the first successful attempt to use the same model architecture in both the standard and the full-scene image setup while achieving competitive results in both. On the NYU dataset, DePOTR and MuTr reach a precision of 7.85 mm and 8.71 mm, respectively.

  • Czech name

  • Czech description
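The abstract contrasts the conventional two-model pipeline (a hand localizer followed by a pose estimator on the crop) with MuTr's single-model, multi-stage approach on the full-scene depth image. The sketch below is purely illustrative and is not the authors' code: all function names, shapes, and the fixed bounding box are invented placeholders that only mirror the data flow described in the abstract.

```python
# Illustrative sketch (NOT the DePOTR/MuTr implementation) of the two
# pipeline shapes the abstract compares. All numbers are dummy values.

def localize_hand(depth_image):
    # Conventional stage 1: a dedicated model predicts a hand bounding box.
    # Here: a fixed dummy box (x0, y0, x1, y1).
    return (16, 16, 48, 48)

def estimate_pose(depth_crop):
    # Conventional stage 2: a second model regresses joint positions
    # from the cropped hand region. Dummy output: 21 joints at the origin.
    return [(0.0, 0.0, 0.0)] * 21

def two_model_pipeline(depth_image):
    # Localize, crop, then estimate pose — two separate models.
    x0, y0, x1, y1 = localize_hand(depth_image)
    crop = [row[x0:x1] for row in depth_image[y0:y1]]
    return estimate_pose(crop)

def single_model_pipeline(depth_image, stages=3):
    # MuTr-style idea per the abstract: one model consumes the full-scene
    # depth image and refines its joint estimate over multiple stages,
    # so no separate localizer is needed. The refinement loop is a
    # trivial placeholder here.
    joints = [(0.0, 0.0, 0.0)] * 21
    for _ in range(stages):
        joints = [(x, y, z) for (x, y, z) in joints]  # placeholder refinement
    return joints

depth = [[0.0] * 64 for _ in range(64)]
assert len(two_model_pipeline(depth)) == 21
assert len(single_model_pipeline(depth)) == 21
```

The point of the comparison is structural: the second pipeline has a single entry point taking the full scene, which is the property the abstract highlights for MuTr.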

Classification

  • Type

    J<sub>imp</sub> - Article in a specialist periodical included in the Web of Science database

  • CEP classification

  • OECD FORD branch

    20205 - Automation and control systems

Result continuities

  • Project

    <a href="/en/project/EF15_003%2F0000466" target="_blank" >EF15_003/0000466: Artificial Intelligence and Reasoning</a><br>

  • Continuities

    P - Research and development project financed from public sources (with a link to CEP)<br>S - Specific research at universities

Others

  • Publication year

    2023

  • Confidentiality

    S - Complete and true data on the project are not subject to protection under special legal regulations

Data specific for result type

  • Name of the periodical

    SENSORS

  • ISSN

    1424-8220

  • e-ISSN

    1424-8220

  • Volume of the periodical

    23

  • Issue of the periodical within the volume

    12

  • Country of publishing house

    CH - SWITZERLAND

  • Number of pages

    16

  • Pages from-to

  • UT code for WoS article

    001017823100001

  • EID of the result in the Scopus database

    2-s2.0-85163928315