Project MultiLeap: Making Multiple Hand Tracking Sensors to Act Like One
Result identifiers
Result code in IS VaVaI
<a href="https://www.isvavai.cz/riv?ss=detail&h=RIV%2F68407700%3A21240%2F21%3A00353691" target="_blank" >RIV/68407700:21240/21:00353691 - isvavai.cz</a>
Result on the web
<a href="https://ieeexplore.ieee.org/abstract/document/9644262" target="_blank" >https://ieeexplore.ieee.org/abstract/document/9644262</a>
DOI - Digital Object Identifier
<a href="http://dx.doi.org/10.1109/AIVR52153.2021.00021" target="_blank" >10.1109/AIVR52153.2021.00021</a>
Alternative languages
Result language
English
Title in original language
Project MultiLeap: Making Multiple Hand Tracking Sensors to Act Like One
Result description in original language
We present a concept that provides hand tracking for virtual and extended reality using only optical sensors, without requiring the user to hold any physical controller. In this article, we propose five new algorithms that further extend our previous research and the capabilities of the hand tracking system while also making it more precise. The first algorithm addresses the need to calibrate the tracking system; thanks to the new approach, we improved tracking precision by 37% over our previous solution. The second algorithm computes the precision of the hand tracking data when multiple sensors are used. The third algorithm further improves the computation of hand tracking confidence by correctly handling edge cases, for example, when the tracked hand is at the edge of a sensor's field of view. The fourth algorithm provides a new way to fuse the hand tracking data by using only the data with the highest confidence. The fifth algorithm deals with cases where the optical sensor misclassifies the hand's chirality.
Title in English
Project MultiLeap: Making Multiple Hand Tracking Sensors to Act Like One
Result description in English
We present a concept that provides hand tracking for virtual and extended reality using only optical sensors, without requiring the user to hold any physical controller. In this article, we propose five new algorithms that further extend our previous research and the capabilities of the hand tracking system while also making it more precise. The first algorithm addresses the need to calibrate the tracking system; thanks to the new approach, we improved tracking precision by 37% over our previous solution. The second algorithm computes the precision of the hand tracking data when multiple sensors are used. The third algorithm further improves the computation of hand tracking confidence by correctly handling edge cases, for example, when the tracked hand is at the edge of a sensor's field of view. The fourth algorithm provides a new way to fuse the hand tracking data by using only the data with the highest confidence. The fifth algorithm deals with cases where the optical sensor misclassifies the hand's chirality.
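The fusion rule summarized above for the fourth algorithm (keep only the hand tracking data with the highest confidence) can be illustrated with a minimal sketch. The HandFrame structure, the fuse_by_highest_confidence helper, the sensor names, and the confidence values below are hypothetical illustrations under that assumption, not the paper's actual implementation.

from dataclasses import dataclass
from typing import List, Optional

@dataclass
class HandFrame:
    """One hand-tracking sample for a single hand, as reported by one sensor."""
    sensor_id: str
    joints: List[float]   # flattened joint positions (x, y, z per joint) in a shared world frame
    confidence: float     # tracking confidence in [0, 1], e.g. lowered near the sensor's FOV edge

def fuse_by_highest_confidence(frames: List[HandFrame]) -> Optional[HandFrame]:
    """Return the sample from the sensor currently reporting the highest confidence,
    or None if no sensor sees the hand."""
    if not frames:
        return None
    return max(frames, key=lambda frame: frame.confidence)

# Example: two sensors see the same hand; the top-mounted one is trusted more.
frames = [
    HandFrame("leap_front", [0.10, 0.25, 0.40], confidence=0.62),
    HandFrame("leap_top",   [0.11, 0.24, 0.41], confidence=0.91),
]
best = fuse_by_highest_confidence(frames)
print(best.sensor_id)  # -> "leap_top"

In this winner-takes-all variant the output always comes from a single sensor per frame, which avoids blending samples of very different quality; how the per-sensor confidence itself is computed is the subject of the paper's second and third algorithms.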
Classification
Type
D - Article in conference proceedings
CEP field
—
OECD FORD field
10201 - Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)
Result continuities
Project
—
Continuities
S - Specific university research
Others
Year of implementation
2021
Data confidentiality code
S - Complete and true data on the project are not subject to protection under special legal regulations
Data specific to the result type
Proceedings name
Proceedings of 2021 IEEE International Conference on Artificial Intelligence and Virtual Reality (AIVR)
ISBN
978-1-6654-3225-2
ISSN
—
e-ISSN
—
Number of pages
7
Pages from-to
77-83
Publisher name
IEEE
Place of publication
Beijing
Event venue
Taichung
Event date
15. 11. 2021
Event type by nationality
WRD - Worldwide event
UT WoS code of the article
—