Towards Fast Fiducial Marker with full 6 DOF Pose Estimation
Result identifiers
Result code in IS VaVaI
<a href="https://www.isvavai.cz/riv?ss=detail&h=RIV%2F68407700%3A21230%2F22%3A00361194" target="_blank" >RIV/68407700:21230/22:00361194 - isvavai.cz</a>
Result on the web
<a href="https://dl.acm.org/doi/pdf/10.1145/3477314.3507043" target="_blank" >https://dl.acm.org/doi/pdf/10.1145/3477314.3507043</a>
DOI - Digital Object Identifier
<a href="http://dx.doi.org/10.1145/3477314.3507043" target="_blank" >10.1145/3477314.3507043</a>
Alternative languages
Result language
English
Title in original language
Towards Fast Fiducial Marker with full 6 DOF Pose Estimation
Description in original language
This paper proposes a new method for the full 6 degrees of freedom pose estimation of a circular fiducial marker. This circular black-and-white planar marker provides unique and versatile identification of individual markers while maintaining real-time detection. Such a marker, and the vision-based localisation system built on it, is suitable for both external and self-localisation. Together with an off-the-shelf camera, the marker aims to provide sufficient pose estimation accuracy to substitute current high-end localisation systems. To assess the performance of our proposed marker system, we evaluate its capabilities against current state-of-the-art methods in terms of their ability to estimate 2D and 3D positions. For this purpose, a real-world dataset, inspired by typical applications in mobile and swarm robotics, was collected, as performance under real conditions provides better insight into the method's potential than an artificially simulated environment. The experiments performed show that the method presented here achieved three times the accuracy of the marker it was derived from.
Title in English
Towards Fast Fiducial Marker with full 6 DOF Pose Estimation
Description in English
This paper proposes a new method for the full 6 degrees of freedom pose estimation of a circular fiducial marker. This circular black-and-white planar marker provides unique and versatile identification of individual markers while maintaining real-time detection. Such a marker, and the vision-based localisation system built on it, is suitable for both external and self-localisation. Together with an off-the-shelf camera, the marker aims to provide sufficient pose estimation accuracy to substitute current high-end localisation systems. To assess the performance of our proposed marker system, we evaluate its capabilities against current state-of-the-art methods in terms of their ability to estimate 2D and 3D positions. For this purpose, a real-world dataset, inspired by typical applications in mobile and swarm robotics, was collected, as performance under real conditions provides better insight into the method's potential than an artificially simulated environment. The experiments performed show that the method presented here achieved three times the accuracy of the marker it was derived from.
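The abstract describes recovering a full 6 DOF pose from a detected planar circular marker. As an illustration only (this is a generic textbook technique, not the paper's actual algorithm, and all names below are hypothetical), one common approach for a planar marker lying on the Z=0 plane is to fit a homography from the marker plane to the image and decompose it with the known camera intrinsics:

```python
import numpy as np

# Hypothetical sketch: pose of a planar circular marker (Z=0 plane)
# via DLT homography estimation and homography decomposition.

def homography_dlt(world_xy, img_xy):
    """Estimate the homography H mapping plane points (X, Y, 1) to image points."""
    A = []
    for (X, Y), (u, v) in zip(world_xy, img_xy):
        A.append([-X, -Y, -1, 0, 0, 0, u * X, u * Y, u])
        A.append([0, 0, 0, -X, -Y, -1, v * X, v * Y, v])
    _, _, Vt = np.linalg.svd(np.asarray(A))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

def pose_from_homography(H, K):
    """Decompose H = s * K [r1 r2 t] for a Z=0 plane; return rotation R and translation t."""
    M = np.linalg.inv(K) @ H
    s = 1.0 / np.linalg.norm(M[:, 0])
    r1, r2, t = s * M[:, 0], s * M[:, 1], s * M[:, 2]
    R = np.column_stack([r1, r2, np.cross(r1, r2)])
    # Project R onto the nearest rotation matrix (SVD orthonormalisation).
    U, _, Vt = np.linalg.svd(R)
    return U @ Vt, t

# Synthetic check: sample points on a circular marker of radius 0.1 m,
# project them with a known camera pose, and recover that pose.
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
angles = np.linspace(0, 2 * np.pi, 12, endpoint=False)
world_xy = np.column_stack([0.1 * np.cos(angles), 0.1 * np.sin(angles)])

theta = np.deg2rad(15)
R_true = np.array([[np.cos(theta), 0, np.sin(theta)],
                   [0, 1, 0],
                   [-np.sin(theta), 0, np.cos(theta)]])
t_true = np.array([0.05, -0.02, 1.5])

pts3d = np.column_stack([world_xy, np.zeros(len(world_xy))])
cam = (R_true @ pts3d.T).T + t_true
proj = (K @ cam.T).T
img_xy = proj[:, :2] / proj[:, 2:]

H = homography_dlt(world_xy, img_xy)
R_est, t_est = pose_from_homography(H, K)
print(np.allclose(t_est, t_true, atol=1e-5))
```

With noise-free synthetic points the decomposition recovers the pose essentially exactly; in practice the marker contour would first be extracted from the image, and the sign and orthogonality of the recovered rotation must be handled carefully.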
Classification
Type
D - Article in proceedings
CEP field
—
OECD FORD field
10201 - Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)
Result continuities
Project
<a href="/cs/project/GC20-27034J" target="_blank" >GC20-27034J: Extending spatial models with an explicit representation of time for long-term autonomy of mobile robots</a><br>
Continuities
P - Research and development project financed from public funds (with a link to CEP)
Others
Year of publication
2022
Data confidentiality code
S - Complete and true data on the project are not subject to protection under special legal regulations
Data specific to the result type
Article name in the proceedings
Proceedings of the 37th ACM/SIGAPP Symposium on Applied Computing
ISBN
978-1-4503-8713-2
ISSN
—
e-ISSN
—
Number of pages
8
Pages from-to
723-730
Publisher name
ACM
Place of publication
New York
Event location
Virtual
Event date
25. 4. 2022
Event type by nationality
WRD - Worldwide event
UT WoS article code
—