Improved Indirect Virtual Objects Selection Methods for Cluttered Augmented Reality Environments on Mobile Devices
Result identifiers
Result code in IS VaVaI
<a href="https://www.isvavai.cz/riv?ss=detail&h=RIV%2F00216305%3A26230%2F22%3APU143690" target="_blank" >RIV/00216305:26230/22:PU143690 - isvavai.cz</a>
Result on the web
<a href="https://www.fit.vut.cz/research/publication/12654/" target="_blank" >https://www.fit.vut.cz/research/publication/12654/</a>
DOI - Digital Object Identifier
<a href="http://dx.doi.org/10.1109/HRI53351.2022.9889374" target="_blank" >10.1109/HRI53351.2022.9889374</a>
Alternative languages
Result language
English
Title in the original language
Improved Indirect Virtual Objects Selection Methods for Cluttered Augmented Reality Environments on Mobile Devices
Description in the original language
The problem of selecting virtual objects in augmented reality on handheld devices has been tackled multiple times. However, evaluations have typically been carried out on purely synthetic tasks with uniformly placed homogeneous objects, often located on a plane and with little or no occlusion. This paper presents two novel approaches to indirect object selection that deal with highly occluded objects with large spatial distribution variability and heterogeneous size and appearance. The methods are designed to enable long-term usage with a tablet-like device. One method is based on a spatially anchored hierarchy menu, and the other uses a crosshair and a side menu that shows candidate objects ranked according to a custom-developed metric. The proposed approaches are compared with direct touch in the context of spatial visual programming of collaborative robots, in a realistic workplace and on a common robotic task. The preliminary evaluation indicates that the main benefit of the proposed indirect methods could be their higher precision and higher selection confidence for the user.
Title in English
Improved Indirect Virtual Objects Selection Methods for Cluttered Augmented Reality Environments on Mobile Devices
Description in English
The problem of selecting virtual objects in augmented reality on handheld devices has been tackled multiple times. However, evaluations have typically been carried out on purely synthetic tasks with uniformly placed homogeneous objects, often located on a plane and with little or no occlusion. This paper presents two novel approaches to indirect object selection that deal with highly occluded objects with large spatial distribution variability and heterogeneous size and appearance. The methods are designed to enable long-term usage with a tablet-like device. One method is based on a spatially anchored hierarchy menu, and the other uses a crosshair and a side menu that shows candidate objects ranked according to a custom-developed metric. The proposed approaches are compared with direct touch in the context of spatial visual programming of collaborative robots, in a realistic workplace and on a common robotic task. The preliminary evaluation indicates that the main benefit of the proposed indirect methods could be their higher precision and higher selection confidence for the user.
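To illustrate the crosshair-and-side-menu approach described above, the following sketch ranks candidate objects for display in the side menu. The paper's custom metric is not given in this record, so the weighting of screen-space distance from the crosshair, camera depth, and occlusion below is purely an assumption, and all names (Candidate, selection_score, rank_candidates) are hypothetical.

from dataclasses import dataclass

@dataclass
class Candidate:
    name: str
    screen_dist_px: float    # distance of the object's projection from the crosshair, in pixels
    depth_m: float           # distance from the camera, in metres
    visible_fraction: float  # 0.0-1.0, how much of the object is unoccluded

def selection_score(c: Candidate,
                    w_screen: float = 1.0,
                    w_depth: float = 0.3,
                    w_occlusion: float = 0.5) -> float:
    # Lower is better: prefer objects near the crosshair, close to the camera,
    # and not hidden behind other objects (hypothetical weighting).
    return (w_screen * c.screen_dist_px
            + w_depth * c.depth_m
            + w_occlusion * (1.0 - c.visible_fraction) * 100.0)

def rank_candidates(candidates, top_n=5):
    # Return the top-N candidates to show in the side menu, best first.
    return sorted(candidates, key=selection_score)[:top_n]

if __name__ == "__main__":
    objects = [
        Candidate("box_a", screen_dist_px=12.0, depth_m=0.8, visible_fraction=0.9),
        Candidate("box_b", screen_dist_px=5.0, depth_m=1.5, visible_fraction=0.2),
        Candidate("gripper", screen_dist_px=40.0, depth_m=0.6, visible_fraction=1.0),
    ]
    for c in rank_candidates(objects):
        print(c.name, round(selection_score(c), 1))

A real implementation would compute these quantities from the AR session's camera pose and object bounding volumes; the sketch only shows how a single scalar score can order heterogeneous, partially occluded candidates.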
Classification
Type
D - Article in proceedings
CEP field
—
OECD FORD field
10201 - Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspects to be 5.8)
Result continuities
Project
<a href="/cs/project/FV40052" target="_blank" >FV40052: Test-it-off: robotic offline product testing</a><br>
Continuities
P - Research and development project financed from public sources (with a reference to CEP)
Others
Year of implementation
2022
Data confidentiality code
S - Complete and true data on the project are not subject to protection under special legal regulations
Data specific to the result type
Proceedings title
HRI '22: Proceedings of the 2022 ACM/IEEE International Conference on Human-Robot Interaction
ISBN
978-1-6654-0731-1
ISSN
—
e-ISSN
—
Number of pages
5
Pages from-to
834-838
Publisher name
Association for Computing Machinery
Place of publication
Sapporo, Hokkaido
Event location
Sapporo
Event date
7. 3. 2022
Event type by nationality
WRD - Worldwide event
Article UT WoS code
000869793600112