A multimodal smartwatch-based interaction concept for immersive environments
Result identifiers
Result code in IS VaVaI
<a href="https://www.isvavai.cz/riv?ss=detail&h=RIV%2F00216224%3A14330%2F23%3A00132189" target="_blank" >RIV/00216224:14330/23:00132189 - isvavai.cz</a>
Result on the web
<a href="https://www.sciencedirect.com/science/article/pii/S0097849323002479" target="_blank" >https://www.sciencedirect.com/science/article/pii/S0097849323002479</a>
DOI - Digital Object Identifier
<a href="http://dx.doi.org/10.1016/j.cag.2023.10.010" target="_blank" >10.1016/j.cag.2023.10.010</a>
Alternative languages
Result language
English
Title in the original language
A multimodal smartwatch-based interaction concept for immersive environments
Description in the original language
Augmented and Virtual Reality (AR/VR) environments require user interaction concepts beyond the traditional mouse-and-keyboard setup of seated desktop computer usage. Although advanced input modalities such as hand or gaze tracking have been developed, they have yet to be widely adopted in available hardware. Modern smartwatches have been shown to provide a powerful and intuitive means of input, thereby overcoming the limitations of current AR/VR headsets. They typically offer a set of interesting input modalities, such as a touchscreen, rotary buttons, and an Inertial Measurement Unit (IMU), which can be used for mid-air gesture recognition. Compared to other input devices, they have the benefit of being hands-free as soon as the user stops interacting, since they are attached to the wrist. Although many concepts have been proposed, comparative evaluations of their effectiveness and user-friendliness are still rare. In this paper, we evaluate the usability of two commonly found approaches for using a smartwatch as an interaction device, specifically in immersive environments provided by AR/VR HMDs: using the physical inputs of the watch (touchscreen, rotary buttons) or mid-air gestures. We conducted a user study with 20 participants, in which they tested both interaction methods, and we compared the methods in terms of usability and performance. Based on a prototypical AR application, we evaluated the performance and user experience of these two smartwatch-based interaction concepts. We found that input using the touchscreen and buttons was generally favored by the participants and led to shorter task completion times.
Title in English
A multimodal smartwatch-based interaction concept for immersive environments
Description in English
Augmented and Virtual Reality (AR/VR) environments require user interaction concepts beyond the traditional mouse-and-keyboard setup of seated desktop computer usage. Although advanced input modalities such as hand or gaze tracking have been developed, they have yet to be widely adopted in available hardware. Modern smartwatches have been shown to provide a powerful and intuitive means of input, thereby overcoming the limitations of current AR/VR headsets. They typically offer a set of interesting input modalities, such as a touchscreen, rotary buttons, and an Inertial Measurement Unit (IMU), which can be used for mid-air gesture recognition. Compared to other input devices, they have the benefit of being hands-free as soon as the user stops interacting, since they are attached to the wrist. Although many concepts have been proposed, comparative evaluations of their effectiveness and user-friendliness are still rare. In this paper, we evaluate the usability of two commonly found approaches for using a smartwatch as an interaction device, specifically in immersive environments provided by AR/VR HMDs: using the physical inputs of the watch (touchscreen, rotary buttons) or mid-air gestures. We conducted a user study with 20 participants, in which they tested both interaction methods, and we compared the methods in terms of usability and performance. Based on a prototypical AR application, we evaluated the performance and user experience of these two smartwatch-based interaction concepts. We found that input using the touchscreen and buttons was generally favored by the participants and led to shorter task completion times.
Classification
Type
J<sub>imp</sub> - Article in a periodical indexed in the Web of Science database
CEP field
—
OECD FORD field
10200 - Computer and information sciences
Result linkages
Project
—
Linkages
S - Specific research at universities
Other
Year of publication
2023
Data confidentiality code
S - Complete and true data on the project are not subject to protection under special legal regulations
Data specific to the result type
Journal name
Computers & Graphics
ISSN
0097-8493
e-ISSN
—
Journal volume
117
Issue within the volume
December
Country of the journal publisher
GB - United Kingdom of Great Britain and Northern Ireland
Number of pages
11
Pages from-to
85-95
UT WoS code of the article
001107076300001
EID of the result in the Scopus database
2-s2.0-85174924394