Options for Automatic Identification of User Activities in Usability Testing
Result identifiers
Result code in IS VaVaI
RIV/60460709:41110/19:80687 - isvavai.cz (https://www.isvavai.cz/riv?ss=detail&h=RIV%2F60460709%3A41110%2F19%3A80687)
Result on the web
—
DOI - Digital Object Identifier
—
Alternative languages
Result language
Czech
Title in original language
Options for Automatic Identification of User Activities in Usability Testing
Description in original language
When testing the usability of applications, it is often necessary to analyze the behavior of users in terms of identifying their activities. The activity may be that the user is working on the assignment without problems, is searching for something, is completely lost in the user interface, is filling in a form, is studying the manual, etc. Identifying the activities is usually done by tagging a video and audio record of the testing, optionally together with a visualization of eye movements (eye tracking). This is very time-consuming work for usability experts. When testing in a specialized laboratory, we can obtain data from various measurements. Besides the audio-visual record, data from eye tracking, click tracking and keyboard tracking can be analyzed. Moreover, we can include biometric data such as pulse, skin temperature, humidity, hand movements, etc. The research question of the paper is whether it is possible to analyze all the data and develop methods and algorithms to automatically identify the user activities.
Title in English
Options for Automatic Identification of User Activities in Usability Testing
Description in English
When testing the usability of applications, it is often necessary to analyze the behavior of users in terms of identifying their activities. The activity may be that the user is working on the assignment without problems, is searching for something, is completely lost in the user interface, is filling in a form, is studying the manual, etc. Identifying the activities is usually done by tagging a video and audio record of the testing, optionally together with a visualization of eye movements (eye tracking). This is very time-consuming work for usability experts. When testing in a specialized laboratory, we can obtain data from various measurements. Besides the audio-visual record, data from eye tracking, click tracking and keyboard tracking can be analyzed. Moreover, we can include biometric data such as pulse, skin temperature, humidity, hand movements, etc. The research question of the paper is whether it is possible to analyze all the data and develop methods and algorithms to automatically identify the user activities.
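The abstract only poses the research question; as a purely illustrative sketch that is not taken from the paper, the snippet below shows one way the multi-modal measurements it lists (gaze, clicks, keystrokes, pulse, skin temperature) could be aggregated into per-window feature vectors and fed to an off-the-shelf classifier to label user activities. The feature set, window granularity, activity labels, and the choice of a random forest are all assumptions made for this example, and the data is synthetic.

```python
# Minimal sketch: classifying usability-testing windows into user activities.
# All feature names, labels, and the classifier are illustrative assumptions;
# the data below is random, so the classifier will only perform at chance.

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

RNG = np.random.default_rng(0)

# Hypothetical activity labels inspired by the abstract.
ACTIVITIES = ["working", "searching", "lost", "filling_form", "reading_manual"]

def make_synthetic_session(n_windows: int = 500):
    """Generate fake per-window features: gaze dispersion, fixation rate,
    click rate, keystroke rate, pulse, skin temperature."""
    X = RNG.normal(size=(n_windows, 6))
    y = RNG.integers(0, len(ACTIVITIES), size=n_windows)
    return X, y

X, y = make_synthetic_session()
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0
)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test), target_names=ACTIVITIES))
```

In a real setting, the random features would be replaced by statistics computed from the laboratory recordings (eye-tracker, click/keyboard logs, biometric sensors) over fixed time windows, and the labels would come from the expert tagging that the paper aims to automate.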
Classification
Type
D - Conference proceedings paper
CEP field
—
OECD FORD field
10201 - Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)
Result continuities
Project
—
Continuities
S - Specific research at universities
Others
Year of publication
2019
Data confidentiality code
S - Complete and true data on the project are not subject to protection under special legal regulations
Data specific to the result type
Article name in the proceedings
EFITA-HAICTA-WCCA CONGRESS
ISBN
978-618-84798-0-7
ISSN
—
e-ISSN
—
Number of pages
251
Pages from-to
0-251
Publisher name
Not specified
Place of publication
Rhodes
Event location
Rhodes, Greece
Event date
27. 1. 2020
Event type by nationality
WRD - Worldwide event
UT WoS article code
—