Options for Automatic Identification of User Activities in Usability Testing
The result's identifiers
Result code in IS VaVaI
<a href="https://www.isvavai.cz/riv?ss=detail&h=RIV%2F60460709%3A41110%2F19%3A80687" target="_blank" >RIV/60460709:41110/19:80687 - isvavai.cz</a>
Result on the web
—
DOI - Digital Object Identifier
—
Alternative languages
Result language
Czech
Original language name
Options for Automatic Identification of User Activities in Usability Testing
Original language description
When testing the usability of applications, it is often necessary to analyze user behavior in terms of identifying activities. An activity may be that the user is working on the assignment without problems, is searching for something, is completely lost in the user interface, is filling in a form, is studying the manual, etc. Identifying these activities is usually done by tagging a video and audio recording of the test session, optionally together with a visualization of eye movements (eye tracking). This is very time-consuming work for usability experts. When testing in a specialized laboratory, we can obtain data from various measurements. Besides the audio-visual recording, data from eye tracking, click tracking and keyboard tracking can be analyzed. Moreover, we can include biometric data such as pulse, skin temperature, humidity, hand movements, etc. The research question of the paper is whether it is possible to analyze all these data and develop methods and algorithms to automatically identify the user activities.
Czech name
Options for Automatic Identification of User Activities in Usability Testing
Czech description
When testing the usability of applications, it is often necessary to analyze user behavior in terms of identifying activities. An activity may be that the user is working on the assignment without problems, is searching for something, is completely lost in the user interface, is filling in a form, is studying the manual, etc. Identifying these activities is usually done by tagging a video and audio recording of the test session, optionally together with a visualization of eye movements (eye tracking). This is very time-consuming work for usability experts. When testing in a specialized laboratory, we can obtain data from various measurements. Besides the audio-visual recording, data from eye tracking, click tracking and keyboard tracking can be analyzed. Moreover, we can include biometric data such as pulse, skin temperature, humidity, hand movements, etc. The research question of the paper is whether it is possible to analyze all these data and develop methods and algorithms to automatically identify the user activities.
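The approach outlined in the abstract — segmenting the recorded multimodal signals into time windows and assigning each window a coarse activity label — can be sketched as follows. This is a minimal illustration under assumed feature names and thresholds (gaze dispersion, click rate, typing rate), not the authors' actual method:

```python
# Hypothetical sketch: classify fixed-length time windows of usability-test
# signals into coarse user activities. Feature names and thresholds are
# illustrative assumptions, not taken from the paper.

from dataclasses import dataclass


@dataclass
class Window:
    gaze_dispersion: float  # spread of eye fixations, normalized to 0..1
    click_rate: float       # mouse clicks per second
    typing_rate: float      # keystrokes per second


def classify(w: Window) -> str:
    """Rule-based labelling of one window (illustrative thresholds)."""
    if w.typing_rate > 1.0:
        return "filling a form"
    if w.gaze_dispersion > 0.7 and w.click_rate < 0.2:
        return "lost / searching"
    if w.gaze_dispersion < 0.3 and w.click_rate < 0.2 and w.typing_rate == 0:
        return "studying the manual"
    return "working on the task"


windows = [
    Window(gaze_dispersion=0.1, click_rate=0.5, typing_rate=0.0),
    Window(gaze_dispersion=0.9, click_rate=0.0, typing_rate=0.0),
    Window(gaze_dispersion=0.2, click_rate=0.1, typing_rate=2.5),
]
print([classify(w) for w in windows])
# → ['working on the task', 'lost / searching', 'filling a form']
```

In practice, the hand-tuned rules above would be replaced by a classifier trained on expert-tagged sessions, with biometric channels (pulse, skin temperature, etc.) added as further features per window.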
Classification
Type
D - Article in proceedings
CEP classification
—
OECD FORD branch
10201 - Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)
Result continuities
Project
—
Continuities
S - Specific research at universities
Others
Publication year
2019
Confidentiality
S - Complete and true data on the project are not subject to protection under special legal regulations
Data specific for result type
Article name in the collection
EFITA-HAICTA-WCCA CONGRESS
ISBN
978-618-84798-0-7
ISSN
—
e-ISSN
—
Number of pages
251
Pages from-to
0-251
Publisher name
Not specified
Place of publication
Rhodes
Event location
Rhodes, Greece
Event date
Jan 27, 2020
Type of event by nationality
WRD - Worldwide event
UT code for WoS article
—