Triple Parallel LSTM Networks for Classifying the Gait Disorders Using Kinect Camera and Robot Platform During the Clinical Examination
Result identifiers
Result code in IS VaVaI
<a href="https://www.isvavai.cz/riv?ss=detail&h=RIV%2F00216275%3A25530%2F23%3A39920921" target="_blank" >RIV/00216275:25530/23:39920921 - isvavai.cz</a>
Result on the web
<a href="http://dx.doi.org/10.1109/ICECCME57830.2023.10252459" target="_blank" >http://dx.doi.org/10.1109/ICECCME57830.2023.10252459</a>
DOI - Digital Object Identifier
<a href="http://dx.doi.org/10.1109/ICECCME57830.2023.10252459" target="_blank" >10.1109/ICECCME57830.2023.10252459</a>
Alternative languages
Result language
English
Title in original language
Triple Parallel LSTM Networks for Classifying the Gait Disorders Using Kinect Camera and Robot Platform During the Clinical Examination
Result description in original language
This paper presents a new methodology for processing and classifying data on gait disorders observed with a Kinect camera. The study of gait and motion stability in gait disorders is one of the most interesting research areas in the field. Both the patient and the physician need to monitor the progress of rehabilitation before and after surgery to obtain an objective view of the process. In this study, the patient is scanned with a Kinect camera mounted on a mobile robotic platform. For feature extraction and analysis, frames from three walking exercises are collected and saved in data folders. The study uses 84 measurements of 37 patients, with detailed observations based on the physician's opinion in a clinical setting, to address the classification problem. Motion data play an essential role in the analysis of gait disorders and also narrow the selection of body features useful for their assessment. The proposed system uses a key-point detector that computes body landmarks and classifies gait disorders with triple-parallel long short-term memory (LSTM) networks. The study demonstrates the method's success in classification when combined with a state-of-the-art pose estimation method: around 81 percent accuracy was achieved for the given sets of individuals using velocity-based, angle-based, and position-based features.
Title in English
Triple Parallel LSTM Networks for Classifying the Gait Disorders Using Kinect Camera and Robot Platform During the Clinical Examination
Result description in English
This paper presents a new methodology for processing and classifying data on gait disorders observed with a Kinect camera. The study of gait and motion stability in gait disorders is one of the most interesting research areas in the field. Both the patient and the physician need to monitor the progress of rehabilitation before and after surgery to obtain an objective view of the process. In this study, the patient is scanned with a Kinect camera mounted on a mobile robotic platform. For feature extraction and analysis, frames from three walking exercises are collected and saved in data folders. The study uses 84 measurements of 37 patients, with detailed observations based on the physician's opinion in a clinical setting, to address the classification problem. Motion data play an essential role in the analysis of gait disorders and also narrow the selection of body features useful for their assessment. The proposed system uses a key-point detector that computes body landmarks and classifies gait disorders with triple-parallel long short-term memory (LSTM) networks. The study demonstrates the method's success in classification when combined with a state-of-the-art pose estimation method: around 81 percent accuracy was achieved for the given sets of individuals using velocity-based, angle-based, and position-based features.
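As a rough illustration of the architecture described above (not the authors' implementation), the sketch below wires up a triple-parallel LSTM classifier in Keras/TensorFlow: three independent LSTM branches, one per feature stream (position-, velocity-, and angle-based), merged into a single softmax classifier. The sequence length, feature dimensions, layer widths, and number of gait-disorder classes are illustrative assumptions, as the record does not specify them.

# Minimal sketch of a triple-parallel LSTM gait classifier (assumed shapes).
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, Model

SEQ_LEN = 100    # assumed number of frames per walking exercise
N_CLASSES = 3    # assumed number of gait-disorder classes

def lstm_branch(n_features, name):
    # One parallel branch: an LSTM reading a single feature stream.
    inp = layers.Input(shape=(SEQ_LEN, n_features), name=f"{name}_input")
    out = layers.LSTM(64, name=f"{name}_lstm")(inp)
    return inp, out

# Three parallel branches for position-, velocity-, and angle-based features.
pos_in, pos_out = lstm_branch(75, "position")   # e.g. 25 body landmarks x 3D
vel_in, vel_out = lstm_branch(75, "velocity")
ang_in, ang_out = lstm_branch(12, "angle")      # selected joint angles

# Fuse the three streams and classify the gait disorder.
merged = layers.Concatenate()([pos_out, vel_out, ang_out])
merged = layers.Dense(64, activation="relu")(merged)
output = layers.Dense(N_CLASSES, activation="softmax")(merged)

model = Model(inputs=[pos_in, vel_in, ang_in], outputs=output)
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])

# Smoke test on random data with the assumed shapes.
x = [np.random.rand(4, SEQ_LEN, d).astype("float32") for d in (75, 75, 12)]
y = tf.keras.utils.to_categorical(np.random.randint(0, N_CLASSES, size=4), N_CLASSES)
model.fit(x, y, epochs=1, verbose=0)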
Classification
Type
D - Conference proceedings paper
CEP field
—
OECD FORD field
20204 - Robotics and automatic control
Result linkages
Project
—
Linkages
I - Institutional support for the long-term conceptual development of a research organisation
Others
Year of implementation
2023
Data confidentiality code
S - Complete and truthful data on the project are not subject to protection under special legal regulations
Data specific to the result type
Proceedings title
International Conference on Electrical, Computer, Communications and Mechatronics Engineering, ICECCME 2023 : proceedings
ISBN
979-8-3503-2298-9
ISSN
—
e-ISSN
—
Number of pages of the result
6
Pages from-to
1-6
Publisher name
IEEE (Institute of Electrical and Electronics Engineers)
Place of publication
New York
Event venue
Tenerife
Event date
19. 7. 2023
Event type by nationality
EUR - European event
UT WoS code of the article
—