Bimodal HAR-An efficient approach to human activity analysis and recognition using bimodal hybrid classifiers
The result's identifiers
Result code in IS VaVaI
<a href="https://www.isvavai.cz/riv?ss=detail&h=RIV%2F62690094%3A18470%2F23%3A50020313" target="_blank" >RIV/62690094:18470/23:50020313 - isvavai.cz</a>
Result on the web
<a href="https://www.sciencedirect.com/science/article/pii/S0020025523001342" target="_blank" >https://www.sciencedirect.com/science/article/pii/S0020025523001342</a>
DOI - Digital Object Identifier
<a href="http://dx.doi.org/10.1016/j.ins.2023.01.121" target="_blank" >10.1016/j.ins.2023.01.121</a>
Alternative languages
Result language
English
Original language name
Bimodal HAR-An efficient approach to human activity analysis and recognition using bimodal hybrid classifiers
Original language description
Human activity recognition (HAR) is an emerging field that identifies human actions in different settings. Activities are recognized through sensors placed in the room or residence where human actions are to be observed. Real-world applications and automation employ activity recognition to detect anomalous behavior. For example, patients who walk despite being advised to rest in bed, or elderly people at risk of falling, must be monitored carefully in hospitals as well as in home-based monitoring systems. Applications include security, healthcare, human interaction, and computer vision. The activity is monitored through sensors and cameras. There is no general, explicit approach for inferring human activities from sensor data, and sensor data and heuristics present technological challenges. Building a reliable activity recognition system requires evaluating several factors, including storage, connectivity, processing, energy efficiency, and system adaptability. Deep learning systems are better able to recognize human activities from existing datasets. In this study, a hybrid classifier combining a one-dimensional Convolutional Neural Network (CNN) with Long Short-Term Memory (LSTM) is employed to improve the performance of HAR. It offers a method for automatically and data-adaptively extracting reliable features from raw data. This model proposes a two-way classification for abstract and individual activity monitoring. Human activities such as walking, sitting, walking downstairs, walking upstairs, lying, and standing, along with mobile phone usage, are considered in this study. We also compare against state-of-the-art algorithms such as Support Vector Machine (SVM), K-Nearest Neighbour (KNN), Long Short-Term Memory (LSTM), and Convolutional Neural Network (CNN). The UCI-HAR dataset is used for recognizing human activity in the proposed work. Features such as mean, median, and autoregressive coefficients are derived from the raw data and processed with principal component analysis to make them more reliable. The LSTM model accepts a series of activities, whereas the CNN accepts a single input: the CNN processes each single input window, and each of its outputs is forwarded to the LSTM model, which classifies the activity. The hybrid model achieves 97.89% accuracy with the new feature selection methods, whereas the CNN and LSTM individually produce 92.77% and 92.80% accuracy, respectively.
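The abstract describes the CNN-to-LSTM pipeline only at a high level. As a minimal illustrative sketch (not the authors' released code), the following Keras snippet shows one plausible way to wire a 1D CNN whose per-window outputs are forwarded to an LSTM classifier over the six UCI-HAR activity classes; all layer sizes, the sequence length, and variable names are assumptions introduced here for illustration.

```python
# Illustrative sketch only -- not the authors' implementation.
# Assumes UCI-HAR-style input: windows of 128 time steps x 9 inertial
# channels, grouped into short sequences so the LSTM sees a series of
# CNN-encoded windows rather than a single input.
import tensorflow as tf
from tensorflow.keras import layers, models

N_STEPS, N_CHANNELS, N_CLASSES = 128, 9, 6  # UCI-HAR window shape and classes
SEQ_LEN = 4                                 # assumed length of the window series

# 1D CNN that encodes a single window (the "single input" branch).
cnn = models.Sequential([
    layers.Conv1D(64, kernel_size=3, activation="relu",
                  input_shape=(N_STEPS, N_CHANNELS)),
    layers.Conv1D(64, kernel_size=3, activation="relu"),
    layers.MaxPooling1D(pool_size=2),
    layers.Flatten(),
])

# Apply the CNN to every window in the sequence via TimeDistributed,
# then let the LSTM classify the resulting series of feature vectors.
inputs = layers.Input(shape=(SEQ_LEN, N_STEPS, N_CHANNELS))
x = layers.TimeDistributed(cnn)(inputs)
x = layers.LSTM(100)(x)
x = layers.Dropout(0.5)(x)
outputs = layers.Dense(N_CLASSES, activation="softmax")(x)

model = models.Model(inputs, outputs)
model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```

In this sketch the TimeDistributed wrapper plays the role of forwarding each CNN output to the LSTM, which matches the abstract's description of the hybrid model; the handcrafted features (mean, median, autoregressive coefficients) and PCA step described in the paper are not reproduced here.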
Czech name
—
Czech description
—
Classification
Type
J<sub>imp</sub> - Article in a specialist periodical, which is included in the Web of Science database
CEP classification
—
OECD FORD branch
10201 - Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)
Result continuities
Project
—
Continuities
I - Institutional support for the long-term conceptual development of a research organisation
Others
Publication year
2023
Confidentiality
S - Complete and true data on the project are not subject to protection under special legal regulations
Data specific for result type
Name of the periodical
Information Sciences
ISSN
0020-0255
e-ISSN
1872-6291
Volume of the periodical
628
Issue of the periodical within the volume
May
Country of publishing house
US - UNITED STATES
Number of pages
16
Pages from-to
542-557
UT code for WoS article
000942549000001
EID of the result in the Scopus database
2-s2.0-85147927455