Human Localization in Robotized Warehouses based on Stereo Odometry and Ground-Marker Fusion
The result's identifiers
Result code in IS VaVaI
RIV/68407700:21730/22:00351935 (https://www.isvavai.cz/riv?ss=detail&h=RIV%2F68407700%3A21730%2F22%3A00351935)
Result on the web
https://doi.org/10.1016/j.rcim.2021.102241
DOI - Digital Object Identifier
10.1016/j.rcim.2021.102241
Alternative languages
Result language
English
Original language name
Human Localization in Robotized Warehouses based on Stereo Odometry and Ground-Marker Fusion
Original language description
Modern logistic solutions for large warehouses consist of a fleet of robots that transfer goods, move racks, and perform other physically demanding and repetitive tasks. The shopfloor is usually enclosed with a safety fence, and if a human needs to enter the warehouse, all the robots are stopped rather than only those in the immediate vicinity of the human, which significantly limits warehouse efficiency. To tackle this challenge, an integrated safety system is needed, with human localization as one of its essential components. In this paper, we propose a novel human localization method for robotized warehouses that is based on a suite of wearable visual sensors installed on a vest worn by humans. The proposed method does not require any modifications of the warehouse environment and relies on the already existing infrastructure. Specifically, we estimate the human location by fusing stereo visual-inertial odometry data and distances to the known absolute poses of the detected ground-markers, which the robots use for their own localization. Fusion is performed by building a pose graph, where we treat estimated human poses relative to markers as graph nodes and odometry estimates as graph edges. We conducted extensive laboratory and warehouse facility experiments, where we tested the reliability and accuracy of the proposed method and compared its performance to a state-of-the-art visual SLAM solution, namely ORB-SLAM2. The results indicate that our method can track absolute position in real-time and has competitive accuracy with respect to ORB-SLAM2, while ensuring higher localization reliability when faced with structural changes in the environment. Furthermore, we make the experimental datasets publicly available to the research community.
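To make the pose-graph fusion idea concrete, below is a minimal, hypothetical sketch in Python (NumPy and SciPy). It is not the authors' implementation: the 2D simplification, the weights, and all names (`residuals`, `w_odom`, `w_marker`, the toy data) are illustrative assumptions. Human poses are the graph nodes, consecutive odometry estimates form relative edges, and detected ground-markers with known absolute poses contribute absolute position constraints; the graph is solved by nonlinear least squares.

```python
# Hypothetical sketch: 2D pose-graph fusion of odometry edges and
# ground-marker constraints, solved with nonlinear least squares.
import numpy as np
from scipy.optimize import least_squares

def wrap(a):
    """Normalize an angle to (-pi, pi]."""
    return (a + np.pi) % (2.0 * np.pi) - np.pi

def residuals(flat_poses, odom_edges, marker_obs, w_odom=1.0, w_marker=5.0):
    poses = flat_poses.reshape(-1, 3)          # each row: (x, y, yaw)
    res = []
    # Odometry edges: (i, j, dx, dy, dyaw) measured in the frame of pose i.
    for i, j, dx, dy, dyaw in odom_edges:
        xi, yi, ti = poses[i]
        xj, yj, tj = poses[j]
        c, s = np.cos(ti), np.sin(ti)
        # Relative motion predicted by the current pose estimates.
        pdx = c * (xj - xi) + s * (yj - yi)
        pdy = -s * (xj - xi) + c * (yj - yi)
        res += [w_odom * (pdx - dx), w_odom * (pdy - dy),
                w_odom * wrap(tj - ti - dyaw)]
    # Marker observations: (i, mx, my) = pose index and the absolute
    # position implied by a detected ground-marker with known pose.
    for i, mx, my in marker_obs:
        res += [w_marker * (poses[i, 0] - mx), w_marker * (poses[i, 1] - my)]
    return np.array(res)

# Toy example: three poses, two odometry edges, marker fixes at both ends.
initial = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [2.0, 0.1, 0.0]])
odom = [(0, 1, 1.0, 0.0, 0.0), (1, 2, 1.0, 0.0, 0.0)]
markers = [(0, 0.0, 0.0), (2, 2.0, 0.0)]
sol = least_squares(residuals, initial.ravel(), args=(odom, markers))
print(sol.x.reshape(-1, 3))                    # optimized human trajectory
```

The higher marker weight reflects the intuition from the description above: marker detections anchor the drift-prone odometry chain to the warehouse's absolute frame; the specific weighting scheme here is only an assumption.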
Czech name
—
Czech description
—
Classification
Type
J_imp - Article in a specialist periodical, which is included in the Web of Science database
CEP classification
—
OECD FORD branch
10201 - Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)
Result continuities
Project
—
Continuities
R - EC Framework Programme project
Others
Publication year
2022
Confidentiality
S - Complete and true data on the project are not subject to protection under special legal regulations
Data specific for result type
Name of the periodical
Robotics and Computer-Integrated Manufacturing
ISSN
0736-5845
e-ISSN
1879-2537
Volume of the periodical
73
Issue of the periodical within the volume
February
Country of publishing house
IE - IRELAND
Number of pages
14
Pages from-to
—
UT code for WoS article
000704359300005
EID of the result in the Scopus database
2-s2.0-85114131314