Navigation Without Localisation: Reliable Teach and Repeat Based on the Convergence Theorem
Result identifiers
Result code in IS VaVaI
<a href="https://www.isvavai.cz/riv?ss=detail&h=RIV%2F68407700%3A21230%2F18%3A00328597" target="_blank" >RIV/68407700:21230/18:00328597 - isvavai.cz</a>
Result on the web
<a href="https://ieeexplore.ieee.org/document/8593803" target="_blank" >https://ieeexplore.ieee.org/document/8593803</a>
DOI - Digital Object Identifier
<a href="http://dx.doi.org/10.1109/IROS.2018.8593803" target="_blank" >10.1109/IROS.2018.8593803</a>
Alternative languages
Result language
English
Title in the original language
Navigation Without Localisation: Reliable Teach and Repeat Based on the Convergence Theorem
Description in the original language
We present a novel concept for teach-and-repeat visual navigation. The proposed concept is based on a mathematical model, which indicates that in teach-and-repeat navigation scenarios, mobile robots do not need to perform explicit localisation. Rather than that, a mobile robot which repeats a previously taught path can simply “replay” the learned velocities, while using its camera information only to correct its heading relative to the intended path. To support our claim, we establish a position error model of a robot, which traverses a taught path by only correcting its heading. Then, we outline a mathematical proof which shows that this position error does not diverge over time. Based on the insights from the model, we present a simple monocular teach-and-repeat navigation method. The method is computationally efficient, it does not require camera calibration, and it can learn and autonomously traverse arbitrarily-shaped paths. In a series of experiments, we demonstrate that the method can reliably guide mobile robots in realistic indoor and outdoor conditions, and can cope with imperfect odometry, landmark deficiency, illumination variations and naturally-occurring environment changes. Furthermore, we provide the navigation system and the datasets gathered at www.github.com/gestom/stroll_bearnav.
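The heading-only correction scheme described in the abstract can be sketched as follows. This is an illustrative sketch, not the authors' implementation (which is available at the GitHub URL above); the feature representation, the `gain` parameter, and the function names `heading_correction` and `repeat_step` are assumptions made for the example:

```python
import statistics

def heading_correction(map_features, view_features, gain=0.01):
    """Turn the median horizontal pixel offset between matched feature
    pairs (taught map image vs. current camera view) into an
    angular-velocity correction. Units of `gain` are rad/s per pixel."""
    offsets = [vx - mx for (mx, _), (vx, _) in zip(map_features, view_features)]
    if not offsets:
        return 0.0  # no matches: simply replay the taught velocities
    return -gain * statistics.median(offsets)

def repeat_step(taught_v, taught_w, map_features, view_features):
    """One repeat-phase step: replay the taught forward and angular
    velocities, using the camera only to correct the heading."""
    return taught_v, taught_w + heading_correction(map_features, view_features)
```

Note that the position along the path is never estimated explicitly; the paper's convergence theorem is what guarantees that steering on this bearing error alone keeps the position error bounded.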
Classification
Type
D - Article in proceedings
CEP classification
—
OECD FORD branch
10201 - Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)
Result continuities
Project
<a href="/cs/project/GJ17-27006Y" target="_blank" >GJ17-27006Y: Spatio-temporal representations for long-term navigation of mobile robots</a><br>
Continuities
P - Research and development project financed from public funds (with a link to CEP)
Others
Year of implementation
2018
Data confidentiality code
S - Complete and true data on the project are not subject to protection under special legal regulations
Data specific to the result type
Article name in the proceedings
2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)
ISBN
978-1-5386-8094-0
ISSN
2153-0858
e-ISSN
2153-0866
Number of pages of the result
8
Pages from-to
1657-1664
Publisher name
IEEE Press
Place of publication
New York
Event location
Madrid
Event date
1 October 2018
Event type by nationality
WRD - Worldwide event
UT WoS code of the article
000458872701112