Robust Visual Teach and Repeat Navigation for Unmanned Aerial Vehicles
Result identifiers
Result code in IS VaVaI
<a href="https://www.isvavai.cz/riv?ss=detail&h=RIV%2F68407700%3A21230%2F21%3A00352831" target="_blank" >RIV/68407700:21230/21:00352831 - isvavai.cz</a>
Alternative codes found
RIV/68407700:21730/21:00352831
Result on the web
<a href="https://doi.org/10.1109/ECMR50962.2021.9568807" target="_blank" >https://doi.org/10.1109/ECMR50962.2021.9568807</a>
DOI - Digital Object Identifier
<a href="http://dx.doi.org/10.1109/ECMR50962.2021.9568807" target="_blank" >10.1109/ECMR50962.2021.9568807</a>
Alternative languages
Result language
English
Title in original language
Robust Visual Teach and Repeat Navigation for Unmanned Aerial Vehicles
Description in original language
Vision-based navigation is one of the leading tasks in mobile robotics. However, it introduces additional challenges for long-term autonomy due to its reliance on stable visual features. As such, visual navigation methods are often sensitive to appearance changes and unreliable in environments with low feature density. We present a teach-and-repeat navigation system for unmanned aerial vehicles (UAVs) equipped with a low-end camera. We use a novel visual place recognition methodology based on high-level CNN features to localize the robot on a previously traversed trajectory and to directly calculate heading corrections for navigation. The navigation method is fully vision-based and independent of other sensory information, making it universal and easily transferable. The system was experimentally verified and evaluated against the state-of-the-art ORB2-TaR navigation system, showing comparable precision and robustness to environmental changes. In addition, the system safely navigated in environments with low feature density and reliably solved the wake-up robot problem.
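The teach-and-repeat idea described in the abstract can be illustrated with a minimal sketch: localize the live camera image against the descriptors of the taught trajectory, then convert the horizontal image shift between the live and taught views into a heading correction. This is an illustrative assumption-laden sketch (function names, cosine-similarity matching, and the linear pixel-to-angle mapping are all simplifications), not the authors' implementation.

```python
import numpy as np

def localize(query_desc, map_descs):
    """Return the index of the taught image whose descriptor is most
    similar to the live image's descriptor (cosine similarity)."""
    q = query_desc / np.linalg.norm(query_desc)
    m = map_descs / np.linalg.norm(map_descs, axis=1, keepdims=True)
    return int(np.argmax(m @ q))

def heading_correction(pixel_shift, image_width, horizontal_fov_deg):
    """Convert a horizontal pixel shift between the live and taught
    image into a yaw correction in degrees, assuming an approximately
    linear pixel-to-angle mapping across the field of view."""
    return pixel_shift * horizontal_fov_deg / image_width
```

For example, with a 640-pixel-wide image and a 90-degree field of view, a 32-pixel shift maps to a 4.5-degree yaw correction; in a real system the descriptors would come from a CNN and the shift from matching features between the two images.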
Classification
Type
D - Proceedings paper
CEP field
—
OECD FORD field
10201 - Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)
Result linkages
Project
<a href="/cs/project/EF15_003%2F0000470" target="_blank" >EF15_003/0000470: Robotics for Industry 4.0</a><br>
Linkages
P - R&D project financed from public funds (with a link to CEP)<br>S - Specific university research
Other
Year of implementation
2021
Data confidentiality code
S - Complete and true data on the project are not subject to protection under special legal regulations
Data specific to the result type
Name of the proceedings
Proceedings of the 10th European Conference on Mobile Robots
ISBN
978-1-6654-1213-1
ISSN
—
e-ISSN
—
Number of pages
7
Pages from-to
—
Publisher name
IEEE
Place of publication
Brussels
Event venue
Bonn
Event date
31. 8. 2021
Event type by nationality
WRD - Worldwide event
Article UT WoS code
—