Self-Localization of Unmanned Aerial Vehicles Based on Optical Flow in Inboard Camera Image
Result identifiers
Result code in IS VaVaI
RIV/68407700:21230/18:00316982 (https://www.isvavai.cz/riv?ss=detail&h=RIV%2F68407700%3A21230%2F18%3A00316982)
Result on the web
https://link.springer.com/book/10.1007%2F978-3-319-76072-8
DOI - Digital Object Identifier
<a href="http://dx.doi.org/10.1007/978-3-319-76072-8_8" target="_blank" >10.1007/978-3-319-76072-8_8</a>
Alternative languages
Result language
English
Title in original language
Self-Localization of Unmanned Aerial Vehicles Based on Optical Flow in Inboard Camera Image
Result description in original language
This paper proposes and evaluates the implementation of a self-localization system intended for use in Unmanned Aerial Vehicles. Accurate localization is necessary for UAVs for efficient stabilization, navigation and collision avoidance. Conventionally, this requirement is fulfilled using external hardware infrastructure, such as a Global Navigation Satellite System (GNSS) or a visual motion-capture system. These approaches are, however, not applicable in environments where deployment of cumbersome motion-capture equipment is not feasible, nor in GNSS-denied environments. Systems based on Simultaneous Localization and Mapping (SLAM) require heavy and expensive onboard equipment and large amounts of data transmission for sharing maps between UAVs. The availability of a system without these drawbacks is crucial for deploying tight formations of multiple fully autonomous micro UAVs in both outdoor and indoor missions. The project was inspired by the widely used PX4FLOW Smart Camera sensor. The aim was to develop a similar sensor without the drawbacks observed in its use, to make its operation more transparent, and to make it independent of specific hardware. Our proposed solution requires only a lightweight camera and a single-point range sensor. It is based on optical flow estimation from consecutive images obtained from a downward-facing camera, coupled with a specialized RANSAC-inspired post-processing method that takes flight dynamics into account. This filtering makes it more robust against imperfect lighting, homogeneous ground patches, random close objects and spurious errors. These features make the approach suitable even for coordinated flights through demanding forest-like environments. The system is designed mainly for horizontal velocity estimation, but specialized modifications were also made for estimating vertical speed and yaw rotation rate. These methods were tested in a simulator and subsequently in real-world conditions.
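As a rough illustration of the pipeline summarized above, the following minimal Python sketch shows how per-feature optical-flow vectors from a downward-facing pinhole camera, combined with an altitude reading from a single-point range sensor, can be turned into a horizontal-velocity estimate after a RANSAC-style consensus step. This is not the authors' implementation: the function names (flow_to_velocity, consensus_filter), the pinhole-camera and planar-ground assumptions, and all parameter values are illustrative, and the paper's actual filter additionally incorporates flight dynamics, which this sketch omits.

    import numpy as np

    def flow_to_velocity(flow_px, altitude_m, focal_px, dt, body_rates=None):
        # Convert an optical-flow vector (pixels per frame) from a
        # downward-facing pinhole camera into horizontal velocity (m/s),
        # assuming roughly planar ground at the range-sensor altitude.
        flow = np.asarray(flow_px, dtype=float)
        if body_rates is not None:
            # Rotation shifts the whole image by roughly focal_px * rate * dt
            # pixels; subtract it (sign conventions depend on the camera frame).
            p, q = body_rates  # roll and pitch rates [rad/s]
            flow = flow - focal_px * dt * np.array([q, -p])
        # Similar triangles: ground displacement = pixel shift * Z / f.
        return flow * altitude_m / (focal_px * dt)

    def consensus_filter(flows, inlier_tol_px=1.5, iters=50, rng=None):
        # RANSAC-style selection of the dominant translation among flow
        # vectors: pick a random candidate, count vectors agreeing within
        # inlier_tol_px, keep the largest consensus set, return its mean.
        # This rejects spurious vectors caused by close objects, homogeneous
        # ground patches, and matching errors.
        rng = np.random.default_rng() if rng is None else rng
        flows = np.asarray(flows, dtype=float)
        best = None
        for _ in range(iters):
            candidate = flows[rng.integers(len(flows))]
            dist = np.linalg.norm(flows - candidate, axis=1)
            inliers = flows[dist < inlier_tol_px]
            if best is None or len(inliers) > len(best):
                best = inliers
        return best.mean(axis=0)

    # Example on synthetic data: 70 consistent flow vectors plus 10 outliers.
    rng = np.random.default_rng(0)
    flows = np.vstack([
        np.array([4.0, -2.0]) + rng.normal(0.0, 0.3, size=(70, 2)),
        rng.uniform(-30.0, 30.0, size=(10, 2)),
    ])
    dominant = consensus_filter(flows, rng=rng)
    velocity = flow_to_velocity(dominant, altitude_m=2.0, focal_px=400.0, dt=1/30)
    print("estimated horizontal velocity [m/s]:", velocity)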
Classification
Type
D - Article in proceedings
CEP field
—
OECD FORD field
20204 - Robotics and automatic control
Result linkages
Project
GA16-24206S: Computational path planning methods for nonholonomic mobile robots in monitoring and surveillance tasks
Linkages
S - Specific research at universities
Others
Publication year
2018
Data confidentiality code
S - Complete and true data on the project are not subject to protection under special legal regulations
Data specific to the result type
Proceedings name
Modelling and Simulation for Autonomous Systems (MESAS 2017)
ISBN
978-3-319-76071-1
ISSN
0302-9743
e-ISSN
—
Number of pages
27
Pages from-to
106-132
Publisher name
Springer International Publishing AG
Place of publication
Cham
Event venue
Rome
Event date
24. 10. 2017
Event type by nationality
WRD - Worldwide event
UT WoS code of the article
000444831600008