Self-Localization of Unmanned Aerial Vehicles Based on Optical Flow in Inboard Camera Image
The result's identifiers
Result code in IS VaVaI
<a href="https://www.isvavai.cz/riv?ss=detail&h=RIV%2F68407700%3A21230%2F18%3A00316982" target="_blank" >RIV/68407700:21230/18:00316982 - isvavai.cz</a>
Result on the web
<a href="https://link.springer.com/book/10.1007%2F978-3-319-76072-8" target="_blank" >https://link.springer.com/book/10.1007%2F978-3-319-76072-8</a>
DOI - Digital Object Identifier
<a href="http://dx.doi.org/10.1007/978-3-319-76072-8_8" target="_blank" >10.1007/978-3-319-76072-8_8</a>
Alternative languages
Result language
English
Original language name
Self-Localization of Unmanned Aerial Vehicles Based on Optical Flow in Inboard Camera Image
Original language description
This paper proposes and evaluates the implementation of a self-localization system intended for use in Unmanned Aerial Vehicles (UAVs). Accurate localization is necessary for UAVs for efficient stabilization, navigation and collision avoidance. Conventionally, this requirement is fulfilled using external hardware infrastructure, such as a Global Navigation Satellite System (GNSS) or a visual motion-capture system. These approaches are, however, not applicable in environments where the deployment of cumbersome motion-capture equipment is not feasible, nor in GNSS-denied environments. Systems based on Simultaneous Localization and Mapping (SLAM) require heavy and expensive onboard equipment and large amounts of data transmission for sharing maps between UAVs. The availability of a system without these drawbacks is crucial for the deployment of tight formations of multiple fully autonomous micro-UAVs in both outdoor and indoor missions. The project was inspired by the widely used PX4FLOW Smart Camera sensor. The aim was to develop a similar sensor without the drawbacks observed in its use, to make its operation more transparent, and to make it independent of specific hardware. Our proposed solution requires only a lightweight camera and a single-point range sensor. It is based on optical flow estimation from consecutive images obtained from a downward-facing camera, coupled with a specialized RANSAC-inspired post-processing method that takes flight dynamics into account. This filtering makes it more robust against imperfect lighting, homogeneous ground patches, random close objects and spurious errors. These features make the approach suitable even for coordinated flights through demanding forest-like environments. The system is designed mainly for horizontal velocity estimation, but specialized modifications were also made for vertical speed and yaw rotation rate estimation. These methods were tested in a simulator and subsequently under real-world conditions.
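As a rough illustration of the pipeline sketched in the abstract, the snippet below shows how block-wise optical flow from a downward-facing camera can be converted to metric horizontal velocity using altitude from a single-point range sensor, and then filtered with a RANSAC-style consensus step gated by a bounded-acceleration flight-dynamics constraint. This is a minimal sketch under simplifying assumptions (flat ground, calibrated pinhole camera, level flight); all function names, parameters and thresholds are hypothetical and do not reproduce the authors' actual implementation.

```python
import numpy as np

def flow_to_velocity(flow_px, altitude_m, focal_px, dt):
    """Pinhole model over flat ground: a feature h metres below the camera
    that shifts u pixels between frames moved u * h / f metres, so the
    horizontal velocity is u * h / (f * dt)."""
    return np.asarray(flow_px) * altitude_m / (focal_px * dt)

def ransac_velocity(candidates, v_prev, dt, a_max=5.0, tol=0.1, n_iter=50, rng=None):
    """RANSAC-style consensus over per-block velocity candidates.

    Hypotheses that would require more than a_max m/s^2 of acceleration
    since the previous estimate are rejected outright (the flight-dynamics
    gate); the surviving hypothesis with the most inliers wins."""
    rng = np.random.default_rng() if rng is None else rng
    candidates = np.asarray(candidates)
    best_v, best_support = v_prev, -1   # fall back to v_prev if nothing survives
    for _ in range(n_iter):
        v_hyp = candidates[rng.integers(len(candidates))]
        if np.linalg.norm(v_hyp - v_prev) > a_max * dt:
            continue                    # dynamically infeasible velocity jump
        inliers = candidates[np.linalg.norm(candidates - v_hyp, axis=1) < tol]
        if len(inliers) > best_support:
            best_support, best_v = len(inliers), inliers.mean(axis=0)
    return best_v

# Toy usage: 60 image blocks at 50 Hz; most agree on ~0.5 m/s forward,
# a few are corrupted by a close object passing under the camera.
rng = np.random.default_rng(0)
flows_px = np.r_[rng.normal([5.0, 0.0], 0.2, (55, 2)),   # consistent ground flow
                 rng.uniform(-40.0, 40.0, (5, 2))]       # spurious close-object flow
cands = flow_to_velocity(flows_px, altitude_m=2.0, focal_px=1000.0, dt=0.02)
v = ransac_velocity(cands, v_prev=np.array([0.45, 0.0]), dt=0.02)
print(v)   # close to [0.5, 0.0] despite the outliers
```

The dynamics gate is what distinguishes this sketch from plain RANSAC: hypotheses implying a physically impossible jump in velocity are discarded before consensus is counted, which is one plausible way close objects sweeping through the image can be rejected.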
Czech name
—
Czech description
—
Classification
Type
D - Article in proceedings
CEP classification
—
OECD FORD branch
20204 - Robotics and automatic control
Result continuities
Project
<a href="/en/project/GA16-24206S" target="_blank" >GA16-24206S: Efficient Information Gathering with Dubins Vehicles in Persistent Monitoring and Surveillance Missions</a><br>
Continuities
S - Specific research at universities
Others
Publication year
2018
Confidentiality
S - Complete and true data on the project are not subject to protection under special legal regulations
Data specific for result type
Article name in the collection
Modelling and Simulation for Autonomous Systems (MESAS 2017)
ISBN
978-3-319-76071-1
ISSN
0302-9743
e-ISSN
—
Number of pages
27
Pages from-to
106-132
Publisher name
Springer International Publishing AG
Place of publication
Cham
Event location
Rome
Event date
Oct 24, 2017
Type of event by nationality
WRD - Worldwide event
UT code for WoS article
000444831600008