Deep Neural Network for Precision Landing and Variable Flight Planning of Autonomous UAV
Result identifiers
Result code in IS VaVaI
RIV/00216305:26220/21:PU146056 (https://www.isvavai.cz/riv?ss=detail&h=RIV%2F00216305%3A26220%2F21%3APU146056)
Result on the web
https://ieeexplore.ieee.org/document/9694683
DOI - Digital Object Identifier
10.1109/PIERS53385.2021.9694683 (http://dx.doi.org/10.1109/PIERS53385.2021.9694683)
Alternative languages
Result language
English
Title in the original language
Deep Neural Network for Precision Landing and Variable Flight Planning of Autonomous UAV
Description in the original language
The article focuses on the control of an autonomous unmanned aerial vehicle (UAV) for precise guidance to a ground landing target, combined with variable creation of a follow-up flight plan. Object recognition is performed in real time by a neural network using a camera mounted on the UAV. The recognition itself runs at the ground station, with which the aircraft maintains a communication channel. The ground-station computer evaluates the position of the aircraft relative to the monitored landing target within the camera's field of view and, after successful detection, sends flight instructions back to the aircraft control unit. The neural network is pre-trained on landing patterns that additionally encode flight instructions with the next waypoints of the flight plan, according to which the drone performs autonomous flight. The trained neural network thus serves not only for precision landing but also for determining the subsequent waypoints of the flight plan for the given aircraft.
Title in English
Deep Neural Network for Precision Landing and Variable Flight Planning of Autonomous UAV
Description in English
The article focuses on the control of an autonomous unmanned aerial vehicle (UAV) for precise guidance to a ground landing target, combined with variable creation of a follow-up flight plan. Object recognition is performed in real time by a neural network using a camera mounted on the UAV. The recognition itself runs at the ground station, with which the aircraft maintains a communication channel. The ground-station computer evaluates the position of the aircraft relative to the monitored landing target within the camera's field of view and, after successful detection, sends flight instructions back to the aircraft control unit. The neural network is pre-trained on landing patterns that additionally encode flight instructions with the next waypoints of the flight plan, according to which the drone performs autonomous flight. The trained neural network thus serves not only for precision landing but also for determining the subsequent waypoints of the flight plan for the given aircraft.
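The description outlines a detect-and-guide loop running at the ground station: detect the landing pattern in each camera frame, derive a correction from its offset in the image, and read the extra waypoints encoded in the pattern. The Python sketch below illustrates one way such a loop could be structured under those assumptions; it is not the authors' implementation, and every name in it (Detection, detect_pattern, guidance_command, ground_station_step) is a hypothetical placeholder.

```python
# Illustrative sketch only: the paper does not publish code, and all names
# below are hypothetical placeholders, not the authors' API.
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

import numpy as np


@dataclass
class Detection:
    cx: float                                   # pattern centre x, pixels
    cy: float                                   # pattern centre y, pixels
    # Waypoints decoded from the landing pattern: (latitude, longitude, altitude).
    waypoints: List[Tuple[float, float, float]] = field(default_factory=list)


def detect_pattern(frame: np.ndarray) -> Optional[Detection]:
    """Placeholder for inference with the pre-trained landing-pattern detector."""
    raise NotImplementedError("plug in the trained neural network here")


def guidance_command(det: Detection, frame_shape: Tuple[int, ...]) -> Tuple[float, float]:
    """Pixel offset of the landing pattern from the image centre.

    The ground station would map this offset to the flight instructions that
    are sent back to the aircraft control unit over the communication channel.
    """
    h, w = frame_shape[:2]
    return det.cx - w / 2.0, det.cy - h / 2.0


def ground_station_step(frame: np.ndarray):
    """One iteration of the ground-station loop for a single camera frame."""
    det = detect_pattern(frame)
    if det is None:
        return None, []            # no pattern in view: keep the current plan
    offset = guidance_command(det, frame.shape)
    # offset -> uplinked as a landing correction; det.waypoints -> uplinked as
    # the next waypoints of the variable flight plan encoded in the pattern.
    return offset, det.waypoints
```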
Classification
Type
D - Conference proceedings paper
CEP field
—
OECD FORD field
20205 - Automation and control systems
Result continuities
Project
—
Continuities
S - Specific research at universities
Others
Year of implementation
2021
Data confidentiality code
S - Complete and true data on the project are not subject to protection under special legal regulations
Data specific to the result type
Proceedings title
2021 Photonics & Electromagnetics Research Symposium (PIERS)
ISBN
978-1-7281-7247-7
ISSN
1559-9450
e-ISSN
—
Number of pages
5
Pages from-to
2243-2247
Publisher name
IEEE
Place of publication
NEW YORK
Event location
Hangzhou, China
Event date
21. 11. 2021
Event type by nationality
WRD - Worldwide event
Article UT WoS code
000795902300370