Human Detection in Depth Map Created from Point Cloud
Result identifiers
Result code in IS VaVaI
RIV/00216305:26620/22:PU141816 (https://www.isvavai.cz/riv?ss=detail&h=RIV%2F00216305%3A26620%2F22%3APU141816)
Result on the web
http://dx.doi.org/10.1007/978-3-030-98260-7_16
DOI - Digital Object Identifier
10.1007/978-3-030-98260-7_16
Alternative languages
Result language
English
Title in the original language
Human Detection in Depth Map Created from Point Cloud
Description in the original language
This paper deals with human detection in LiDAR data using the YOLO object detection neural network architecture. RGB-based object detection is the most studied topic in the field of neural networks and autonomous agents. However, these models are very sensitive to even minor changes in weather or lighting conditions if the training data do not cover such situations. This paper proposes to use LiDAR data as a redundant and more condition-invariant source of object detections around the autonomous agent. We used a publicly available real-traffic dataset that simultaneously captures data from an RGB camera and 3D LiDAR sensors during a clear-sky day and a rainy night, and we aggregated the LiDAR data over a short period to increase the density of the point cloud. We then projected these point clouds into the 2D image frame using several projection models, such as the pinhole camera model, cylindrical projection, and bird's-eye view projection, and we annotated all the images. As the main experiment, we trained several YOLOv5 neural networks on data captured during the day and validated the models on mixed day and night data to study the robustness and information gain under changing input conditions. The results show that the LiDAR-based models provide significantly better performance under changed weather conditions than the RGB-based models.
Title in English
Human Detection in Depth Map Created from Point Cloud
Description in English
This paper deals with human detection in LiDAR data using the YOLO object detection neural network architecture. RGB-based object detection is the most studied topic in the field of neural networks and autonomous agents. However, these models are very sensitive to even minor changes in weather or lighting conditions if the training data do not cover such situations. This paper proposes to use LiDAR data as a redundant and more condition-invariant source of object detections around the autonomous agent. We used a publicly available real-traffic dataset that simultaneously captures data from an RGB camera and 3D LiDAR sensors during a clear-sky day and a rainy night, and we aggregated the LiDAR data over a short period to increase the density of the point cloud. We then projected these point clouds into the 2D image frame using several projection models, such as the pinhole camera model, cylindrical projection, and bird's-eye view projection, and we annotated all the images. As the main experiment, we trained several YOLOv5 neural networks on data captured during the day and validated the models on mixed day and night data to study the robustness and information gain under changing input conditions. The results show that the LiDAR-based models provide significantly better performance under changed weather conditions than the RGB-based models.
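The projection step described above can be illustrated with a short sketch. The snippet below shows how a point cloud might be rendered into a 2D depth image with a pinhole camera model and with a cylindrical projection; it is a minimal illustration in Python/NumPy, and all parameters (image resolution, intrinsics, vertical field of view) are assumed values, not the configuration used in the paper.

```python
# Minimal sketch (not the authors' code): rendering a LiDAR point cloud into
# 2D depth images with a pinhole camera model and a cylindrical projection.
# Image size, intrinsics and vertical field of view are illustrative assumptions.
import numpy as np


def pinhole_depth_map(points, fx=500.0, fy=500.0, cx=320.0, cy=240.0,
                      width=640, height=480):
    """Project points (N, 3) in camera coordinates (z forward) into a depth map."""
    depth = np.zeros((height, width), dtype=np.float32)
    pts = points[points[:, 2] > 0.1]                 # keep points in front of the camera
    u = (fx * pts[:, 0] / pts[:, 2] + cx).astype(int)
    v = (fy * pts[:, 1] / pts[:, 2] + cy).astype(int)
    inside = (u >= 0) & (u < width) & (v >= 0) & (v < height)
    u, v, z = u[inside], v[inside], pts[inside, 2]
    order = np.argsort(-z)                           # write far points first,
    depth[v[order], u[order]] = z[order]             # so near points win per pixel
    return depth


def cylindrical_depth_map(points, width=1024, height=64, v_fov_deg=(-25.0, 15.0)):
    """Project points (N, 3) in LiDAR coordinates (x forward, z up) into a range image."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    r = np.linalg.norm(points, axis=1)
    yaw = np.arctan2(y, x)                           # horizontal angle -> image column
    pitch = np.arcsin(np.clip(z / np.maximum(r, 1e-6), -1.0, 1.0))
    fov_lo, fov_hi = np.radians(v_fov_deg)
    u = ((yaw + np.pi) / (2.0 * np.pi) * width).astype(int) % width
    v = ((fov_hi - pitch) / (fov_hi - fov_lo) * height).astype(int)
    inside = (v >= 0) & (v < height) & (r > 0.1)
    depth = np.zeros((height, width), dtype=np.float32)
    order = np.argsort(-r[inside])
    depth[v[inside][order], u[inside][order]] = r[inside][order]
    return depth
```

Depth images produced this way can be fed to a standard 2D detector such as YOLOv5 in the same manner as RGB frames, which is the setup the abstract describes.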
Classification
Result type
D - Article in proceedings
CEP field
—
OECD FORD field
20205 - Automation and control systems
Result linkages
Project
8A20002: Trustable architectures with acceptable residual risk for the electric, connected and automated cars
Linkages
P - Research and development project financed from public funds (with a link to CEP)
Others
Year of publication
2022
Data confidentiality code
S - Complete and true data on the project are not subject to protection under special legal regulations
Data specific to the result type
Title of the paper in the proceedings
International Conference on Modelling and Simulation for Autonomous Systems
ISBN
978-3-030-98260-7
ISSN
0302-9743
e-ISSN
—
Number of pages
12
Pages from-to
1-12
Publisher name
Not specified
Place of publication
not specified
Event venue
virtual
Event date
13 October 2021
Event type by nationality
WRD - Worldwide event
UT WoS code of the article
000787774900016