OptInOpt: Dual Optimization for Automatic Camera Calibration by Multi-Target Observations
Result identifiers
Result code in IS VaVaI
<a href="https://www.isvavai.cz/riv?ss=detail&h=RIV%2F00216305%3A26230%2F19%3APU134956" target="_blank" >RIV/00216305:26230/19:PU134956 - isvavai.cz</a>
Result on the web
<a href="http://dx.doi.org/10.1109/AVSS.2019.8909905" target="_blank" >http://dx.doi.org/10.1109/AVSS.2019.8909905</a>
DOI - Digital Object Identifier
<a href="http://dx.doi.org/10.1109/AVSS.2019.8909905" target="_blank" >10.1109/AVSS.2019.8909905</a>
Alternative languages
Result language
English
Title in the original language
OptInOpt: Dual Optimization for Automatic Camera Calibration by Multi-Target Observations
Result description in the original language
In this paper, we propose a new approach to the automatic calibration of surveillance cameras. The proposed method is based on observing rigid objects in the scene and automatically estimating landmarks on these objects. The approach can use arbitrary rigid objects, as verified by experiments with a synthetic dataset; vehicles were used in our experiments with real-life data. Landmarks on objects, automatically detected by a convolutional neural network, together with the corresponding 3D positions in the object coordinate system, are exploited during the camera calibration process. To determine the 3D positions of the landmarks, fine-grained classification of the detected vehicles in the image plane is necessary. The proposed calibration method consists of a dual optimization: optimization of object positions in the world coordinate system and optimization of the calibration parameters, both minimizing the re-projection error of the localized landmarks. The experiments show an improvement in calibration accuracy over an existing method solving a similar problem, moreover with fewer restrictions on the input data. The calibration error on a real-world dataset decreased from 6.88 % to 2.85 %.
Title in English
OptInOpt: Dual Optimization for Automatic Camera Calibration by Multi-Target Observations
Result description in English
In this paper, we propose a new approach to the automatic calibration of surveillance cameras. The proposed method is based on observing rigid objects in the scene and automatically estimating landmarks on these objects. The approach can use arbitrary rigid objects, as verified by experiments with a synthetic dataset; vehicles were used in our experiments with real-life data. Landmarks on objects, automatically detected by a convolutional neural network, together with the corresponding 3D positions in the object coordinate system, are exploited during the camera calibration process. To determine the 3D positions of the landmarks, fine-grained classification of the detected vehicles in the image plane is necessary. The proposed calibration method consists of a dual optimization: optimization of object positions in the world coordinate system and optimization of the calibration parameters, both minimizing the re-projection error of the localized landmarks. The experiments show an improvement in calibration accuracy over an existing method solving a similar problem, moreover with fewer restrictions on the input data. The calibration error on a real-world dataset decreased from 6.88 % to 2.85 %.
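The abstract's dual optimization can be illustrated with a minimal sketch: alternately refine an object's position and a camera parameter so that projected 3D landmarks match their 2D detections. The simple pinhole model, translation-only pose, and all function names below are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of the dual-optimization idea: alternate between
# (1) optimizing the object's position with the camera fixed and
# (2) optimizing a calibration parameter with the object fixed,
# each step minimizing the landmark re-projection error.
import numpy as np
from scipy.optimize import least_squares

def project(points_3d, focal, cx, cy):
    """Pinhole projection of camera-frame 3D points to pixel coordinates."""
    x, y, z = points_3d.T
    return np.stack([focal * x / z + cx, focal * y / z + cy], axis=1)

def reprojection_residuals(params, landmarks_obj, observed_2d, cx, cy):
    """Residuals for one object; params = [focal, tx, ty, tz] (translation-only pose)."""
    focal, t = params[0], params[1:4]
    pts_cam = landmarks_obj + t  # object frame -> camera frame (rotation omitted for brevity)
    return (project(pts_cam, focal, cx, cy) - observed_2d).ravel()

def dual_calibrate(landmarks_obj, observed_2d, cx, cy, iters=20):
    focal, t = 500.0, np.array([0.0, 0.0, 10.0])  # rough initial guesses
    for _ in range(iters):
        # Step 1: object position, camera fixed.
        res = least_squares(lambda p: reprojection_residuals(
            np.r_[focal, p], landmarks_obj, observed_2d, cx, cy), t)
        t = res.x
        # Step 2: calibration parameter, object fixed.
        res = least_squares(lambda p: reprojection_residuals(
            np.r_[p, t], landmarks_obj, observed_2d, cx, cy), [focal])
        focal = res.x[0]
    return focal, t
```

The paper's actual method handles full object poses, multiple observed targets, and CNN-detected landmarks; this sketch only shows the alternating structure of the two optimization loops.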
Classification
Type
D - Article in the proceedings
CEP field
—
OECD FORD field
10201 - Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)
Result continuities
Project
The result was created while implementing multiple projects. More information is available in the Projects tab.
Continuities
P - Research and development project financed from public sources (with a link to CEP)
Others
Year of implementation
2019
Data confidentiality code
S - Complete and true data about the project are not subject to protection under special legal regulations
Data specific for the result type
Article name in the proceedings
16th IEEE International Conference on Advanced Video and Signal-based Surveillance
ISBN
978-1-7281-0990-9
ISSN
—
e-ISSN
—
Number of pages
8
Pages from-to
1-8
Publisher name
Institute of Electrical and Electronics Engineers
Place of publication
Taipei
Event location
Taipei
Event date
18. 9. 2019
Event type by nationality
WRD - Worldwide event
UT code of the article in WoS
000524684300085