Image Matching across Wide Baselines: From Paper to Practice
Result identifiers
Result code in IS VaVaI
RIV/68407700:21230/21:00342509 - isvavai.cz (https://www.isvavai.cz/riv?ss=detail&h=RIV%2F68407700%3A21230%2F21%3A00342509)
Result on the web
https://doi.org/10.1007/s11263-020-01385-0
DOI - Digital Object Identifier
<a href="http://dx.doi.org/10.1007/s11263-020-01385-0" target="_blank" >10.1007/s11263-020-01385-0</a>
Alternative languages
Language of the result
English
Title in the original language
Image Matching across Wide Baselines: From Paper to Practice
Description in the original language
We introduce a comprehensive benchmark for local features and robust estimation algorithms, focusing on the downstream task -- the accuracy of the reconstructed camera pose -- as our primary metric. Our pipeline's modular structure allows easy integration, configuration, and combination of different methods and heuristics. This is demonstrated by embedding dozens of popular algorithms and evaluating them, from seminal works to the cutting edge of machine learning research. We show that with proper settings, classical solutions may still outperform the perceived state of the art. Besides establishing the actual state of the art, the conducted experiments reveal unexpected properties of Structure from Motion (SfM) pipelines that can help improve their performance, for both algorithmic and learned methods. Data and code are available online at https://github.com/team-yi-ubc/image-matching-benchmark, providing an easy-to-use and flexible framework for benchmarking local features and robust estimation methods, both alongside and against top-performing methods. This work provides a basis for the Image Matching Challenge (https://vision.uvic.ca/image-matching-challenge/).
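For illustration, the kind of two-view pipeline the benchmark evaluates -- a local feature detector and descriptor followed by a robust estimator, judged by the recovered camera pose -- can be sketched in a few lines of Python with OpenCV. This is a minimal, hypothetical sketch and not the benchmark's own API; SIFT, the 0.8 ratio-test threshold, and the RANSAC parameters are assumptions chosen only to show the structure.

import cv2
import numpy as np

def estimate_relative_pose(img1, img2, K):
    """Sketch of a two-view pipeline: detect, match, robustly estimate pose."""
    # Local features: SIFT as a classical baseline.
    sift = cv2.SIFT_create()
    kp1, desc1 = sift.detectAndCompute(img1, None)
    kp2, desc2 = sift.detectAndCompute(img2, None)

    # Nearest-neighbour matching with Lowe's ratio test (0.8 is an assumed threshold).
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    good = [m for m, n in matcher.knnMatch(desc1, desc2, k=2)
            if m.distance < 0.8 * n.distance]
    pts1 = np.float32([kp1[m.queryIdx].pt for m in good])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in good])

    # Robust estimation: essential matrix with RANSAC, then pose recovery.
    E, inliers = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC,
                                      prob=0.999, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=inliers)
    return R, t  # rotation and unit-norm translation between the two cameras

Swapping the detector, the matcher, or the robust estimator at each stage is exactly the kind of configuration the modular benchmark is designed to compare.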
Title in English
Image Matching across Wide Baselines: From Paper to Practice
Description in English
We introduce a comprehensive benchmark for local features and robust estimation algorithms, focusing on the downstream task -- the accuracy of the reconstructed camera pose -- as our primary metric. Our pipeline's modular structure allows easy integration, configuration, and combination of different methods and heuristics. This is demonstrated by embedding dozens of popular algorithms and evaluating them, from seminal works to the cutting edge of machine learning research. We show that with proper settings, classical solutions may still outperform the perceived state of the art. Besides establishing the actual state of the art, the conducted experiments reveal unexpected properties of Structure from Motion (SfM) pipelines that can help improve their performance, for both algorithmic and learned methods. Data and code are available online at https://github.com/team-yi-ubc/image-matching-benchmark, providing an easy-to-use and flexible framework for benchmarking local features and robust estimation methods, both alongside and against top-performing methods. This work provides a basis for the Image Matching Challenge (https://vision.uvic.ca/image-matching-challenge/).
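The primary metric above is the accuracy of the reconstructed camera pose. A common way to score a two-view estimate against ground truth is via the angular error of the rotation and of the scale-free translation direction; the sketch below shows that standard computation. It is an illustrative assumption about the general technique, not the paper's exact evaluation code.

import numpy as np

def rotation_error_deg(R_est, R_gt):
    # Angle of the residual rotation R_est^T @ R_gt, recovered from its trace.
    cos_theta = (np.trace(R_est.T @ R_gt) - 1.0) / 2.0
    return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

def translation_error_deg(t_est, t_gt):
    # Two-view translation is only defined up to scale, so compare directions.
    cos_theta = np.dot(t_est, t_gt) / (np.linalg.norm(t_est) * np.linalg.norm(t_gt))
    return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

Thresholding such angular errors and averaging over many image pairs yields an accuracy curve from which a single pose-accuracy score can be reported.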
Classification
Type
Jimp - Article in a periodical indexed in the Web of Science database
CEP field
—
OECD FORD field
10201 - Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)
Result linkages
Project
EF16_019/0000765: Research Center for Informatics (Výzkumné centrum informatiky)
Linkages
P - Research and development project financed from public funds (with a link to CEP)
S - Specific university research
Others
Year of publication
2021
Data confidentiality code
S - Complete and true data on the project are not subject to protection under special legal regulations
Data specific to the result type
Name of the periodical
International Journal of Computer Vision
ISSN
0920-5691
e-ISSN
1573-1405
Volume of the periodical
129
Issue of the periodical within the volume
February
Country of the periodical's publisher
NL - Netherlands
Number of pages of the result
31
Pages from-to
517-547
UT WoS code of the article
000577443300001
EID of the result in the Scopus database
2-s2.0-85092220167