Efficient Neighbourhood Consensus Networks via Submanifold Sparse Convolutions
Result identifiers
Result code in IS VaVaI
<a href="https://www.isvavai.cz/riv?ss=detail&h=RIV%2F68407700%3A21730%2F20%3A00347821" target="_blank" >RIV/68407700:21730/20:00347821 - isvavai.cz</a>
Result on the web
<a href="https://www.springerprofessional.de/en/efficient-neighbourhood-consensus-networks-via-submanifold-spars/18555516" target="_blank" >https://www.springerprofessional.de/en/efficient-neighbourhood-consensus-networks-via-submanifold-spars/18555516</a>
DOI - Digital Object Identifier
<a href="http://dx.doi.org/10.1007/978-3-030-58545-7_35" target="_blank" >10.1007/978-3-030-58545-7_35</a>
Alternative languages
Result language
English
Title in the original language
Efficient Neighbourhood Consensus Networks via Submanifold Sparse Convolutions
Result description in the original language
In this work, we target the problem of estimating accurately localized correspondences between a pair of images. We adopt the recent Neighbourhood Consensus Networks, which have demonstrated promising performance on difficult correspondence problems, and propose modifications to overcome their main limitations: high memory consumption, long inference time, and poorly localized correspondences. Our proposed modifications reduce the memory footprint and execution time by more than 10x while producing equivalent results. This is achieved by sparsifying the correlation tensor that contains the tentative matches and processing it with a 4D CNN using submanifold sparse convolutions. Localization accuracy is significantly improved by processing the input images at higher resolution, which is made possible by the reduced memory footprint, and by a novel two-stage correspondence relocalization module. The proposed Sparse-NCNet method obtains state-of-the-art results on the HPatches Sequences and InLoc visual localization benchmarks, and competitive results on the Aachen Day-Night benchmark.
Title in English
Efficient Neighbourhood Consensus Networks via Submanifold Sparse Convolutions
Result description in English
In this work, we target the problem of estimating accurately localized correspondences between a pair of images. We adopt the recent Neighbourhood Consensus Networks, which have demonstrated promising performance on difficult correspondence problems, and propose modifications to overcome their main limitations: high memory consumption, long inference time, and poorly localized correspondences. Our proposed modifications reduce the memory footprint and execution time by more than 10x while producing equivalent results. This is achieved by sparsifying the correlation tensor that contains the tentative matches and processing it with a 4D CNN using submanifold sparse convolutions. Localization accuracy is significantly improved by processing the input images at higher resolution, which is made possible by the reduced memory footprint, and by a novel two-stage correspondence relocalization module. The proposed Sparse-NCNet method obtains state-of-the-art results on the HPatches Sequences and InLoc visual localization benchmarks, and competitive results on the Aachen Day-Night benchmark.
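As a rough illustration of the sparsification step described above, the sketch below builds the 4D correlation tensor between two CNN feature maps and keeps only the top-K correlations in image B for each location in image A. This is a minimal, hypothetical PyTorch sketch, not the authors' code: the function name sparse_correlation, the value of K, and the feature-map shapes are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def sparse_correlation(feat_a, feat_b, k=10):
    """feat_a, feat_b: (C, H, W) CNN feature maps of the two images.
    Returns (N, 4) coordinates (i, j, k, l) of retained matches and their
    correlation scores, keeping the K strongest matches in B per A location.
    (Illustrative sketch only; K and shapes are assumptions.)"""
    C, H_a, W_a = feat_a.shape
    _, H_b, W_b = feat_b.shape
    a = F.normalize(feat_a.reshape(C, -1), dim=0)        # (C, Ha*Wa), unit-norm descriptors
    b = F.normalize(feat_b.reshape(C, -1), dim=0)        # (C, Hb*Wb)
    corr = a.t() @ b                                     # dense correlation (Ha*Wa, Hb*Wb)
    vals, idx = corr.topk(k, dim=1)                      # keep top-K matches per A location
    src = torch.arange(H_a * W_a, device=feat_a.device).unsqueeze(1).expand_as(idx)
    coords = torch.stack([src // W_a, src % W_a,         # (i, j) in image A
                          idx // W_b, idx % W_b], dim=-1)  # (k, l) in image B
    return coords.reshape(-1, 4), vals.reshape(-1)

# Example usage with assumed feature shapes:
fa = torch.randn(256, 100, 100)
fb = torch.randn(256, 100, 100)
coords, scores = sparse_correlation(fa, fb, k=10)
```

The retained sparse set of (i, j, k, l) sites and scores would then be processed by a 4D submanifold sparse convolutional network (e.g. via a sparse-convolution library) rather than a dense 4D CNN, which is where the reported memory and runtime savings come from.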
Classification
Type
D - Article in proceedings
CEP field
—
OECD FORD field
10201 - Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)
Result links
Project
<a href="/cs/project/EF15_003%2F0000468" target="_blank" >EF15_003/0000468: Intelligent Machine Perception</a>
Links
P - Research and development project financed from public sources (with a link to CEP)
Others
Year of implementation
2020
Data confidentiality code
S - Complete and accurate data on the project are not subject to protection under special legal regulations
Data specific to the result type
Proceedings title
Computer Vision – ECCV 2020, part IX
ISBN
978-3-030-58544-0
ISSN
0302-9743
e-ISSN
1611-3349
Number of pages
17
Pages from-to
605-621
Publisher name
Springer
Place of publication
Cham
Event location
Glasgow
Event date
23 Aug 2020
Event type by nationality
WRD - Worldwide event
UT WoS code of the article
—