Performance analysis of single-query 6-DoF camera pose estimation in self-driving setups

The result's identifiers

  • Result code in IS VaVaI

    <a href="https://www.isvavai.cz/riv?ss=detail&h=RIV%2F68407700%3A21230%2F19%3A00334761" target="_blank" >RIV/68407700:21230/19:00334761 - isvavai.cz</a>

  • Result on the web

    <a href="http://hdl.handle.net/10467/85606" target="_blank" >http://hdl.handle.net/10467/85606</a>

  • DOI - Digital Object Identifier

    <a href="http://dx.doi.org/10.1016/j.cviu.2019.04.009" target="_blank" >10.1016/j.cviu.2019.04.009</a>

Alternative languages

  • Result language

    English

  • Original language name

    Performance analysis of single-query 6-DoF camera pose estimation in self-driving setups

  • Original language description

    In this work, we consider the problem of single-query 6-DoF camera pose estimation, i.e., estimating the position and orientation of a camera by using reference images and a point cloud. We perform a systematic comparison of three state-of-the-art strategies for 6-DoF camera pose estimation: feature-based, photometric-based and mutual-information-based approaches. Two standard datasets with self-driving setups are used for experiments, and the performance of the studied methods is evaluated in terms of success rate, translation error and maximum orientation error. Building on the analysis of the results, we evaluate a hybrid approach that combines feature-based and mutual-information-based pose estimation methods to benefit from their complementary properties for pose estimation. Experiments show that (1) in cases with large appearance change between query and reference, the hybrid approach outperforms feature-based and mutual-information-based approaches by an average increment of 9.4% and 8.7% in the success rate, respectively; (2) in cases where query and reference images are captured under similar imaging conditions, the hybrid approach performs similarly to the feature-based approach, but outperforms both photometric-based and mutual-information-based approaches by a clear margin; (3) the feature-based approach is consistently more accurate than mutual-information-based and photometric-based approaches when at least 4 consistent matching points are found between the query and reference images.

  • Czech name

  • Czech description
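
To make the comparison in the abstract above concrete, here is a minimal sketch of the feature-based baseline it describes: local features are matched between the query image and a reference image whose keypoints carry known 3-D coordinates from the point cloud, and the 6-DoF pose is recovered with PnP + RANSAC. This is an illustration under stated assumptions, not the authors' implementation; the function name, the ref_kp_xyz lookup and the choice of ORB features are hypothetical placeholders (the paper may well use a different descriptor).

    import cv2
    import numpy as np

    def estimate_pose_feature_based(query_img, ref_img, ref_kp_xyz, K):
        """Hypothetical feature-based 6-DoF pose estimation.

        query_img, ref_img : grayscale images as uint8 arrays
        ref_kp_xyz         : dict mapping reference-keypoint index -> 3-D point,
                             assumed precomputed from the point cloud
        K                  : 3x3 camera intrinsic matrix
        """
        orb = cv2.ORB_create(4000)            # any local feature works; ORB keeps the sketch dependency-free
        kq, dq = orb.detectAndCompute(query_img, None)
        kr, dr = orb.detectAndCompute(ref_img, None)

        matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
        matches = matcher.match(dq, dr)

        # Keep only matches whose reference keypoint has a known 3-D coordinate.
        obj_pts, img_pts = [], []
        for m in matches:
            if m.trainIdx in ref_kp_xyz:
                obj_pts.append(ref_kp_xyz[m.trainIdx])
                img_pts.append(kq[m.queryIdx].pt)

        # Mirrors the abstract's observation that the feature-based method needs
        # at least 4 consistent matches; a hybrid system could fall back to a
        # mutual-information-based method at this point.
        if len(obj_pts) < 4:
            return None

        ok, rvec, tvec, inliers = cv2.solvePnPRansac(
            np.float32(obj_pts), np.float32(img_pts), K, None,
            reprojectionError=8.0, iterationsCount=1000)
        if not ok:
            return None
        R, _ = cv2.Rodrigues(rvec)            # rotation matrix; together with tvec this is the 6-DoF pose
        return R, tvec

The early return on fewer than 4 matches is where a hybrid scheme of the kind the abstract evaluates would switch strategies, since PnP becomes unreliable below that threshold.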

Classification

  • Type

    Jimp - Article in a specialist periodical, which is included in the Web of Science database

  • CEP classification

  • OECD FORD branch

    10201 - Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)

Result continuities

  • Project

    <a href="/en/project/GA18-05360S" target="_blank" >GA18-05360S: Solving inverse problems for the analysis of fast moving objects</a><br>

  • Continuities

    P - Research and development project financed from public sources (with a link to CEP)

Others

  • Publication year

    2019

  • Confidentiality

    C - The subject of the project is covered by trade secret (Section 504 of the Civil Code), but the project title, the project objectives and, for a completed or terminated project, the evaluation of the project results (data items P03, P04, P15, P19, P29, PN8) supplied to CEP are edited so that they can be made public.

Data specific for result type

  • Name of the periodical

    Computer Vision and Image Understanding

  • ISSN

    1077-3142

  • e-ISSN

    1090-235X

  • Volume of the periodical

    186

  • Issue of the periodical within the volume

    September

  • Country of publishing house

    US - UNITED STATES

  • Number of pages

    16

  • Pages from-to

    58-73

  • UT code for WoS article

    000481564600006

  • EID of the result in the Scopus database

    2-s2.0-85067195521