Lightweight All-Focused Light Field Rendering

The result's identifiers

  • Result code in IS VaVaI

    <a href="https://www.isvavai.cz/riv?ss=detail&h=RIV%2F00216305%3A26230%2F24%3APU151479" target="_blank" >RIV/00216305:26230/24:PU151479 - isvavai.cz</a>

  • Result on the web

    <a href="https://www.sciencedirect.com/science/article/abs/pii/S1077314224001127" target="_blank" >https://www.sciencedirect.com/science/article/abs/pii/S1077314224001127</a>

  • DOI - Digital Object Identifier

    <a href="http://dx.doi.org/10.1016/j.cviu.2024.104031" target="_blank" >10.1016/j.cviu.2024.104031</a>

Alternative languages

  • Result language

    English

  • Original language name

    Lightweight All-Focused Light Field Rendering

  • Original language description

    This paper proposes a novel real-time method for high-quality view interpolation from a light field. The proposal is a lightweight method that can be used with a consumer GPU, reaching the same or better quality than existing methods in a shorter time and with significantly smaller memory requirements. Light field rendering belongs to the image-based rendering methods, which can produce realistic images without computationally demanding algorithms. The novel view is synthesized from multiple input images of the same scene captured at different camera positions. Standard rendering techniques, such as rasterization or ray tracing, are limited in terms of quality, memory footprint, and speed. Light field rendering methods often produce unwanted artifacts resembling ghosting or blur in certain parts of the scene due to the unknown geometry of the scene. The proposed method estimates the geometry for each pixel as an optimal focusing distance to mitigate these artifacts. The focusing distance determines which pixels from the input images are mixed to produce the final view. State-of-the-art methods use a constant-step pixel-matching scan that iterates over a range of focusing distances. The scan searches for the distance with the smallest color dispersion of the contributing pixels, assuming that they belong to the same spot in the scene. The paper proposes an optimal scanning strategy for the focusing range, an improved color dispersion metric, and other minor improvements, such as sampling block size adjustment, out-of-bounds sampling, and filtering. Experimental results show that the proposal uses fewer resources, achieves better visual quality, and is significantly faster than existing light field rendering methods. The proposal is 8× faster than methods in the same category. The proposal uses only the four closest views from the light field data and reduces the necessary data transfer. Existing methods often require the full light field grid, which is typically 8×8 images. (A schematic sketch of the focusing-distance scan described here is given after this list.)

  • Czech name

  • Czech description
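
The description above outlines the core of the technique: for every output pixel, scan a range of candidate focusing distances, keep the one with the smallest color dispersion among the contributing input pixels, and blend those pixels into the final view. The NumPy sketch below illustrates that constant-step scanning idea only, not the published GPU implementation; the four-view layout, the per-view (dx, dy) offsets, the plain variance metric, and the uniform averaging are assumptions made for brevity.

    import numpy as np

    def synthesize_view(views, view_offsets, focus_range, num_steps=32):
        """Toy plane-sweep sketch of light-field view interpolation by
        per-pixel focusing-distance estimation (illustration only).

        views        : (N, H, W, 3) float array, the N closest input images
        view_offsets : (N, 2) float array, hypothetical (dx, dy) camera
                       offsets of each input view relative to the virtual view
        focus_range  : (min_disparity, max_disparity), scanned uniformly here
        """
        n, h, w, _ = views.shape
        ys, xs = np.mgrid[0:h, 0:w]

        best_disp = np.zeros((h, w))
        best_var = np.full((h, w), np.inf)

        for disp in np.linspace(*focus_range, num_steps):
            # Sample each input view shifted by disparity * camera offset
            # (nearest-neighbour sampling keeps the sketch short).
            samples = np.empty((n, h, w, 3))
            for i, (dx, dy) in enumerate(view_offsets):
                sx = np.clip(np.round(xs + disp * dx).astype(int), 0, w - 1)
                sy = np.clip(np.round(ys + disp * dy).astype(int), 0, h - 1)
                samples[i] = views[i, sy, sx]

            # Colour dispersion across the contributing pixels: low variance
            # suggests all samples hit the same point in the scene.
            var = samples.var(axis=0).sum(axis=-1)
            better = var < best_var
            best_var = np.where(better, var, best_var)
            best_disp = np.where(better, disp, best_disp)

        # Final view: blend the samples taken at each pixel's best disparity.
        out = np.zeros((h, w, 3))
        for i, (dx, dy) in enumerate(view_offsets):
            sx = np.clip(np.round(xs + best_disp * dx).astype(int), 0, w - 1)
            sy = np.clip(np.round(ys + best_disp * dy).astype(int), 0, h - 1)
            out += views[i, sy, sx]
        return out / n

For example, with the four neighbouring views stacked into a (4, H, W, 3) array and offsets such as (-1, -1), (1, -1), (-1, 1), (1, 1), calling synthesize_view(views, offsets, (0.0, 5.0)) returns a view interpolated between the cameras; the paper's optimal scanning strategy and improved dispersion metric would replace the uniform scan and plain variance used in this sketch.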

Classification

  • Type

    Jimp - Article in a specialist periodical, which is included in the Web of Science database

  • CEP classification

  • OECD FORD branch

    10201 - Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)

Result continuities

  • Project

    <a href="/en/project/8A21015" target="_blank" >8A21015: AI-augmented automation for efficient DevOps, a model-based framework for continuous development At RunTime in cyber-physical systems</a><br>

  • Continuities

    P - Research and development project financed from public funds (with a link to CEP)

    S - Specific research at universities

Others

  • Publication year

    2024

  • Confidentiality

    S - Complete and true project data are not subject to protection under special legal regulations

Data specific for result type

  • Name of the periodical

    COMPUTER VISION AND IMAGE UNDERSTANDING

  • ISSN

    1077-3142

  • e-ISSN

    1090-235X

  • Volume of the periodical

    244

  • Issue of the periodical within the volume

    7

  • Country of publishing house

    US - UNITED STATES

  • Number of pages

    16

  • Pages from-to

    7-8

  • UT code for WoS article

    001238000300001

  • EID of the result in the Scopus database

    2-s2.0-85192206398