
Lightweight All-Focused Light Field Rendering

Result identifiers

  • Result code in IS VaVaI

    <a href="https://www.isvavai.cz/riv?ss=detail&h=RIV%2F00216305%3A26230%2F24%3APU151479" target="_blank" >RIV/00216305:26230/24:PU151479 - isvavai.cz</a>

  • Result on the web

    <a href="https://www.sciencedirect.com/science/article/abs/pii/S1077314224001127" target="_blank" >https://www.sciencedirect.com/science/article/abs/pii/S1077314224001127</a>

  • DOI - Digital Object Identifier

    <a href="http://dx.doi.org/10.1016/j.cviu.2024.104031" target="_blank" >10.1016/j.cviu.2024.104031</a>

Alternative languages

  • Result language

    English

  • Title in the original language

    Lightweight All-Focused Light Field Rendering

  • Result description in the original language

    This paper proposes a novel real-time method for high-quality view interpolation from a light field. The proposal is a lightweight method that can be used with a consumer GPU, reaching the same or better quality than existing methods in a shorter time and with significantly smaller memory requirements. Light field rendering belongs to the image-based rendering methods, which can produce realistic images without computationally demanding algorithms. The novel view is synthesized from multiple input images of the same scene, captured at different camera positions. Standard rendering techniques, such as rasterization or ray tracing, are limited in terms of quality, memory footprint, and speed. Light field rendering methods often produce unwanted artifacts resembling ghosting or blur in certain parts of the scene due to the unknown geometry of the scene. The proposed method mitigates these artifacts by estimating the geometry for each pixel as an optimal focusing distance. The focusing distance determines which pixels from the input images are mixed to produce the final view. State-of-the-art methods use a constant-step pixel-matching scan that iterates over a range of focusing distances. The scan searches for the distance with the smallest color dispersion of the contributing pixels, assuming that they belong to the same spot in the scene. The paper proposes an optimal scanning strategy over the focusing range, an improved color dispersion metric, and other minor improvements, such as sampling block size adjustment, out-of-bounds sampling, and filtering. Experimental results show that the proposal uses fewer resources, achieves better visual quality, and is significantly faster than existing light field rendering methods; it is 8× faster than the methods in the same category. The proposal uses only the four closest views from the light field data, which reduces the necessary data transfer, while existing methods often require the full light field grid, typically 8×8 images.

  • Title in English

    Lightweight All-Focused Light Field Rendering

  • Result description in English

    This paper proposes a novel real-time method for high-quality view interpolation from a light field. The proposal is a lightweight method that can be used with a consumer GPU, reaching the same or better quality than existing methods in a shorter time and with significantly smaller memory requirements. Light field rendering belongs to the image-based rendering methods, which can produce realistic images without computationally demanding algorithms. The novel view is synthesized from multiple input images of the same scene, captured at different camera positions. Standard rendering techniques, such as rasterization or ray tracing, are limited in terms of quality, memory footprint, and speed. Light field rendering methods often produce unwanted artifacts resembling ghosting or blur in certain parts of the scene due to the unknown geometry of the scene. The proposed method mitigates these artifacts by estimating the geometry for each pixel as an optimal focusing distance. The focusing distance determines which pixels from the input images are mixed to produce the final view. State-of-the-art methods use a constant-step pixel-matching scan that iterates over a range of focusing distances. The scan searches for the distance with the smallest color dispersion of the contributing pixels, assuming that they belong to the same spot in the scene. The paper proposes an optimal scanning strategy over the focusing range, an improved color dispersion metric, and other minor improvements, such as sampling block size adjustment, out-of-bounds sampling, and filtering. Experimental results show that the proposal uses fewer resources, achieves better visual quality, and is significantly faster than existing light field rendering methods; it is 8× faster than the methods in the same category. The proposal uses only the four closest views from the light field data, which reduces the necessary data transfer, while existing methods often require the full light field grid, typically 8×8 images.
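The description above boils down to one core mechanism: for each output pixel, scan candidate focusing distances, pick the distance at which the samples contributed by the input views agree the most (smallest color dispersion), and blend those samples into the final color. The paper's optimal scanning strategy and improved dispersion metric are not reproduced here; the following is a minimal NumPy sketch of the baseline constant-step idea only, with the function name `focus_scan` and the synthetic data invented for illustration.

```python
import numpy as np

def focus_scan(sample_colors):
    """Baseline focusing-distance scan for one output pixel.

    sample_colors: array of shape (D, V, 3) -- for each of D candidate
    focusing distances, the RGB colors sampled from the V input views at
    the positions where the target pixel reprojects for that distance.
    Returns (best_distance_index, blended_color).
    """
    # Color dispersion per candidate distance: variance across the V
    # contributing views, summed over the RGB channels -> shape (D,).
    dispersion = sample_colors.var(axis=1).sum(axis=-1)
    # The distance where the views agree best is assumed to hit the
    # actual scene surface for this pixel.
    best = int(dispersion.argmin())
    # Blend the contributing pixels at the chosen focusing distance.
    blended = sample_colors[best].mean(axis=0)
    return best, blended

# Synthetic demo: 32 candidate distances, 4 input views (the "four
# closest views" mentioned above), random colors everywhere except at
# distance index 10, where all views see the same color.
rng = np.random.default_rng(0)
samples = rng.random((32, 4, 3))
samples[10] = np.array([0.2, 0.5, 0.8])

best, color = focus_scan(samples)   # best == 10, color == [0.2, 0.5, 0.8]
```

In a real renderer this scan runs per pixel on the GPU, and the sampling step (how the D reprojected positions are obtained) depends on the camera grid geometry, which this sketch deliberately leaves out.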

Classification

  • Type

    J<sub>imp</sub> - Article in a periodical included in the Web of Science database

  • CEP field

  • OECD FORD field

    10201 - Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspects to be 5.8)

Result linkages

  • Project

    <a href="/cs/project/8A21015" target="_blank" >8A21015: AI-augmented automation for efficient DevOps, a model-based framework for continuous development At RunTime in cyber-physical systems</a><br>

  • Linkages

    P - Research and development project financed from public funds (with a link to CEP)<br>S - Specific university research

Other

  • Year of implementation

    2024

  • Data confidentiality code

    S - Complete and true data on the project are not subject to protection under special legal regulations

Data specific to the result type

  • Periodical name

    COMPUTER VISION AND IMAGE UNDERSTANDING

  • ISSN

    1077-3142

  • e-ISSN

    1090-235X

  • Periodical volume

    244

  • Periodical issue within the volume

    7

  • Country of the periodical's publisher

    US - United States of America

  • Number of pages of the result

    16

  • Pages from-to

    7-8

  • Article UT WoS code

    001238000300001

  • Result EID in the Scopus database

    2-s2.0-85192206398