
Adaptive reservation of network resources according to video classification scenes

The result's identifiers

  • Result code in IS VaVaI

    RIV/61989100:27240/21:10247864 (https://www.isvavai.cz/riv?ss=detail&h=RIV%2F61989100%3A27240%2F21%3A10247864)

  • Alternative codes found

    RIV/61989100:27740/21:10247864

  • Result on the web

    https://www.mdpi.com/1424-8220/21/6/1949

  • DOI - Digital Object Identifier

    10.3390/s21061949 (http://dx.doi.org/10.3390/s21061949)

Alternative languages

  • Result language

    English

  • Original language name

    Adaptive reservation of network resources according to video classification scenes

  • Original language description

    Video quality evaluation needs a combined approach that includes subjective and objective metrics, testing, and monitoring of the network. This paper deals with a novel approach to mapping quality of service (QoS) to quality of experience (QoE), using QoE metrics to determine user satisfaction limits and applying QoS tools to provide the minimum QoE expected by users. Our aim was to connect objective estimations of video quality with subjective estimations. A comprehensive tool for estimating the subjective evaluation is proposed. This new idea is based on the evaluation and marking of video sequences using a sentinel flag derived from spatial information (SI) and temporal information (TI) in individual video frames. The authors of this paper created a video database for quality evaluation and derived SI and TI from each video sequence for classifying the scenes. Video scenes from the database were evaluated by objective and subjective assessment. Based on the results, a new model for the prediction of subjective quality is defined and presented in this paper. This quality is predicted using an artificial neural network based on the objective evaluation and the type of video sequence, defined by qualitative parameters such as resolution, compression standard, and bitstream. Furthermore, the authors created an optimum mapping function to define the threshold for the variable bitrate setting based on the flag in the video, determining the type of scene in the proposed model. This function allows one to allocate a bitrate dynamically for a particular segment of the scene while maintaining the desired quality. Our proposed model can help video service providers increase the comfort of end users. The variable bitstream ensures consistent video quality and customer satisfaction, while network resources are used effectively. The proposed model can also predict the appropriate bitrate based on the required quality of video sequences, defined using either objective or subjective assessment. (C) 2021 by the authors. Licensee MDPI, Basel, Switzerland.

  • Czech name

  • Czech description
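
The SI and TI features used above for scene classification follow the common ITU-T P.910 definitions: SI is the maximum over frames of the spatial standard deviation of a Sobel-filtered frame, and TI is the maximum standard deviation of pixel-wise differences between consecutive frames. A minimal sketch of how these two features can be computed, assuming grayscale frames as 2-D NumPy arrays (the function names and the pure-NumPy Sobel filter are illustrative, not the authors' implementation):

```python
import numpy as np

def sobel_magnitude(frame):
    # Gradient magnitude via the classic 3x3 Sobel kernels
    # (pure NumPy; border pixels are left at zero).
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    h, w = frame.shape
    gx = np.zeros((h, w))
    gy = np.zeros((h, w))
    for i in range(3):
        for j in range(3):
            patch = frame[i:h - 2 + i, j:w - 2 + j]
            gx[1:-1, 1:-1] += kx[i, j] * patch
            gy[1:-1, 1:-1] += ky[i, j] * patch
    return np.hypot(gx, gy)

def si_ti(frames):
    # SI: max over frames of the spatial std. dev. of the Sobel-filtered frame.
    # TI: max over consecutive frame pairs of the std. dev. of their difference.
    # Expects a sequence of at least two grayscale frames.
    si = max(float(sobel_magnitude(f).std()) for f in frames)
    ti = max(float((b - a).std()) for a, b in zip(frames, frames[1:]))
    return si, ti
```

A static, flat sequence yields SI = TI = 0, while sharp edges raise SI and scene motion raises TI, which is what makes the pair usable as a scene-type flag for bitrate allocation.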

Classification

  • Type

    JSC - Article in a specialist periodical, which is included in the SCOPUS database

  • CEP classification

  • OECD FORD branch

    20203 - Telecommunications

Result continuities

  • Project

  • Continuities

    S - Specific research at universities

Others

  • Publication year

    2021

  • Confidentiality

    S - Complete and true data on the project are not subject to protection under special legal regulations

Data specific for result type

  • Name of the periodical

    Sensors

  • ISSN

    1424-8220

  • e-ISSN

  • Volume of the periodical

    21

  • Issue of the periodical within the volume

    6

  • Country of publishing house

    CH - SWITZERLAND

  • Number of pages

    31

  • Pages from-to

    1-31

  • UT code for WoS article

  • EID of the result in the Scopus database

    2-s2.0-85102403772