Facial Expression Recognition Using Local Gravitational Force Descriptor-Based Deep Convolution Neural Networks

The result's identifiers

  • Result code in IS VaVaI

    <a href="https://www.isvavai.cz/riv?ss=detail&h=RIV%2F62690094%3A18450%2F21%3A50017579" target="_blank" >RIV/62690094:18450/21:50017579 - isvavai.cz</a>

  • Result on the web

    <a href="https://ieeexplore.ieee.org/document/9226437" target="_blank" >https://ieeexplore.ieee.org/document/9226437</a>

  • DOI - Digital Object Identifier

    <a href="http://dx.doi.org/10.1109/TIM.2020.3031835" target="_blank" >10.1109/TIM.2020.3031835</a>

Alternative languages

  • Result language

    English

  • Original language name

    Facial Expression Recognition Using Local Gravitational Force Descriptor-Based Deep Convolution Neural Networks

  • Original language description

    An image is worth a thousand words; hence, a face image illustrates extensive details about the specification, gender, age, and emotional states of mind. Facial expressions play an important role in community-based interactions and are often used in the behavioral analysis of emotions. Automatic recognition of facial expressions from a facial image is a challenging task in the computer vision community and admits a large set of applications, such as driver safety, human-computer interaction, health care, behavioral science, video conferencing, cognitive science, and others. In this work, a deep-learning-based scheme is proposed for identifying the facial expression of a person. The proposed method consists of two parts. The former finds local features in face images using a local gravitational force descriptor, while in the latter the descriptor is fed into a novel deep convolution neural network (DCNN) model. The proposed DCNN has two branches. The first branch explores geometric features, such as edges, curves, and lines, whereas holistic features are extracted by the second branch. Finally, a score-level fusion technique is adopted to compute the final classification score. The proposed method, along with 25 state-of-the-art methods, is evaluated on five available benchmark databases, namely Facial Expression Recognition 2013, Japanese Female Facial Expressions, Extended Cohn-Kanade, Karolinska Directed Emotional Faces, and Real-world Affective Faces. The databases cover seven basic emotions: neutral, happiness, anger, sadness, fear, disgust, and surprise. The proposed method is compared with existing approaches using four evaluation metrics, namely accuracy, precision, recall, and F1-score. The obtained results demonstrate that the proposed method outperforms all state-of-the-art methods on all the databases. © 2020 IEEE.

    (A hedged code sketch of this two-part, score-fusion pipeline appears after this list.)

  • Czech name

  • Czech description
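
The description above outlines a two-part pipeline: a local gravitational force descriptor is computed from the face image, then fed to a two-branch DCNN whose per-branch class scores are fused at the score level. The following Python (PyTorch) sketch is only a minimal illustration of that shape of pipeline: gravitational_descriptor (intensities treated as masses, inverse-square neighbor attraction), TwoBranchDCNN, the layer sizes, and the equal-weight softmax averaging are hypothetical stand-ins, not the authors' published architecture.

    # Hedged sketch only: descriptor form, layer sizes, and fusion weights
    # below are illustrative assumptions, not the paper's exact design.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    def gravitational_descriptor(gray: torch.Tensor) -> torch.Tensor:
        """Gravity-inspired local descriptor (assumed form): treat pixel
        intensities as masses and sum inverse-square attractions from the
        8-neighborhood, keeping the resulting force magnitude."""
        pad = F.pad(gray, (1, 1, 1, 1), mode="replicate")
        h, w = gray.shape[-2], gray.shape[-1]
        force = torch.zeros_like(gray)
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                if dy == 0 and dx == 0:
                    continue
                neigh = pad[..., 1 + dy:1 + dy + h, 1 + dx:1 + dx + w]
                force = force + gray * neigh / float(dy * dy + dx * dx)
        # Normalize per image so both branch inputs share a common scale.
        return force / force.amax(dim=(-2, -1), keepdim=True).clamp(min=1e-8)

    class TwoBranchDCNN(nn.Module):
        """Two branches over the descriptor map: a shallow small-kernel
        branch (edge/curve-like geometric cues) and a deeper large-kernel
        branch (holistic cues), fused at the score level."""
        def __init__(self, n_classes: int = 7):
            super().__init__()
            self.geometric = nn.Sequential(
                nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, n_classes))
            self.holistic = nn.Sequential(
                nn.Conv2d(1, 16, 7, padding=3), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(16, 32, 5, padding=2), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(64, n_classes))

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            desc = gravitational_descriptor(x)
            # Score-level fusion: average the per-branch class probabilities.
            return 0.5 * (F.softmax(self.geometric(desc), dim=1)
                          + F.softmax(self.holistic(desc), dim=1))

    # Example: one 48x48 grayscale face (FER2013-sized) -> 7 emotion scores.
    scores = TwoBranchDCNN()(torch.rand(1, 1, 48, 48))
    print(scores.shape)  # torch.Size([1, 7])

Score-level fusion here simply averages the two branches' class probabilities; the paper's actual fusion rule and descriptor definition should be taken from the publication itself.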

Classification

  • Type

    Jimp - Article in a specialist periodical, which is included in the Web of Science database

  • CEP classification

  • OECD FORD branch

    20201 - Electrical and electronic engineering

Result continuities

  • Project

    <a href="/en/project/EF18_069%2F0010054" target="_blank" >EF18_069/0010054: IT4Neuro(degeneration)</a><br>

  • Continuities

    P - Research and development project financed from public funds (with a link to CEP)

    I - Institutional support for the long-term conceptual development of a research organization

Others

  • Publication year

    2021

  • Confidentiality

    S - Complete and true data on the project are not subject to protection under special legal regulations

Data specific for result type

  • Name of the periodical

    IEEE Transactions on Instrumentation and Measurement

  • ISSN

    0018-9456

  • e-ISSN

  • Volume of the periodical

    70

  • Issue of the periodical within the volume

    JANUARY

  • Country of publishing house

    US - UNITED STATES

  • Number of pages

    13

  • Pages from-to

    "Article Number: 5003512"

  • UT code for WoS article

    000691803600005

  • EID of the result in the Scopus database

    2-s2.0-85098583090