
Stable Low-Rank Tensor Decomposition for Compression of Convolutional Neural Network

The result's identifiers

  • Result code in IS VaVaI

RIV/67985556:_____/20:00534541 - isvavai.cz

  • Result on the web

http://dx.doi.org/10.1007/978-3-030-58526-6_31

  • DOI - Digital Object Identifier

10.1007/978-3-030-58526-6_31

Alternative languages

  • Result language

    English

  • Original language name

    Stable Low-Rank Tensor Decomposition for Compression of Convolutional Neural Network

  • Original language description

    Most state-of-the-art deep neural networks are overparameterized and exhibit a high computational cost. A straightforward approach to this problem is to replace convolutional kernels with their low-rank tensor approximations, for which the Canonical Polyadic tensor Decomposition is one of the most suited models. However, fitting the convolutional tensors by numerical optimization algorithms often encounters diverging components, i.e., extremely large rank-one tensors that cancel each other. Such degeneracy often causes non-interpretable results and numerical instability during fine-tuning of the neural network. This paper is the first study on degeneracy in the tensor decomposition of convolutional kernels. We present a novel method, which can stabilize the low-rank approximation of convolutional kernels and ensure efficient compression while preserving the high-quality performance of the neural networks. We evaluate our approach on popular CNN architectures for image classification and show that our method results in much lower accuracy degradation and provides consistent performance.

    (A code sketch of the CP-based kernel factorization described above appears after this list.)

  • Czech name

  • Czech description
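
The abstract describes replacing a convolutional kernel with its Canonical Polyadic (CP) approximation. The sketch below shows the standard CP-based factorization of a 2D convolution that this line of work builds on, not the paper's own stabilized method; it assumes PyTorch and TensorLy are available, stride-1 convolutions, and an illustrative rank, and the function name is hypothetical.

```python
# Minimal sketch: replace a k x k convolution by its rank-R CP factorization,
# i.e. a chain 1x1 conv -> depthwise (k x 1) conv -> depthwise (1 x k) conv -> 1x1 conv.
# Assumes stride-1 convolutions and a recent TensorLy where parafac returns (weights, factors).
import torch
import torch.nn as nn
import tensorly as tl
from tensorly.decomposition import parafac

tl.set_backend('pytorch')

def cp_compress_conv(conv: nn.Conv2d, rank: int) -> nn.Sequential:
    W = conv.weight.data                       # kernel tensor, shape (C_out, C_in, k, k)
    C_out, C_in, k, _ = W.shape
    ph, pw = conv.padding                      # assumes tuple padding

    # CP decomposition of the 4-way kernel tensor.
    cp = parafac(W, rank=rank, init='random')
    weights, (out_f, in_f, h_f, w_f) = cp
    out_f = out_f * weights                    # absorb the scalar weights into one factor

    pointwise_in  = nn.Conv2d(C_in, rank, 1, bias=False)
    depthwise_h   = nn.Conv2d(rank, rank, (k, 1), padding=(ph, 0), groups=rank, bias=False)
    depthwise_w   = nn.Conv2d(rank, rank, (1, k), padding=(0, pw), groups=rank, bias=False)
    pointwise_out = nn.Conv2d(rank, C_out, 1, bias=(conv.bias is not None))

    # Copy the CP factors into the weights of the four small convolutions.
    pointwise_in.weight.data  = in_f.t().reshape(rank, C_in, 1, 1)
    depthwise_h.weight.data   = h_f.t().reshape(rank, 1, k, 1)
    depthwise_w.weight.data   = w_f.t().reshape(rank, 1, 1, k)
    pointwise_out.weight.data = out_f.reshape(C_out, rank, 1, 1)
    if conv.bias is not None:
        pointwise_out.bias.data = conv.bias.data

    return nn.Sequential(pointwise_in, depthwise_h, depthwise_w, pointwise_out)
```

As a usage example (hypothetical layer and rank), one could swap a 3x3 layer of a torchvision ResNet via `model.layer1[0].conv1 = cp_compress_conv(model.layer1[0].conv1, rank=64)`. The fine-tuning step that follows such a swap is where the degeneracy discussed in the abstract (very large rank-one components that cancel each other) typically causes instability, which is what the paper's stabilization addresses.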

Classification

  • Type

    D - Article in proceedings

  • CEP classification

  • OECD FORD branch

    20201 - Electrical and electronic engineering

Result continuities

  • Project

  • Continuities

    I - Institutional support for the long-term conceptual development of a research organization

Others

  • Publication year

    2020

  • Confidentiality

    S - Complete and true data on the project are not subject to protection under special legal regulations

Data specific for result type

  • Name of the proceedings

    ECCV 2020

  • ISBN

    978-3-030-58525-9

  • ISSN

    0302-9743

  • e-ISSN

    1611-3349

  • Number of pages

    18

  • Pages from-to

    522-539

  • Publisher name

    Springer Nature Switzerland AG

  • Place of publication

    Cham

  • Event location

    Glasgow

  • Event date

    Aug 23, 2020

  • Type of event by nationality

    WRD - Worldwide event

  • UT code for WoS article