Error Preserving Correction: A Method for CP Decomposition at a Target Error Bound
Result identifiers
Result code in IS VaVaI
RIV/67985556:_____/19:00500107 (https://www.isvavai.cz/riv?ss=detail&h=RIV%2F67985556%3A_____%2F19%3A00500107)
Result on the web
https://ieeexplore.ieee.org/document/8579207
DOI - Digital Object Identifier
<a href="http://dx.doi.org/10.1109/TSP.2018.2887192" target="_blank" >10.1109/TSP.2018.2887192</a>
Alternative languages
Result language
English
Title in original language
Error Preserving Correction: A Method for CP Decomposition at a Target Error Bound
Result description in original language
In CANDECOMP/PARAFAC (CP) tensor decomposition, degeneracy often occurs in difficult scenarios, especially when the rank exceeds the tensor dimension, when the loading components are highly collinear in several or all modes, or when the CPD does not have an optimal solution. In such cases, the norms of some rank-1 tensors become very large and cancel each other, so algorithms get stuck in local minima and running a huge number of iterations does not improve the decomposition. In this paper, we propose an error preserving correction method to deal with this problem. The aim is to seek an alternative tensor that preserves the approximation error while the norms of its rank-1 tensor components are minimized. Alternating and all-at-once correction algorithms are developed for this problem. In addition, we propose a novel CPD with a bound constraint on the norms of the rank-1 tensors. The method can be useful for decompositions that cannot be performed by traditional algorithms. Finally, we demonstrate an application of the proposed method to image denoising and to decomposition of the weight tensors in convolutional neural networks.
Title in English
Error Preserving Correction: A Method for CP Decomposition at a Target Error Bound
Result description in English
In CANDECOMP/PARAFAC (CP) tensor decomposition, degeneracy often occurs in difficult scenarios, especially when the rank exceeds the tensor dimension, when the loading components are highly collinear in several or all modes, or when the CPD does not have an optimal solution. In such cases, the norms of some rank-1 tensors become very large and cancel each other, so algorithms get stuck in local minima and running a huge number of iterations does not improve the decomposition. In this paper, we propose an error preserving correction method to deal with this problem. The aim is to seek an alternative tensor that preserves the approximation error while the norms of its rank-1 tensor components are minimized. Alternating and all-at-once correction algorithms are developed for this problem. In addition, we propose a novel CPD with a bound constraint on the norms of the rank-1 tensors. The method can be useful for decompositions that cannot be performed by traditional algorithms. Finally, we demonstrate an application of the proposed method to image denoising and to decomposition of the weight tensors in convolutional neural networks.
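As a reading aid for the description above, here is a minimal numpy sketch. It is not the authors' EPC algorithm: it only illustrates the two quantities the abstract talks about, namely the norms of the individual rank-1 components (whose divergence and mutual cancellation is the degeneracy symptom) and a ridge-regularized ALS update used here as a simple stand-in for a CPD with bounded rank-1 norms. All names (cp_als_ridge, rank1_norms, mu) are illustrative assumptions, not from the paper.

```python
import numpy as np

def cp_reconstruct(U):
    """Full 3-way tensor from CP factor matrices U = [U0, U1, U2]."""
    return np.einsum('ir,jr,kr->ijk', U[0], U[1], U[2])

def rank1_norms(U):
    """Frobenius norm of each rank-1 component; norms that blow up and cancel
    each other while the fit stagnates are the degeneracy symptom described
    in the abstract."""
    return (np.linalg.norm(U[0], axis=0)
            * np.linalg.norm(U[1], axis=0)
            * np.linalg.norm(U[2], axis=0))

def cp_als_ridge(T, rank, mu=1e-3, n_iter=200, seed=0):
    """Alternating least squares for a 3-way CP model with a ridge penalty
    mu * ||U_n||_F^2 on each factor update; the penalty keeps component norms
    bounded (a simple surrogate for a bound-constrained CPD, not the EPC
    method itself)."""
    rng = np.random.default_rng(seed)
    U = [rng.standard_normal((s, rank)) for s in T.shape]
    eye = np.eye(rank)
    labels = 'ijk'
    for _ in range(n_iter):
        for n in range(3):
            a, b = [m for m in range(3) if m != n]
            # Matricized tensor times Khatri-Rao product (MTTKRP) via einsum.
            mttkrp = np.einsum(f'ijk,{labels[a]}r,{labels[b]}r->{labels[n]}r',
                               T, U[a], U[b])
            gram = (U[a].T @ U[a]) * (U[b].T @ U[b])
            U[n] = np.linalg.solve(gram + mu * eye, mttkrp.T).T
    return U

# Toy usage: a rank-3 tensor whose components are nearly collinear in every
# mode (one of the difficult scenarios mentioned above), plus a little noise.
rng = np.random.default_rng(1)
cols = [rng.standard_normal((20, 1)) + 0.05 * rng.standard_normal((20, 3))
        for _ in range(3)]
T = cp_reconstruct(cols) + 0.01 * rng.standard_normal((20, 20, 20))
U = cp_als_ridge(T, rank=3, mu=1e-2)
rel_err = np.linalg.norm(T - cp_reconstruct(U)) / np.linalg.norm(T)
print('relative error:', rel_err)
print('rank-1 component norms:', rank1_norms(U))
```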
Classification
Type
J_imp - Article in a periodical indexed in the Web of Science database
CEP field
—
OECD FORD field
10103 - Statistics and probability
Result linkages
Project
<a href="/cs/project/GA17-00902S" target="_blank" >GA17-00902S: Pokročilé metody slepé separace podprostorů</a><br>
Linkages
I - Institutional support for the long-term conceptual development of a research organisation
Others
Year of application
2019
Data confidentiality code
S - Complete and true data on the project are not subject to protection under special legal regulations
Data specific to the result type
Journal title
IEEE Transactions on Signal Processing
ISSN
1053-587X
e-ISSN
—
Journal volume
67
Issue within the volume
5
Country of the journal publisher
US - United States of America
Number of pages
16
Pages from-to
1175-1190
UT WoS code of the article
000455721400005
EID of the result in the Scopus database
2-s2.0-85058883993