Tensor Networks for Latent Variable Analysis: Novel Algorithms for Tensor Train Approximation
Result identifiers
Result code in IS VaVaI
<a href="https://www.isvavai.cz/riv?ss=detail&h=RIV%2F67985556%3A_____%2F20%3A00518308" target="_blank" >RIV/67985556:_____/20:00518308 - isvavai.cz</a>
Result on the web
<a href="https://ieeexplore.ieee.org/document/8984730" target="_blank" >https://ieeexplore.ieee.org/document/8984730</a>
DOI - Digital Object Identifier
<a href="http://dx.doi.org/10.1109/TNNLS.2019.2956926" target="_blank" >10.1109/TNNLS.2019.2956926</a>
Alternative languages
Result language
English
Title in original language
Tensor Networks for Latent Variable Analysis: Novel Algorithms for Tensor Train Approximation
Description in original language
Decompositions of tensors into factor matrices, which interact through a core tensor, have found numerous applications in signal processing and machine learning. A more general tensor model that represents data as an ordered network of subtensors of order-2 or order-3 has, so far, not been widely considered in these fields, although this so-called tensor network (TN) decomposition has long been studied in quantum physics and scientific computing. In this article, we present novel algorithms and applications of TN decompositions, with a particular focus on the tensor train (TT) decomposition and its variants. The novel algorithms developed for the TT decomposition update, in an alternating way, one or several core tensors at each iteration and exhibit enhanced mathematical tractability and scalability for large-scale data tensors. For rigor, the cases of a given rank, a given approximation error, and a given error bound are all considered. The proposed algorithms provide well-balanced TT decompositions and are tested in the classic paradigms of blind source separation from a single mixture, denoising, and feature extraction, achieving superior performance over the widely used truncated algorithms for TT decomposition.
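The "widely used truncated algorithms" that the abstract takes as a baseline refer to the standard sequential-SVD construction of a tensor train (TT-SVD); the article's own alternating-update algorithms are not reproduced here. Below is a minimal NumPy sketch of that baseline for a given maximum rank — the function names `tt_svd` and `tt_reconstruct` are illustrative, not from the paper:

```python
import numpy as np

def tt_svd(tensor, max_rank):
    """Truncated TT-SVD: split a d-way tensor into d order-3 cores
    by sweeping left to right with rank-truncated SVDs."""
    shape = tensor.shape
    cores = []
    r = 1                                  # current left TT-rank (r_0 = 1)
    C = np.asarray(tensor, dtype=float)
    for k in range(len(shape) - 1):
        # Unfold: rows merge the left rank with mode k, columns hold the rest.
        C = C.reshape(r * shape[k], -1)
        U, S, Vt = np.linalg.svd(C, full_matrices=False)
        r_new = min(max_rank, S.size)      # truncate to the requested rank
        cores.append(U[:, :r_new].reshape(r, shape[k], r_new))
        C = S[:r_new, None] * Vt[:r_new]   # carry the remainder to the next step
        r = r_new
    cores.append(C.reshape(r, shape[-1], 1))
    return cores

def tt_reconstruct(cores):
    """Contract the TT cores back into a full tensor."""
    full = cores[0]
    for core in cores[1:]:
        full = np.tensordot(full, core, axes=([-1], [0]))
    return full.squeeze(axis=(0, -1))      # drop the boundary ranks r_0 = r_d = 1
```

With `max_rank` at least as large as the true TT-ranks the reconstruction is exact (up to rounding); the paper's alternating algorithms instead revisit cores to balance the ranks and meet a prescribed approximation error.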
Classification
Type
J<sub>imp</sub> - Article in a journal indexed in the Web of Science database
CEP field
—
OECD FORD field
20201 - Electrical and electronic engineering
Result linkages
Project
<a href="/cs/project/GA17-00902S" target="_blank" >GA17-00902S: Advanced methods of blind subspace separation</a><br>
Linkages
I - Institutional support for the long-term conceptual development of a research organisation
Other
Year of publication
2020
Data confidentiality code
S - Complete and true data on the project are not subject to protection under special legal regulations
Data specific to the result type
Journal name
IEEE Transactions on Neural Networks and Learning Systems
ISSN
2162-237X
e-ISSN
—
Journal volume
31
Issue number within the volume
11
Country of the journal publisher
US - United States of America
Number of pages
17
Pages from-to
4622-4636
UT WoS code of the article
000587699700017
Result EID in the Scopus database
2-s2.0-85093097685