Low-rank tensor decompositions
Author(s)
Andrzej Cichocki
Bibliographic Information
Low-rank tensor decompositions
(Foundations and Trends in Machine Learning, 9:4-5. Tensor Networks for Dimensionality Reduction and Large-Scale Optimization; Part 1)
now Publishers
Note
Includes bibliographical references (p. 165-184)
Other authors: Namgil Lee, Ivan Oseledets, Anh-Huy Phan, Qibin Zhao and Danilo P. Mandic
Description and Table of Contents
Description
Modern applications in engineering and data science are increasingly based on multidimensional data of exceedingly high volume, variety, and structural richness. However, standard machine learning and data mining algorithms typically scale exponentially with data volume and with the complexity of cross-modal couplings (the so-called curse of dimensionality), which is prohibitive for the analysis of such large-scale, multi-modal and multi-relational datasets. Given that such data are often conveniently represented as multiway arrays, or tensors, it is therefore timely and valuable for the multidisciplinary machine learning and data analytics communities to review tensor decompositions and tensor networks as emerging tools for dimensionality reduction and large-scale optimization.
This monograph provides a systematic and example-rich guide to the basic properties and applications of tensor network methodologies, and demonstrates their promise as a tool for the analysis of extreme-scale multidimensional data. It shows how tensor networks can provide linearly or even super-linearly scalable solutions.
The low-rank tensor network framework of analysis presented in this monograph is intended both to help demystify tensor decompositions for educational purposes and to empower practitioners with enhanced intuition and freedom in algorithmic design for their manifold applications. In addition, the material may be useful in lecture courses on large-scale machine learning and big data analytics, or indeed as interesting reading for the intellectually curious and generally knowledgeable reader.
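As a minimal illustration of the low-rank compression idea described above (a sketch for orientation only, not code from the monograph), the following NumPy snippet decomposes a small multiway array into a chain of tensor-train cores via sequential truncated SVDs. The function names tt_decompose and tt_reconstruct and the max_rank parameter are illustrative assumptions, and the toy tensor is chosen to be separable so that rank-1 cores recover it exactly.

import numpy as np

def tt_decompose(x, max_rank):
    # Sequential truncated SVDs: each step peels off one mode into a 3-way core
    # of shape (r_prev, n_k, r_k), so storage grows linearly with the tensor order.
    dims = x.shape
    cores, r_prev = [], 1
    c = x.reshape(r_prev * dims[0], -1)
    for k in range(len(dims) - 1):
        u, s, vt = np.linalg.svd(c, full_matrices=False)
        r = min(max_rank, s.size)                      # truncate to the requested TT rank
        cores.append(u[:, :r].reshape(r_prev, dims[k], r))
        c = (s[:r, None] * vt[:r]).reshape(r * dims[k + 1], -1)
        r_prev = r
    cores.append(c.reshape(r_prev, dims[-1], 1))
    return cores

def tt_reconstruct(cores):
    # Contract the chain of cores back into a full tensor (only for checking the sketch).
    full = cores[0]
    for g in cores[1:]:
        full = np.tensordot(full, g, axes=(full.ndim - 1, 0))
    return full.squeeze(axis=(0, full.ndim - 1))

# Toy check: a separable (outer-product) 5-way tensor is recovered exactly with rank 1,
# while its 432 entries are stored in only 17 core entries.
rng = np.random.default_rng(0)
factors = [rng.standard_normal(n) for n in (3, 4, 3, 4, 3)]
x = factors[0]
for f in factors[1:]:
    x = np.multiply.outer(x, f)
cores = tt_decompose(x, max_rank=1)
print(np.allclose(tt_reconstruct(cores), x))       # True
print(x.size, sum(g.size for g in cores))          # 432 17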
Table of Contents
1: Introduction and Motivation
2: Tensor Operations and Tensor Network Diagrams
3: Constrained Tensor Decompositions: From Two-way to Multiway Component Analysis
4: Tensor Train Decompositions: Graphical Interpretations and Algorithms
5: Discussion and Conclusions
References
Description and table of contents by "Nielsen BookData"