Low-rank tensor decompositions
Bibliographic Information
Low-rank tensor decompositions
(Foundations and trends in machine learning, 9:4-5 . Tensor networks for dimensionality reduction and large-scale optimization ; part 1)
now Publishers
Notes
Includes bibliographical references (p. 165-184)
Other authors: Namgil Lee, Ivan Oseledets, Anh-Huy Phan, Qibin Zhao and Danilo P. Mandic
Description and Table of Contents
Description
Modern applications in engineering and data science are increasingly based on multidimensional data of exceedingly high volume, variety, and structural richness. However, standard machine learning and data mining algorithms typically scale exponentially with data volume and with the complexity of cross-modal couplings (the so-called curse of dimensionality), which is prohibitive for the analysis of such large-scale, multi-modal and multi-relational datasets. Given that such data are often conveniently represented as multiway arrays, or tensors, it is timely and valuable for the multidisciplinary machine learning and data analytics communities to review tensor decompositions and tensor networks as emerging tools for dimensionality reduction and large-scale optimization.
This monograph provides a systematic and example-rich guide to the basic properties and applications of tensor network methodologies, and demonstrates their promise as a tool for the analysis of extreme-scale multidimensional data. It also shows how tensor networks can provide linearly or even super-linearly scalable solutions.
The low-rank tensor network framework presented in this monograph is intended both to help demystify tensor decompositions for educational purposes and to further empower practitioners with enhanced intuition and freedom in algorithmic design across their many applications. In addition, the material may be useful in lecture courses on large-scale machine learning and big data analytics, or indeed as interesting reading for the intellectually curious and generally knowledgeable reader.
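To give a rough sense of the scalability claim above, the following Python sketch illustrates the standard tensor-train (TT) idea of sequential truncated SVDs: a d-way array with n^d entries is represented by d small cores holding only on the order of d·n·r^2 numbers, so storage grows linearly rather than exponentially with the number of modes. This is an illustrative sketch only, not code from the monograph; the function name tt_svd and the rank cap max_rank are placeholder choices for this example.

import numpy as np

def tt_svd(tensor, max_rank):
    # Illustrative sketch (not from the monograph): decompose an N-way array
    # into tensor-train cores via sequential truncated SVDs, ranks <= max_rank.
    dims = tensor.shape
    cores = []
    rank = 1
    unfolding = tensor.reshape(dims[0], -1)  # first mode unfolding (left rank is 1)
    for k in range(len(dims) - 1):
        u, s, vt = np.linalg.svd(unfolding, full_matrices=False)
        new_rank = min(max_rank, s.size)
        cores.append(u[:, :new_rank].reshape(rank, dims[k], new_rank))
        # Carry the remaining factors forward and expose the next mode.
        unfolding = (np.diag(s[:new_rank]) @ vt[:new_rank]).reshape(new_rank * dims[k + 1], -1)
        rank = new_rank
    cores.append(unfolding.reshape(rank, dims[-1], 1))
    return cores

# A rank-1 4-way array (4^4 = 256 entries) is recovered exactly from
# four small TT cores holding far fewer numbers.
x = np.einsum('i,j,k,l->ijkl', *(np.random.rand(4) for _ in range(4)))
cores = tt_svd(x, max_rank=2)
recon = cores[0]
for core in cores[1:]:
    recon = np.tensordot(recon, core, axes=1)  # contract adjacent TT ranks
assert np.allclose(recon.reshape(x.shape), x)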
Table of Contents
1: Introduction and Motivation
2: Tensor Operations and Tensor Network Diagrams
3: Constrained Tensor Decompositions: From Two-way to Multiway Component Analysis
4: Tensor Train Decompositions: Graphical Interpretations and Algorithms
5: Discussion and Conclusions
References
From "Nielsen BookData"