Results 1–10 of 19
Tensor Decompositions and Applications
SIAM Review, 2009
Cited by 705 (17 self)
This survey provides an overview of higher-order tensor decompositions, their applications, and available software. A tensor is a multidimensional or N-way array. Decompositions of higher-order tensors (i.e., N-way arrays with N ≥ 3) have applications in psychometrics, chemometrics, signal processing, numerical linear algebra, computer vision, numerical analysis, data mining, neuroscience, graph analysis, etc. Two particular tensor decompositions can be considered to be higher-order extensions of the matrix singular value decomposition: CANDECOMP/PARAFAC (CP) decomposes a tensor as a sum of rank-one tensors, and the Tucker decomposition is a higher-order form of principal components analysis. There are many other tensor decompositions, including INDSCAL, PARAFAC2, CANDELINC, DEDICOM, and PARATUCK2, as well as nonnegative variants of all of the above. The N-way Toolbox and Tensor Toolbox, both for MATLAB, and the Multilinear Engine are examples of software packages for working with tensors.
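The CP model described in this abstract can be sketched in a few lines. The following NumPy snippet is an illustration only, not code from the survey; the `cp_reconstruct` helper, the shapes, and the rank are assumptions chosen for the example:

```python
import numpy as np

def cp_reconstruct(A, B, C):
    """Rebuild a 3-way tensor from CP factor matrices A (I x R), B (J x R), C (K x R).

    The result is the sum over r of the rank-one outer products a_r o b_r o c_r,
    so its CP rank is at most R.
    """
    return np.einsum('ir,jr,kr->ijk', A, B, C)

rng = np.random.default_rng(0)
R = 2  # assumed CP rank for the example
A = rng.standard_normal((4, R))
B = rng.standard_normal((5, R))
C = rng.standard_normal((3, R))

T = cp_reconstruct(A, B, C)  # a 4 x 5 x 3 tensor of CP rank at most 2
```

Fitting the factor matrices to a given tensor (e.g., by alternating least squares, as the toolboxes cited above do) is the hard part; the snippet only shows the model being fitted.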
Symmetric tensors and symmetric tensor rank
Scientific Computing and Computational Mathematics (SCCM), 2006
Cited by 101 (22 self)
Abstract. A symmetric tensor is a higher-order generalization of a symmetric matrix. In this paper, we study various properties of symmetric tensors in relation to a decomposition into a symmetric sum of outer products of vectors. A rank-1 order-k tensor is the outer product of k nonzero vectors. Any symmetric tensor can be decomposed into a linear combination of rank-1 tensors, each of them being symmetric or not. The rank of a symmetric tensor is the minimal number of rank-1 tensors that is necessary to reconstruct it. The symmetric rank is obtained when the constituting rank-1 tensors are imposed to be themselves symmetric. It is shown that rank and symmetric rank are equal in a number of cases, and that they always exist in an algebraically closed field. We will discuss the notion of the generic symmetric rank, which, due to the work of Alexander and Hirschowitz, is now known for any values of dimension and order. We will also show that the set of symmetric tensors of symmetric rank at most r is not closed, unless r = 1. Key words. Tensors, multiway arrays, outer product decomposition, symmetric outer product decomposition, candecomp, parafac, tensor rank, symmetric rank, symmetric tensor rank, generic symmetric rank, maximal symmetric rank, quantics. AMS subject classifications. 15A03, 15A21, 15A72, 15A69, 15A18
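To make the central object concrete: a rank-1 order-k symmetric tensor is the k-fold outer product of a single vector with itself. A minimal NumPy sketch (the helper name and the shapes are assumptions for the illustration, not from the paper):

```python
import numpy as np
from itertools import permutations

def sym_rank_one(v, k):
    """k-fold outer product v o v o ... o v: a symmetric rank-1 order-k tensor."""
    t = v
    for _ in range(k - 1):
        t = np.multiply.outer(t, v)
    return t

v = np.array([1.0, 2.0, -1.0])
T = sym_rank_one(v, 3)  # order-3 tensor of shape (3, 3, 3)

# Symmetric: invariant under every permutation of its indices
assert all(np.allclose(T, np.transpose(T, p)) for p in permutations(range(3)))
```

A general symmetric tensor is a linear combination of such terms, and its symmetric rank is the least number of terms needed.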
Tensor decompositions, state of the art and applications
Mathematics in Signal Processing V
Canonical Tensor Decompositions
ARCC Workshop on Tensor Decomposition, 2004
Cited by 42 (16 self)
The Singular Value Decomposition (SVD) may be extended to tensors in at least two very different ways. One is the High-Order SVD (HOSVD), and the other is the Canonical Decomposition (CanD). Only the latter is closely related to the tensor rank. Important basic questions are raised in this short paper, such as the maximal achievable rank of a tensor of given dimensions, or the computation of a CanD. Some questions are answered, and it turns out that the answers depend on the choice of the underlying field and on the tensor's symmetry structure, which outlines a major difference compared to matrices.
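The field dependence mentioned here can be made concrete with a classical example (well known in the tensor literature, not taken from this paper): the 2 × 2 × 2 tensor encoding complex multiplication has rank 3 over the reals but rank 2 over the complex numbers. A NumPy check of the complex rank-2 decomposition:

```python
import numpy as np

# Frontal slices of the bilinear map (a, b), (c, d) -> (ac - bd, ad + bc),
# i.e., multiplication of complex numbers written in real coordinates.
T = np.zeros((2, 2, 2))
T[:, :, 0] = [[1, 0], [0, -1]]
T[:, :, 1] = [[0, 1], [1, 0]]

# Over C, two rank-one terms suffice: T = (u o u o conj(u) + v o v o conj(v)) / 2
u = np.array([1.0, 1.0j])
v = np.array([1.0, -1.0j])
T_c = 0.5 * (np.einsum('i,j,k->ijk', u, u, u.conj())
             + np.einsum('i,j,k->ijk', v, v, v.conj()))

assert np.allclose(T, T_c)  # exact reconstruction with only two terms
```

Over the reals no two-term decomposition of this array exists, so the rank of one and the same array depends on the underlying field, something that cannot happen for matrices.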
Tensor-CUR Decompositions for Tensor-Based Data
SIAM J. Matrix Anal. Appl., 2008
Cited by 36 (11 self)
Motivated by numerous applications in which the data may be modeled by a variable subscripted by three or more indices, we develop a tensor-based extension of the matrix CUR decomposition. The tensor-CUR decomposition is most relevant as a data analysis tool when the data consist of one mode that is qualitatively different from the others. In this case, the tensor-CUR decomposition approximately expresses the original data tensor in terms of a basis consisting of underlying subtensors that are actual data elements and thus that have a natural interpretation in terms of the processes generating the data. Assume the data may be modeled as a (2+1)-tensor, i.e., an m × n × p tensor A in which the first two modes are similar and the third is qualitatively different. We refer to each of the p different m × n matrices as “slabs” and each of the mn different p-vectors as “fibers.” In this case, the tensor-CUR algorithm computes an approximation to the data tensor A that is of the form CUR, where C is an m × n × c tensor consisting of a small number c of the slabs, R is an r × p matrix consisting of a small number r of the fibers, and U is an appropriately defined and easily computed c × r encoding matrix. Both C and R may be chosen by randomly sampling either slabs or fibers according to a judiciously chosen and data-dependent probability distribution, and both c and r depend on a rank parameter k, an error parameter ɛ, and a failure probability δ.
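The slab/fiber sampling step can be sketched as follows. This is a simplified illustration using squared-norm sampling probabilities; the paper's actual distributions and the encoding matrix U are defined more carefully, and all shapes and parameter values below are assumptions for the example:

```python
import numpy as np

rng = np.random.default_rng(1)
m, n, p = 6, 5, 8
A = rng.standard_normal((m, n, p))   # (2+1)-tensor: third mode is "different"
c, r = 3, 4                          # number of slabs and fibers to keep

# Sample c of the p slabs (m x n matrices) with probability proportional
# to their squared Frobenius norms.
slab_w = np.array([np.linalg.norm(A[:, :, k]) ** 2 for k in range(p)])
slab_idx = rng.choice(p, size=c, replace=False, p=slab_w / slab_w.sum())
C = A[:, :, slab_idx]                # m x n x c tensor of sampled slabs

# Sample r of the m*n fibers (p-vectors) the same way.
fibers = A.reshape(m * n, p)
fib_w = (fibers ** 2).sum(axis=1)
fib_idx = rng.choice(m * n, size=r, replace=False, p=fib_w / fib_w.sum())
R = fibers[fib_idx]                  # r x p matrix of sampled fibers
```

Because the retained slabs and fibers are actual data elements, C and R inherit whatever interpretation the raw data had, which is the selling point of CUR-style decompositions over SVD-style ones.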
A randomized algorithm for a tensor-based generalization of the singular value decomposition
2007
Generic and typical ranks of multiway arrays
 Linear Algebra Appl
Cited by 27 (5 self)
Tensors: a Brief Introduction
2014
Cited by 10 (3 self)
Tensor decompositions are at the core of many Blind Source Separation (BSS) algorithms, either explicitly or implicitly. In particular, the Canonical Polyadic (CP) tensor …
Genericity and Rank Deficiency of High Order Symmetric Tensors
Proc. IEEE Int. Conference on Acoustics, Speech, and Signal Processing (ICASSP), 2006
Cited by 9 (6 self)
Blind Identification of Under-Determined Mixtures (UDM) is involved in numerous applications, including Multi-Way factor Analysis (MWA) and Signal Processing. In the latter case, the use of High-Order Statistics (HOS) like cumulants leads to the decomposition of symmetric tensors. Yet, little has been published about rank-revealing decompositions of symmetric tensors. Definitions of rank are discussed, and useful results on generic rank are proved with the help of tools borrowed from Algebraic Geometry.
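The link between HOS and symmetric tensors can be illustrated directly: for a zero-mean random vector x, the third-order cumulant tensor C_ijk = E[x_i x_j x_k] is symmetric in its indices, which is why cumulant-based blind identification reduces to symmetric tensor decomposition. A small NumPy sketch (the sample size and the skewed source distribution are arbitrary choices for the illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
# Non-Gaussian (skewed) sources, centered so the third moment is a cumulant
X = rng.exponential(size=(3, 10_000))
X -= X.mean(axis=1, keepdims=True)

# Empirical third-order cumulant tensor C[i, j, k] = mean_t x_i(t) x_j(t) x_k(t)
C = np.einsum('it,jt,kt->ijk', X, X, X) / X.shape[1]

# Symmetric under any swap of indices: a symmetric tensor to be decomposed
assert np.allclose(C, C.transpose(1, 0, 2))
assert np.allclose(C, C.transpose(0, 2, 1))
```

A decomposition of C into symmetric rank-1 terms then recovers (up to the usual BSS indeterminacies) the columns of the mixing matrix, which is where the generic-rank results of the paper matter.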
Tensors versus Matrices: Usefulness and Unexpected Properties
IEEE Workshop on Statistical Signal Processing, Cardiff, United Kingdom, 2009
Cited by 5 (0 self)
Since the nineties, tensors have been increasingly used in Signal Processing and Data Analysis. There exist striking differences between tensors and matrices, some being advantages and others raising difficulties. These differences are pointed out in this paper while briefly surveying the state of the art. The conclusion is that tensors are omnipresent in real life, implicitly or explicitly, and must be used even if we still know quite little about their properties.