Results 1–10 of 76
Tensor Decompositions and Applications
SIAM Review, 2009
Abstract

Cited by 723 (18 self)
This survey provides an overview of higher-order tensor decompositions, their applications, and available software. A tensor is a multidimensional or N-way array. Decompositions of higher-order tensors (i.e., N-way arrays with N ≥ 3) have applications in psychometrics, chemometrics, signal processing, numerical linear algebra, computer vision, numerical analysis, data mining, neuroscience, graph analysis, etc. Two particular tensor decompositions can be considered to be higher-order extensions of the matrix singular value decomposition: CANDECOMP/PARAFAC (CP) decomposes a tensor as a sum of rank-one tensors, and the Tucker decomposition is a higher-order form of principal component analysis. There are many other tensor decompositions, including INDSCAL, PARAFAC2, CANDELINC, DEDICOM, and PARATUCK2, as well as nonnegative variants of all of the above. The N-way Toolbox and Tensor Toolbox, both for MATLAB, and the Multilinear Engine are examples of software packages for working with tensors.
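The CP model described in this abstract is easy to state concretely. The following is a minimal NumPy sketch (the sizes, rank, and factor matrices are invented for illustration, not taken from the paper) of a tensor built as a sum of rank-one terms:

```python
import numpy as np

rng = np.random.default_rng(0)
I, J, K, R = 4, 5, 6, 3                 # hypothetical sizes and rank
A = rng.standard_normal((I, R))         # one factor matrix per mode
B = rng.standard_normal((J, R))
C = rng.standard_normal((K, R))

# X = sum_r  a_r outer b_r outer c_r   (a sum of R rank-one tensors)
X = np.zeros((I, J, K))
for r in range(R):
    X += np.einsum('i,j,k->ijk', A[:, r], B[:, r], C[:, r])

# the same model written as one einsum over the shared rank index
X2 = np.einsum('ir,jr,kr->ijk', A, B, C)
assert np.allclose(X, X2)
```

The loop form makes the "sum of rank-one tensors" structure explicit; the single einsum is the idiomatic equivalent.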
A multilinear singular value decomposition
SIAM J. Matrix Anal. Appl., 2000
Abstract

Cited by 472 (22 self)
Abstract. We discuss a multilinear generalization of the singular value decomposition. There is a strong analogy between several properties of the matrix and the higher-order tensor decomposition; uniqueness, the link with the matrix eigenvalue decomposition, first-order perturbation effects, etc., are analyzed. We investigate how tensor symmetries affect the decomposition and propose a multilinear generalization of the symmetric eigenvalue decomposition for pairwise symmetric tensors.
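The construction analyzed in this paper can be sketched in a few lines. The following NumPy illustration (function names, helper conventions, and tensor sizes are my own assumptions, not the paper's notation) computes an orthonormal factor per mode from the SVD of each mode-n unfolding, forms the core, and verifies exact reconstruction:

```python
import numpy as np

def unfold(T, mode):
    # mode-n unfolding: mode-n fibers become the columns of a matrix
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def mode_mult(T, M, mode):
    # multiply tensor T by matrix M along the given mode
    return np.moveaxis(np.tensordot(M, np.moveaxis(T, mode, 0), axes=1), 0, mode)

def hosvd(T):
    # factor U_n = left singular vectors of the mode-n unfolding
    U = [np.linalg.svd(unfold(T, n), full_matrices=False)[0] for n in range(T.ndim)]
    # core S = T multiplied by U_n^T in every mode
    S = T
    for n, Un in enumerate(U):
        S = mode_mult(S, Un.T, n)
    return S, U

rng = np.random.default_rng(1)
T = rng.standard_normal((3, 4, 5))
S, U = hosvd(T)

# exact reconstruction: T = S x_1 U_1 x_2 U_2 x_3 U_3
R = S
for n, Un in enumerate(U):
    R = mode_mult(R, Un, n)
assert np.allclose(R, T)
```

Truncating the columns of each U_n gives the compressed variant, at the cost of an approximation error.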
Principal component analysis of three-mode data by means of alternating least squares algorithms
Springer Complete Collection, 1980
Abstract

Cited by 133 (5 self)
A new method to estimate the parameters of Tucker's three-mode principal component model is discussed, and the convergence properties of the alternating least squares algorithm to solve the estimation problem are considered. A special case of the general Tucker model, in which the principal component analysis is performed over only two of the three modes, is briefly outlined as well. The Miller & Nicely data on the confusion of English consonants are used to illustrate the programs TUCKALS3 and TUCKALS2, which incorporate the algorithms for the two models described. Key words: three-mode principal component analysis, alternating least squares, factor analysis, multidimensional scaling, individual differences scaling, simultaneous iteration, confusion of consonants. 1. Three-Mode Models and Their Solutions. The three-mode model, here referred to as the Tucker3 model, was first formulated by Tucker [1963], and subsequently extended in articles by Tucker [1964, 1966] and Levin [1963, Note 5], especially with respect to the mathematical description and program …
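The alternating least squares idea behind algorithms of the TUCKALS3 kind can be illustrated schematically. The NumPy sketch below (all names, sizes, and the initialization are hypothetical assumptions, not the paper's actual algorithm or code) refits each mode's factor in turn as a leading singular subspace of the tensor projected onto the other modes' current factors:

```python
import numpy as np

def unfold(T, mode):
    # mode-n unfolding: mode-n fibers become the columns of a matrix
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def mode_mult(T, M, mode):
    # multiply tensor T by matrix M along the given mode
    return np.moveaxis(np.tensordot(M, np.moveaxis(T, mode, 0), axes=1), 0, mode)

def tucker_als(T, ranks, n_iter=20):
    # initialize each factor from the SVD of the corresponding unfolding
    U = [np.linalg.svd(unfold(T, n), full_matrices=False)[0][:, :r]
         for n, r in enumerate(ranks)]
    for _ in range(n_iter):
        for n in range(T.ndim):
            # project onto the other modes' factors, then refit mode n
            Y = T
            for m in range(T.ndim):
                if m != n:
                    Y = mode_mult(Y, U[m].T, m)
            U[n] = np.linalg.svd(unfold(Y, n), full_matrices=False)[0][:, :ranks[n]]
    # core tensor
    G = T
    for n in range(T.ndim):
        G = mode_mult(G, U[n].T, n)
    return G, U

# demo on a tensor of exact multilinear rank (2, 2, 2)
rng = np.random.default_rng(0)
T = rng.standard_normal((2, 2, 2))
for n, d in enumerate((4, 5, 6)):
    T = mode_mult(T, np.linalg.qr(rng.standard_normal((d, 2)))[0], n)
G, U = tucker_als(T, (2, 2, 2))
T_hat = G
for n in range(3):
    T_hat = mode_mult(T_hat, U[n], n)
```

On this exact-rank example the recovered core and factors reconstruct the tensor; on noisy data each sweep decreases the least squares fitting error.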
Unsupervised multiway data analysis: A literature survey
IEEE Transactions on Knowledge and Data Engineering, 2008
Abstract

Cited by 82 (10 self)
Multiway data analysis captures multilinear structures in higher-order datasets, where data have more than two modes. Standard two-way methods commonly applied to matrices often fail to find the underlying structures in multiway arrays. With an increasing number of application areas, multiway data analysis has become popular as an exploratory analysis tool. We provide a review of significant contributions in the literature on multiway models and algorithms, as well as their applications in diverse disciplines including chemometrics, neuroscience, computer vision, and social network analysis.
Confirmatory factor analyses of multitrait-multimethod data: Many problems and a few solutions
Applied Psychological Measurement, 1989
Abstract

Cited by 61 (1 self)
Alternative models for confirmatory factor analysis of multitrait-multimethod (MTMM) data were evaluated by varying the number of traits and methods and the sample size for 255 MTMM matrices constructed from real data (Study 1), and for 180 MTMM matrices constructed from simulated data (Study 2). The correlated uniqueness model converged to proper solutions for 99% (Study 1) and 96% (Study 2) of the MTMM matrices, whereas the general model typically used converged to proper solutions for only 24% (Study 1) and 22% (Study 2) of the MTMM matrices. The general model was usually ill-defined (100% in Study 1, 90% in Study 2) for small MTMM matrices with small Ns, but performed better when the size of …
Dimensionality reduction in higher-order signal processing and rank-(R_1, R_2, ..., R_N) reduction in multilinear algebra
2004
Generic and typical ranks of multiway arrays
Linear Algebra Appl.
Cited by 28 (5 self)
Handwritten Digit Classification using Higher Order Singular Value Decomposition
Abstract

Cited by 23 (2 self)
In this paper we present two algorithms for handwritten digit classification based on the higher order singular value decomposition (HOSVD). The first algorithm uses the HOSVD to construct the class models and achieves classification results with an error rate lower than 6%. The second algorithm uses the HOSVD to approximate the tensor simultaneously in two modes. Classification error rates for the second algorithm are close to 5%, even though the approximation reduces the original training data by more than 98% before the construction of the class models. The actual classification in the test phase for both algorithms is carried out by solving a series of least squares problems. In terms of the computational work in the test phase, the second algorithm is twice as efficient as the first.
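The least-squares test-phase step has this general shape. Below is a hedged NumPy sketch on synthetic data (the dimensions, the random orthonormal bases standing in for the HOSVD class models, and all names are illustrative assumptions, not the paper's experiments): each class is represented by an orthonormal basis, and a test vector is assigned to the class whose subspace leaves the smallest residual:

```python
import numpy as np

rng = np.random.default_rng(2)
dim, k, n_classes = 64, 5, 10   # hypothetical sizes
# one orthonormal basis per digit class (stand-ins for the HOSVD class models)
bases = [np.linalg.qr(rng.standard_normal((dim, k)))[0] for _ in range(n_classes)]

def classify(x, bases):
    # least-squares residual of x against each class subspace:
    # ||x - B (B^T x)||, minimized over classes
    resid = [np.linalg.norm(x - B @ (B.T @ x)) for B in bases]
    return int(np.argmin(resid))

# a vector lying in class 3's subspace is assigned to class 3
x = bases[3] @ rng.standard_normal(k)
assert classify(x, bases) == 3
```

Because each basis is orthonormal, the projection `B @ (B.T @ x)` solves the per-class least squares problem directly, without a factorization at test time.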
Optimization-Based Algorithms for Tensor Decompositions: Canonical Polyadic Decomposition, Decomposition in Rank-(Lr, Lr, 1) Terms, and a New Generalization
2013
Abstract

Cited by 20 (3 self)
The canonical polyadic decomposition and the decomposition in rank-(Lr, Lr, 1) block terms (CPD and BTD, respectively) are two closely related tensor decompositions. The CPD and, more recently, the BTD are important tools in psychometrics, chemometrics, neuroscience, and signal processing. We present a decomposition that generalizes these two and develop algorithms for its computation. Among these algorithms are alternating least squares schemes, several general unconstrained optimization techniques, and matrix-free nonlinear least squares methods. In the latter we exploit the structure of the Jacobian's Gramian to reduce computational and memory cost. Numerical experiments confirm that, combined with an effective preconditioner, these methods are among the most efficient and robust currently available for computing the CPD, the rank-(Lr, Lr, 1) BTD, and their generalized decomposition.
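A basic alternating least squares scheme for the CPD, one of the algorithm families mentioned above, can be sketched as follows (NumPy only; the unfolding convention, random initialization, and all names are my assumptions, not the paper's implementation). Each factor matrix is refit in turn by a linear least squares solve against the Khatri-Rao product of the other two:

```python
import numpy as np

def unfold(T, mode):
    # mode-n unfolding: mode-n fibers become the columns of a matrix
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def khatri_rao(A, B):
    # column-wise Kronecker product
    R = A.shape[1]
    return np.einsum('ir,jr->ijr', A, B).reshape(-1, R)

def cp_als(T, rank, n_iter=200, seed=0):
    rng = np.random.default_rng(seed)
    F = [rng.standard_normal((s, rank)) for s in T.shape]
    for _ in range(n_iter):
        for n in range(3):
            # model's mode-n unfolding: F[n] @ khatri_rao(other factors).T
            others = [F[m] for m in range(3) if m != n]
            K = khatri_rao(others[0], others[1])
            F[n] = np.linalg.lstsq(K, unfold(T, n).T, rcond=None)[0].T
    return F

# demo: recover an exactly rank-2 tensor
rng = np.random.default_rng(1)
A0, B0, C0 = (rng.standard_normal((s, 2)) for s in (4, 5, 6))
T = np.einsum('ir,jr,kr->ijk', A0, B0, C0)
A, B, C = cp_als(T, rank=2)
T_hat = np.einsum('ir,jr,kr->ijk', A, B, C)
rel_err = np.linalg.norm(T_hat - T) / np.linalg.norm(T)
```

ALS of this kind is simple but can converge slowly or stall; that is precisely the motivation the abstract gives for the general optimization and preconditioned nonlinear least squares alternatives.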
Separable covariance arrays via the Tucker product, with applications to multivariate relational data
2010
Abstract

Cited by 19 (9 self)
Modern datasets are often in the form of matrices or arrays, potentially having correlations along each set of data indices. For example, data involving repeated measurements of several variables over time may exhibit temporal correlation as well as correlation among the variables. A possible model for matrix-valued data is the class of matrix normal distributions, which is parametrized by two covariance matrices, one for each index set of the data. In this article we describe an extension of the matrix normal model to accommodate multidimensional data arrays, or tensors. We generate a class of array normal distributions by applying a group of multilinear transformations to an array of independent standard normal random variables. The covariance structures of the resulting class take the form of outer products of dimension-specific covariance matrices. We derive some properties of these covariance structures and the corresponding array normal distributions, discuss maximum likelihood and Bayesian estimation of covariance parameters, and illustrate the model in an analysis of multivariate longitudinal network data. Key words: Gaussian, matrix normal, multiway data, network, tensor, Tucker decomposition.
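The generative construction described here, a multilinear transformation applied to an array of iid standard normals, is straightforward to sketch in NumPy (the particular dimension-specific covariance matrices below are invented for illustration; any positive definite Sigma_n with square root A_n would do):

```python
import numpy as np

def mode_mult(T, M, mode):
    # multiply array T by matrix M along the given mode
    return np.moveaxis(np.tensordot(M, np.moveaxis(T, mode, 0), axes=1), 0, mode)

rng = np.random.default_rng(3)
dims = (3, 4, 5)
# invented dimension-specific covariances Sigma_n; A_n is a Cholesky square root
Sigmas = [np.eye(d) + 0.3 * np.ones((d, d)) for d in dims]
A = [np.linalg.cholesky(S) for S in Sigmas]

Z = rng.standard_normal(dims)   # array of iid standard normal variables
X = Z
for n, An in enumerate(A):      # X = Z x_1 A_1 x_2 A_2 x_3 A_3
    X = mode_mult(X, An, n)
```

The resulting X has the separable (Tucker-product) covariance structure the abstract describes: the mode-n covariance is proportional to A_n A_n^T = Sigma_n.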