Results 1–10 of 10
Decoupling multivariate polynomials using first-order information and tensor decompositions
SIAM J. Matrix Anal. Appl., 2015
Cited by 3 (3 self)
Abstract. We present a method to decompose a set of multivariate real polynomials into linear combinations of univariate polynomials in linear forms of the input variables. The method proceeds by collecting the first-order information of the polynomials in a set of sampling points, which is captured by the Jacobian matrix evaluated at the sampling points. The canonical polyadic decomposition of the three-way tensor of Jacobian matrices directly returns the unknown linear relations as well as the necessary information to reconstruct the univariate polynomials. The conditions under which this decoupling procedure works are discussed, and the method is illustrated on several numerical examples.
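The identity this method rests on can be checked numerically. The sketch below is illustrative (the toy polynomials and all names are ours, not the paper's code): for f(u) = W g(V^T u) with univariate polynomials g_i, the Jacobian at a sampling point equals W diag(g_1'(v_1^T u), ..., g_r'(v_r^T u)) V^T, so the stacked Jacobians form exactly the three-way tensor whose CPD recovers V and W.

```python
import numpy as np

# Illustrative check of the Jacobian identity behind the decoupling method.
rng = np.random.default_rng(0)
m, n, r = 3, 4, 2                       # outputs, inputs, internal branches
W = rng.standard_normal((m, r))
V = rng.standard_normal((n, r))

g  = [lambda t: t**2, lambda t: t**3]   # univariate polynomials g_i
dg = [lambda t: 2*t,  lambda t: 3*t**2] # their derivatives

def f(u):
    z = V.T @ u
    return W @ np.array([g[i](z[i]) for i in range(r)])

def jac_analytic(u):                    # W diag(g_i'(v_i^T u)) V^T
    z = V.T @ u
    return W @ np.diag([dg[i](z[i]) for i in range(r)]) @ V.T

def jac_numeric(u, h=1e-6):             # central finite differences
    return np.column_stack([(f(u + h*e) - f(u - h*e)) / (2*h)
                            for e in np.eye(n)])

u0 = rng.standard_normal(n)
print(np.allclose(jac_analytic(u0), jac_numeric(u0), atol=1e-5))  # True
```

Each Jacobian slice is a sum of r rank-1 terms w_i v_i^T weighted by g_i'(v_i^T u), which is the CP structure the paper's decomposition recovers.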
Parallel Algorithms for Constrained Tensor Factorization via Alternating Direction Method of Multipliers
, 2014
Cited by 2 (0 self)
Abstract—Tensor factorization has proven useful in a wide range of applications, from sensor array processing to communications, speech and audio signal processing, and machine learning. With few recent exceptions, all tensor factorization algorithms were originally developed for centralized, in-memory computation on a single machine; and the few that break away from this mold do not easily incorporate practically important constraints, such as nonnegativity. A new constrained tensor factorization framework is proposed in this paper, building upon the Alternating Direction Method of Multipliers (ADMoM). It is shown that this simplifies computations, bypassing the need to solve constrained optimization problems in each iteration; and it naturally leads to distributed algorithms suitable for parallel implementation. This opens the door for many emerging big-data-enabled applications. The methodology is exemplified using nonnegativity as a baseline constraint, but the proposed framework can incorporate many other types of constraints. Numerical experiments are encouraging, indicating that ADMoM-based nonnegative tensor factorization (NTF) has high potential as an alternative to state-of-the-art approaches.
Index Terms—Tensor decomposition, PARAFAC model, parallel algorithms.
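The simplification the abstract describes, trading each constrained subproblem for an unconstrained solve plus a cheap projection, can be sketched on the nonnegativity-constrained least-squares problem that arises in each NTF factor update. This is a minimal single-machine sketch with illustrative names, not the paper's parallel algorithm:

```python
import numpy as np

# ADMM for  min ||Ax - b||^2  s.t. x >= 0: an unconstrained solve, a
# projection onto the nonnegative orthant, and a dual update per iteration;
# no constrained solver is needed inside the loop.
def nnls_admm(A, b, rho=1.0, iters=500):
    n = A.shape[1]
    AtA, Atb = A.T @ A, A.T @ b
    L = np.linalg.cholesky(AtA + rho * np.eye(n))      # factor once, reuse
    x = z = u = np.zeros(n)
    for _ in range(iters):
        rhs = Atb + rho * (z - u)
        x = np.linalg.solve(L.T, np.linalg.solve(L, rhs))  # x-update
        z = np.maximum(x + u, 0.0)                         # projection
        u = u + x - z                                      # dual update
    return z

rng = np.random.default_rng(1)
A = rng.standard_normal((20, 5))
x_true = np.abs(rng.standard_normal(5))     # a nonnegative ground truth
b = A @ x_true                              # consistent measurements
x_hat = nnls_admm(A, b)
print(np.max(np.abs(x_hat - x_true)))       # small recovery error
```

Note that the linear system matrix is fixed across iterations, so its Cholesky factor is computed once; this is the kind of per-iteration saving the ADMoM framework exploits at scale.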
Generic uniqueness conditions for the canonical polyadic decomposition and INDSCAL
, 2014
Block-Decoupling Multivariate Polynomials Using the Tensor Block-Term Decomposition
Cited by 1 (1 self)
Abstract. We present a tensor-based method to decompose a given set of multivariate functions into linear combinations of a set of multivariate functions of linear forms of the input variables. The method proceeds by forming a three-way array (tensor) by stacking Jacobian matrix evaluations of the function behind each other. It is shown that a block-term decomposition of this tensor provides the necessary information to block-decouple the given function into a set of functions with small input-output dimensionality.
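A minimal numerical illustration of the structure the block-term decomposition exploits (the branch function and all names are ours, not the paper's): for a block-decoupled map f(u) = sum_i W_i g_i(V_i^T u) with 2-input, 2-output branches, each Jacobian slice is a sum of rank-2 terms W_i Jg_i(V_i^T u) V_i^T rather than the rank-1 terms of the fully decoupled case.

```python
import numpy as np

rng = np.random.default_rng(4)
n, m = 6, 5                              # input / output dimensions
V = [rng.standard_normal((n, 2)) for _ in range(2)]
W = [rng.standard_normal((m, 2)) for _ in range(2)]

def g(z):                                # one 2-in/2-out branch
    return np.array([z[0] * z[1], z[0]**2 + z[1]])

def Jg(z):                               # its 2x2 Jacobian
    return np.array([[z[1], z[0]], [2 * z[0], 1.0]])

def f(u):                                # block-decoupled map
    return sum(Wi @ g(Vi.T @ u) for Wi, Vi in zip(W, V))

u0 = rng.standard_normal(n)
# structured Jacobian: a sum of rank-2 (block) terms
J_block = sum(Wi @ Jg(Vi.T @ u0) @ Vi.T for Wi, Vi in zip(W, V))

# finite-difference check that the structured form is the true Jacobian
h = 1e-6
J_fd = np.column_stack([(f(u0 + h*e) - f(u0 - h*e)) / (2*h)
                        for e in np.eye(n)])
print(np.allclose(J_block, J_fd, atol=1e-5))  # True
```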
A higher-order LQ decomposition for separable covariance models
, 2014
We develop a higher-order generalization of the LQ decomposition and show that this decomposition plays an important role in likelihood-based estimation and testing for separable, or Kronecker-structured, covariance models, such as the multilinear normal model. This role is analogous to that of the LQ decomposition in likelihood inference for the multivariate normal model. Additionally, this higher-order LQ decomposition can be used to construct an alternative version of the popular higher-order singular value decomposition for tensor-valued data. We also develop a novel generalization of the polar decomposition to tensor-valued data.
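For readers who have not met the matrix case: the ordinary LQ decomposition that this abstract generalizes can be obtained from a QR decomposition of the transpose. A minimal sketch (not from the paper):

```python
import numpy as np

# Matrix LQ via QR of the transpose: A = L Q with L lower triangular
# and Q having orthonormal rows.
rng = np.random.default_rng(3)
A = rng.standard_normal((3, 5))
Q0, R0 = np.linalg.qr(A.T)       # A^T = Q0 R0, R0 upper triangular
L, Q = R0.T, Q0.T                # hence A = (Q0 R0)^T = L Q
print(np.allclose(A, L @ Q))           # True
print(np.allclose(L, np.tril(L)))      # True: L is lower triangular
```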
Adaptive Higher-order Spectral Estimators
, 2015
Many applications involve estimation of a signal matrix from a noisy data matrix. In such cases, it has been observed that estimators that shrink or truncate the singular values of the data matrix perform well when the signal matrix has approximately low rank. In this article, we generalize this approach to the estimation of a tensor of parameters from noisy tensor data. We develop new classes of estimators that shrink or threshold the mode-specific singular values from the higher-order singular value decomposition. These classes of estimators are indexed by tuning parameters, which we adaptively choose from the data by minimizing Stein’s unbiased risk estimate. In particular, this procedure provides a way to estimate the multilinear rank of the underlying signal tensor. Using simulation studies under a variety of conditions, we show that our estimators perform well when the mean tensor has approximately low multilinear rank, and perform competitively when the signal tensor does not have approximately low multilinear rank. We illustrate the use of these methods in an application to multivariate relational data.
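The simplest member of this estimator class, hard truncation of the mode-specific singular values, can be sketched with a truncated higher-order SVD. This is an illustrative sketch only; the paper's contribution is the adaptive, SURE-tuned shrinkage, which is not implemented here.

```python
import numpy as np

def unfold(T, mode):
    """Mode-k unfolding: move mode k to the front and flatten the rest."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def mode_mult(T, M, mode):
    """Multiply tensor T by matrix M along the given mode."""
    return np.moveaxis(np.tensordot(M, np.moveaxis(T, mode, 0), axes=1),
                       0, mode)

def truncated_hosvd(T, ranks):
    """Project each mode onto its leading singular subspace."""
    U = [np.linalg.svd(unfold(T, k), full_matrices=False)[0][:, :r]
         for k, r in enumerate(ranks)]
    core = T
    for k, Uk in enumerate(U):
        core = mode_mult(core, Uk.T, k)    # compress
    est = core
    for k, Uk in enumerate(U):
        est = mode_mult(est, Uk, k)        # expand back
    return est

# denoising demo: a signal tensor of multilinear rank (2, 2, 2) plus noise
rng = np.random.default_rng(2)
G = rng.standard_normal((2, 2, 2))
A_ = [rng.standard_normal((8, 2)) for _ in range(3)]
signal = np.einsum('abc,ia,jb,kc->ijk', G, *A_)
noisy = signal + 0.1 * rng.standard_normal(signal.shape)
est = truncated_hosvd(noisy, (2, 2, 2))
print(np.linalg.norm(est - signal) < np.linalg.norm(noisy - signal))  # True
```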
Breaking the Curse of Dimensionality using Decompositions of Incomplete Tensors
"... scientific computing in big data analysis ..."
Tensors and latent variable models
Abstract. In this paper we discuss existing and new connections between latent variable models from machine learning and tensors (multi-way arrays) from multilinear algebra. A few ideas have been developed independently in the two communities. However, there are still many useful but unexplored links and ideas that could be borrowed from one of the communities and used in the other. We will start our discussion from simple concepts such as independent variables and rank-1 matrices and gradually increase the difficulty. The final goal is to connect discrete latent tree graphical models to state-of-the-art tensor decompositions in order to find tractable representations of probability tables of many variables.
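The first of the simple links mentioned above fits in a few lines of code: the joint probability table of two independent discrete variables is a rank-1 matrix, namely the outer product of the marginals (the numbers below are arbitrary examples).

```python
import numpy as np

p = np.array([0.2, 0.5, 0.3])        # P(X)
q = np.array([0.6, 0.4])             # P(Y)
joint = np.outer(p, q)               # independence: P(X, Y) = P(X) P(Y)
print(np.linalg.matrix_rank(joint))  # 1
```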
Processing Hyperspectral Images using Non-Linear Least Square Algorithm as an Optimization Method for Tensor Decomposition Model
Due to the large size of hyperspectral images and the abundance of unwanted or missing information they contain, the development of effective data compression and denoising methods is of prime importance. Compression removes unmeaningful information, reducing the data volume and ultimately yielding a noise-free image. This study applies two lossless decomposition methods, low multilinear rank approximation and four types of block term decomposition, to the input image cube to remove noise, using a non-linear least squares method for optimization, and assesses their performance. BTD-(Lr, Lr, 1) was selected as the best tensor algorithm based on residual error and Frobenius-norm value, with the limitation that the image cube to be processed by the method should have good spatial resolution.