Results 1 - 6 of 6
Scalable tensor decompositions for multi-aspect data mining
- In ICDM 2008: Proceedings of the 8th IEEE International Conference on Data Mining
, 2008
"... Modern applications such as Internet traffic, telecommunication records, and large-scale social networks generate massive amounts of data with multiple aspects and high dimensionalities. Tensors (i.e., multi-way arrays) provide a natural representation for such data. Consequently, tensor decompositi ..."
Cited by 64 (2 self)
overflows occur during the tensor factorization process. To address this intermediate blowup problem, we propose Memory-Efficient Tucker (MET). Based on the available memory, MET adaptively selects the right execution strategy during the decomposition. We provide quantitative and qualitative evaluation
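
The intermediate blowup mentioned in this entry arises because the Tucker core is formed by a chain of tensor-times-matrix products, and the partial results produced along the way can dominate memory. Below is a minimal numpy sketch of that chain; the function name, dimensions, and ranks are illustrative assumptions, not the MET implementation from the paper.

```python
import numpy as np

def mode_n_product(X, U, n):
    """Tensor-times-matrix product along mode n: replaces mode-n size X.shape[n]
    with U.shape[0] (U must have shape U.shape[0] x X.shape[n])."""
    Xn = np.moveaxis(X, n, 0).reshape(X.shape[n], -1)            # mode-n unfolding
    Y = (U @ Xn).reshape((U.shape[0],) + tuple(np.delete(X.shape, n)))
    return np.moveaxis(Y, 0, n)

# Forming the Tucker core G = X x1 A1^T x2 A2^T x3 A3^T one mode at a time:
# each step shrinks one mode, but when X is large (and, in practice, sparse)
# the dense partial results in between can exceed available memory -- the
# intermediate blowup that a memory-aware execution strategy tries to avoid.
X = np.random.rand(50, 60, 70)
factors = [np.linalg.qr(np.random.rand(s, 5))[0] for s in X.shape]   # I_n x 5 each
G = X
for n, A in enumerate(factors):
    G = mode_n_product(G, A.T, n)
    print("after mode", n, "intermediate shape:", G.shape)
# G is now the 5 x 5 x 5 core tensor
```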
EDUCATIONAL AND PSYCHOLOGICAL MEASUREMENT
"... series 360/370 computer system (or compatible systems such as Amdol or Honeywell) that provides at least 1536-K of core storage. This large amount of core storage is usually provided via a VS-1 or VS-2 virtual memory system. This program provides a factor analy-tic solution for large 3-dimensional d ..."
of Tucker (1966). This method provides the most efficient analysis when the i mode, usually individuals, is quite large, though this size requirement is certainly not a necessary condition. The mathematical notation used in this paper is that of Tucker (1966). The Theoretical Model: In this paper, i, j, and k
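
For readers unfamiliar with the notation this entry refers to, the following is a minimal sketch of the three-mode model of Tucker (1966) written with numpy.einsum; the array names and sizes are illustrative assumptions, not values from the program described in the paper.

```python
import numpy as np

I, J, K = 200, 12, 6   # i: individuals (typically the large mode), j: variables, k: occasions
P, Q, R = 4, 3, 2      # number of factors retained per mode

A = np.random.rand(I, P)      # individual factor loadings
B = np.random.rand(J, Q)      # variable factor loadings
C = np.random.rand(K, R)      # occasion factor loadings
G = np.random.rand(P, Q, R)   # core array linking the three sets of factors

# Tucker (1966) model: x_{ijk} = sum over p, q, r of a_{ip} * b_{jq} * c_{kr} * g_{pqr}
X = np.einsum('ip,jq,kr,pqr->ijk', A, B, C, G)
print(X.shape)   # (200, 12, 6)
```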
Variational Image Registration Using Inhomogeneous
- J Math Imaging Vis, DOI 10.1007/s10851-014-0497-0
, 2012
"... Abstract We present a generalization of the convolution-based variational image registration approach, in which dif-ferent regularizers can be implemented by conveniently exchanging the convolution kernel, even if it is nonsepa-rable or nonstationary. Nonseparable kernels pose a chal-lenge because t ..."
point in the image. We propose to pre-compute the local kernels and efficiently store them in memory using the Tucker tensor decomposition model. In our experiments we use the nonseparable exponential kernel and a nonstationary landmark kernel. The exponential kernel replicates desirable properties
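
As a rough illustration of storing a bank of precomputed local kernels in Tucker form, here is a minimal truncated higher-order SVD in numpy (one standard way to obtain a Tucker decomposition); the kernel sizes, ranks, and function names are assumptions for the sketch, not values from the paper.

```python
import numpy as np

def unfold(T, n):
    """Mode-n unfolding: mode n becomes the rows."""
    return np.moveaxis(T, n, 0).reshape(T.shape[n], -1)

def truncated_hosvd(T, ranks):
    """Factor matrices from the leading left singular vectors of each unfolding,
    plus the core obtained by projecting T onto them."""
    U = [np.linalg.svd(unfold(T, n), full_matrices=False)[0][:, :r]
         for n, r in enumerate(ranks)]
    G = T
    for n, Un in enumerate(U):
        G = np.moveaxis(np.tensordot(Un.T, np.moveaxis(G, n, 0), axes=1), 0, n)
    return G, U

# Bank of local kernels stored as a 3-way array: height x width x position index.
kernels = np.random.rand(21, 21, 2000)
G, U = truncated_hosvd(kernels, ranks=(8, 8, 32))

stored = G.size + sum(u.size for u in U)
print(f"{stored} stored values instead of {kernels.size}")
```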
Computing Sparse Representations of Multidimensional Signals Using Kronecker Bases
"... Recently, there is a great interest in sparse representations of signals under the assumption that signals (datasets) can be well approximated by a linear combination of few elements of a known basis (dictionary). Many algorithms have been developed to find such kind of representations for the case ..."
We also introduce the concept of multiway block-sparse representation of N-way arrays and develop a new greedy algorithm that exploits not only the Kronecker structure but also block-sparsity. This allows us to derive a very fast and memory-efficient algorithm called N-BOMP (N-way Block OMP).
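
The memory savings behind Kronecker-structured dictionaries (and behind avoiding the full dictionary in greedy schemes such as the N-BOMP mentioned above) rest on an identity that lets the small per-mode bases act on the signal directly. A minimal numpy check of that identity follows; the names and sizes are illustrative assumptions.

```python
import numpy as np

I, J, P, Q = 64, 48, 32, 24
A = np.random.rand(I, P)     # small basis for mode 1
B = np.random.rand(J, Q)     # small basis for mode 2
S = np.random.rand(P, Q)     # coefficient array (block-sparse in the paper's setting)

# Explicit Kronecker dictionary: an (I*J) x (P*Q) matrix -- this is what
# Kronecker-structured methods avoid materializing for large signals.
D = np.kron(A, B)
x_full = D @ S.reshape(-1)

# Same signal computed with only the small per-mode factors.
x_factored = (A @ S @ B.T).reshape(-1)

print(np.allclose(x_full, x_factored))   # True
```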
Volume I: Computer Science and Software Engineering
, 2013
"... Algebraic algorithms deal with numbers, vectors, matrices, polynomials, for-mal power series, exponential and differential polynomials, rational functions, algebraic sets, curves and surfaces. In this vast area, manipulation with matri-ces and polynomials is fundamental for modern computations in Sc ..."