Results 1–10 of 221
Scalable tensor decompositions for multi-aspect data mining
 In ICDM 2008: Proceedings of the 8th IEEE International Conference on Data Mining, 2008
"... Modern applications such as Internet traffic, telecommunication records, and largescale social networks generate massive amounts of data with multiple aspects and high dimensionalities. Tensors (i.e., multiway arrays) provide a natural representation for such data. Consequently, tensor decompositi ..."
Abstract

Cited by 64 (2 self)
the Tucker decompositions for sparse tensors because standard algorithms do not account for the sparsity of the data. As a result, a surprising phenomenon is observed by practitioners: Despite the fact that there is enough memory to store both the input tensors and the factorized output tensors, memory
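The memory blow-up this entry describes comes from materializing dense intermediates during the tensor-times-matrix products inside Tucker. A minimal sketch of computing a mode-n product directly from the nonzeros, so the full dense tensor is never formed (illustrative code and a made-up `sparse_ttm` helper, not the paper's algorithm):

```python
import numpy as np

def sparse_ttm(coords, vals, shape, U, mode):
    """Mode-`mode` product of a sparse COO tensor with factor matrix U
    (shape I_mode x R), accumulated nonzero-by-nonzero so the dense
    input tensor is never allocated."""
    out_shape = list(shape)
    out_shape[mode] = U.shape[1]
    out = np.zeros(out_shape)
    for idx, v in zip(coords, vals):
        dest = list(idx)
        for r in range(U.shape[1]):
            dest[mode] = r
            out[tuple(dest)] += v * U[idx[mode], r]
    return out
```

The loop cost scales with the number of nonzeros rather than with the full tensor size, which is the distinction the snippet is pointing at; the intermediate results of chained mode products are where standard dense algorithms run out of memory.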
Combining effects: sum and tensor
"... We seek a unified account of modularity for computational effects. We begin by reformulating Moggi’s monadic paradigm for modelling computational effects using the notion of enriched Lawvere theory, together with its relationship with strong monads; this emphasises the importance of the operations ..."
Abstract

Cited by 43 (4 self)
that produce the effects. Effects qua theories are then combined by appropriate bifunctors on the category of theories. We give a theory for the sum of computational effects, which in particular yields Moggi’s exceptions monad transformer and an interactive input/output monad transformer. We further give a
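As a concrete, much-simplified illustration of the exceptions effect mentioned above, here is Moggi's exceptions monad sketched in Python; `Ok`/`Err`, `unit`, and `bind` are illustrative names, not the paper's notation:

```python
from dataclasses import dataclass
from typing import Generic, TypeVar, Union

A = TypeVar("A")

@dataclass
class Ok(Generic[A]):
    value: A

@dataclass
class Err:
    msg: str

Result = Union[Ok, Err]

def unit(x):
    """Inject a pure value into the exceptions monad."""
    return Ok(x)

def bind(m, f):
    """Sequence computations; a raised exception short-circuits the rest."""
    return f(m.value) if isinstance(m, Ok) else m

def safe_div(x, y):
    return Err("division by zero") if y == 0 else Ok(x / y)
```

For example, `bind(unit(10), lambda x: safe_div(x, 2))` yields `Ok(5.0)`, while any `Err` propagates unchanged through later `bind`s — the behaviour the exceptions monad transformer layers onto an arbitrary base monad.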
Hierarchical tensor approximation of output quantities of parameter-dependent PDEs, 2014
"... certainty quantification or optimisation. In many cases, one is interested in scalar output quantities induced by the parameterdependent solution. The output can be interpreted as a tensor living on a highdimensional parameter space. Our aim is to adaptively construct an approximation of this tens ..."
Abstract
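The idea that a scalar output over a parameter space is a compressible tensor can be seen in the simplest two-parameter case, where hierarchical low-rank formats reduce to a truncated SVD (a sketch under that simplification; `output` is a made-up model, not from the paper):

```python
import numpy as np

def output(p1, p2):
    # Hypothetical scalar output of a parameter-dependent model.
    return np.exp(-p1) * np.sin(p2)

# Sample the output on a 2-D parameter grid: a 50 x 60 "tensor".
p1 = np.linspace(0.0, 1.0, 50)
p2 = np.linspace(0.0, np.pi, 60)
Q = output(p1[:, None], p2[None, :])

# With two parameters, low-rank tensor approximation is a truncated SVD.
U, s, Vt = np.linalg.svd(Q, full_matrices=False)
r = 1                                   # this Q is rank-1 by construction
Q_r = (U[:, :r] * s[:r]) @ Vt[:r, :]
err = np.linalg.norm(Q - Q_r) / np.linalg.norm(Q)
```

Because this toy output separates multiplicatively in its parameters, a rank-1 approximation is already exact; for many real parameter-dependent outputs the singular values decay rapidly, which is what adaptive hierarchical approximation exploits in higher dimensions.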
Bilateral Filtering of Diffusion Tensor Magnetic Resonance Images
 IEEE Transactions on Image Processing, 2007
"... Abstract—We extend the wellknown scalar image bilateral filtering technique to diffusion tensor magnetic resonance images (DTMRI). The scalar version of bilateral image filtering is extended to perform edgepreserving smoothing of DT field data. The bilateral DT filtering is performed in the LogEu ..."
Abstract

Cited by 10 (2 self)
Euclidean framework which guarantees valid output tensors. Smoothing is achieved by weighted averaging of neighboring tensors. Analogous to bilateral filtering of scalar images, the weights are chosen to be inversely proportional to two distance measures: The geometrical Euclidean distance between the spatial
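A minimal sketch of the weighting scheme described above, for a 1-D line of 2x2 SPD tensors: matrix log/exp move the tensors into and out of the Log-Euclidean domain, and each weight is the product of a spatial Gaussian and a range Gaussian on log-domain distance (illustrative code; the function names and `sigma_s`, `sigma_r` are assumptions, not the paper's implementation):

```python
import numpy as np

def sym_log(T):
    w, V = np.linalg.eigh(T)
    return V @ np.diag(np.log(w)) @ V.T

def sym_exp(L):
    w, V = np.linalg.eigh(L)
    return V @ np.diag(np.exp(w)) @ V.T

def bilateral_dt_1d(tensors, sigma_s=1.0, sigma_r=1.0):
    logs = [sym_log(T) for T in tensors]
    out = []
    for i in range(len(tensors)):
        wsum, acc = 0.0, np.zeros_like(logs[0])
        for j in range(len(tensors)):
            d_spatial = (i - j) ** 2                    # geometric distance
            d_range = np.sum((logs[i] - logs[j]) ** 2)  # log-domain distance
            w = np.exp(-d_spatial / (2 * sigma_s ** 2)) \
                * np.exp(-d_range / (2 * sigma_r ** 2))
            wsum += w
            acc = acc + w * logs[j]
        # Mapping back with exp guarantees a valid (SPD) output tensor.
        out.append(sym_exp(acc / wsum))
    return out
```

Averaging in the log domain and exponentiating back is what gives the "guarantees valid output tensors" property quoted in the snippet: any weighted average of symmetric matrices is symmetric, and its matrix exponential is positive definite.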
Linear algebra for tensor problems
 Computing
"... Abstract. By a tensor problem in general, we mean one where all the data on input and output are given (exactly or approximately) in tensor formats, the number of data representation parameters being much smaller than the total amount of data. For such problems, it is natural to seek for algorithms ..."
Abstract

Cited by 9 (3 self)
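The "number of representation parameters much smaller than the total amount of data" point can be made concrete with the rank-R CP format, one of the simplest tensor formats (an illustrative sketch, not tied to this paper's algorithms):

```python
import numpy as np

rng = np.random.default_rng(1)
n, d, R = 4, 3, 2
# CP format: d factor matrices of shape (n, R) = 24 numbers here,
# versus n**d = 64 entries for the dense tensor they represent; the gap
# widens exponentially as n and d grow.
factors = [rng.random((n, R)) for _ in range(d)]

def cp_entry(factors, idx):
    """Evaluate a single entry of the CP tensor without ever forming it."""
    R = factors[0].shape[1]
    return sum(
        np.prod([f[i, r] for f, i in zip(factors, idx)]) for r in range(R)
    )
```

Algorithms for "tensor problems" in the snippet's sense work directly on such parameter sets (the factor matrices), reading and writing individual entries on demand rather than touching the full dense array.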
Robustness of Tensor Product Networks Using Distributed Representations
"... A rank 3 tensor product network has 3 dimensions of size p, q, and r, say. Its processing units include pqr binding units, and input/output units grouped into 3 vectors of length p, q and r. Typically one of the I/O ..."
Abstract
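The binding/recall arithmetic of such a rank-3 network can be sketched with plain outer products and contractions (illustrative code; exact recall below relies on the cue vectors having unit norm):

```python
import numpy as np

rng = np.random.default_rng(0)
p, q, r = 4, 5, 6
a, b, c = (rng.standard_normal(n) for n in (p, q, r))
a, b, c = a / np.linalg.norm(a), b / np.linalg.norm(b), c / np.linalg.norm(c)

# Bind the three I/O vectors into p*q*r binding units (an outer product).
T = np.einsum('i,j,k->ijk', a, b, c)

# Recall the third vector by contracting with the other two as cues;
# recall is exact here because (a.a) = (b.b) = 1.
c_recalled = np.einsum('ijk,i,j->k', T, a, b)
```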
C. Calculating tensor components, 1999
"... Contents 1 Tensor notation C2 2 Calculating tensor components C6 3 Displaying tensor components C9 4 Modifying tensor components C12 5 Simplification strategies C15 6 Accessing tensor components C16 7 Standard object library C17 Queen's University at Kingston, Ontario C. Calculating tensor c ..."
Abstract
of tensors and expressing the output in a convenient form. The main method by which calculation is accomplished in GRTensorII is through the grcalc() command. Simple calculation is often not the last step, however, in producing interpretable output. Often large terms result in individual tensor components
Investigations of Tensor Voting Modeling
"... Tensor voting (TV) is a method for inferring geometric structures from sparse, irregular and possibly noisy input. It was initially proposed by Guy and Medioni [Guy96] and has been applied to several computer vision applications. TV generates a dense output field in a domain by dispersing informatio ..."
Abstract

Cited by 1 (0 self)
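A drastically simplified sketch of the "dense output field from sparse input" idea, using isotropic Gaussian "ball votes" only (the actual TV framework of Guy and Medioni uses oriented stick/plate/ball tensor votes; everything below is an assumption for illustration):

```python
import numpy as np

def ball_vote(points, grid_shape, sigma=2.0):
    """Each sparse input token disperses a Gaussian-decaying vote into
    every cell of the domain, producing a dense saliency field."""
    field = np.zeros(grid_shape)
    ys, xs = np.indices(grid_shape)
    for (py, px) in points:
        d2 = (ys - py) ** 2 + (xs - px) ** 2
        field += np.exp(-d2 / (2 * sigma ** 2))
    return field
```

In full tensor voting each vote is a second-order tensor whose orientation encodes the likely local structure, and eigen-analysis of the accumulated field then separates curves, junctions, and outliers; this scalar version only shows the dispersal/accumulation step.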
Robustness of Tensor Product Networks Using Distributed Representations
"... This paper describes experiments on on the robustness of tensor product networks using distributed representations, for recall tasks. The results of the experiments indicate, among other things, that the degree of robustness increases with the number of binding units and decreases with the fraction ..."
Abstract
experiments on the robustness of tensor product networks of ranks between 2 and 7. In the experiments, varying numbers of randomly selected nodes in the network were "killed" by changing them so that they always produced zero output ... (Figure 1: Network connectivity for a 3x3x3 tensor)
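The node-killing experiment can be sketched as follows (illustrative parameters and a rank-3 network only; not the paper's setup):

```python
import numpy as np

rng = np.random.default_rng(0)
p = q = r = 8
a, b, c = (rng.standard_normal(n) for n in (p, q, r))
a, b, c = a / np.linalg.norm(a), b / np.linalg.norm(b), c / np.linalg.norm(c)
T = np.einsum('i,j,k->ijk', a, b, c)    # p*q*r binding units

def recall_error(frac_killed):
    # "Kill" a random fraction of binding units by forcing their output to zero.
    alive = rng.random(T.shape) >= frac_killed
    c_hat = np.einsum('ijk,i,j->k', T * alive, a, b)
    return np.linalg.norm(c_hat - c)

# Recall degrades as more binding units are killed.
errs = [recall_error(f) for f in (0.0, 0.25, 0.5)]
```

With distributed (dense random) representations, each killed unit removes only a small additive contribution to the recalled vector, which is why robustness grows with the number of binding units, as the abstract reports.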