Results 11 - 20 of 1,602

On the monoidal structure of matrix bi-factorisations

by Nils Carqueville, Ingo Runkel, 2009
"... We investigate tensor products of matrix factorisations. This is most naturally done by formulating matrix factorisations in terms of bimodules instead of modules. If the underlying ring is ℂ[x1,...,xN] we show that bimodule matrix factorisations form a monoidal category. This monoidal category has a ..."
Abstract - Cited by 14 (5 self)
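
For orientation, the construction the abstract alludes to can be sketched in standard notation (this is not necessarily the bimodule formulation the paper develops): a matrix factorisation of a potential W over R = \mathbb{C}[x_1,\dots,x_N] is a \mathbb{Z}_2-graded free R-module with an odd differential

    d = \begin{pmatrix} 0 & d_1 \\ d_0 & 0 \end{pmatrix}, \qquad d^2 = W \cdot \mathrm{id},

and for factorisations (M, d_M) of V and (N, d_N) of W, the graded tensor product with differential

    d_{M \otimes N} = d_M \otimes \mathrm{id}_N + \mathrm{id}_M \otimes d_N \quad (\text{with Koszul signs}), \qquad d_{M \otimes N}^2 = (V + W) \cdot \mathrm{id},

is a matrix factorisation of V + W. It is the monoidal properties of this operation, recast in the bimodule setting, that the paper studies.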

SCORE GUIDED AUDIO RESTORATION VIA GENERALISED COUPLED TENSOR FACTORISATION

by Y. Kenan Yılmaz, A. Taylan Cemgil
"... Generalised coupled tensor factorisation is a recently proposed algorithmic framework for simultaneously estimating tensor factorisation models where several observed tensors can share a set of latent factors. This paper proposes a model in this framework for coupled factorisation of piano spectrogr ..."
Abstract - Cited by 5 (1 self)
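
To make "coupled factorisation with shared latent factors" concrete, here is a deliberately simple sketch: two non-negative matrices sharing one factor, Euclidean cost, multiplicative updates. The paper's generalised coupled tensor factorisation framework is far more general (arbitrary coupling topologies and divergences); the function and variable names below are illustrative only.

import numpy as np

def coupled_nmf(X1, X2, rank, n_iter=300, eps=1e-9):
    # Toy coupled factorisation: X1 ~= D @ E and X2 ~= G @ E share the
    # excitation matrix E (all factors non-negative).  Multiplicative
    # updates minimise ||X1 - D E||^2 + ||X2 - G E||^2.
    D = np.random.rand(X1.shape[0], rank)
    G = np.random.rand(X2.shape[0], rank)
    E = np.random.rand(rank, X1.shape[1])
    for _ in range(n_iter):
        D *= (X1 @ E.T) / (D @ E @ E.T + eps)
        G *= (X2 @ E.T) / (G @ E @ E.T + eps)
        # E receives evidence from both observations, which is the "coupling".
        E *= (D.T @ X1 + G.T @ X2) / (D.T @ D @ E + G.T @ G @ E + eps)
    return D, G, E

In the setting of the paper, one observed array would be an audio spectrogram and the other a score-derived observation, with the shared factor tying the two together.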

USING TENSOR FACTORISATION MODELS TO SEPARATE DRUMS FROM POLYPHONIC MUSIC

by Derry Fitzgerald, Eugene Coyle, Matt Cranitch
"... This paper describes the use of Non-negative Tensor Factorisation models for the separation of drums from polyphonic audio. Im-proved separation of the drums is achieved through the incorpo-ration of Gamma Chain priors into the Non-negative Tensor Fac-torisation framework. In contrast to many previo ..."
Abstract - Cited by 5 (1 self) - Add to MetaCart
This paper describes the use of Non-negative Tensor Factorisation models for the separation of drums from polyphonic audio. Im-proved separation of the drums is achieved through the incorpo-ration of Gamma Chain priors into the Non-negative Tensor Fac-torisation framework. In contrast to many
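
For readers new to the machinery, a minimal non-negative CP/PARAFAC-style tensor factorisation with multiplicative updates might look like the sketch below. This is a generic illustration under a squared-error cost; it does not include the Gamma Chain priors the paper introduces.

import numpy as np

def khatri_rao(B, C):
    # Column-wise Kronecker product: result has shape (J*K, R).
    R = B.shape[1]
    return np.einsum('jr,kr->jkr', B, C).reshape(-1, R)

def ntf_cp(X, rank, n_iter=200, eps=1e-9):
    # Non-negative CP factorisation of a 3-way array X >= 0,
    # X[i,j,k] ~= sum_r A[i,r] * B[j,r] * C[k,r], via multiplicative updates.
    I, J, K = X.shape
    A, B, C = (np.random.rand(n, rank) for n in (I, J, K))
    X1 = X.reshape(I, J * K)                     # mode-1 unfolding
    X2 = np.moveaxis(X, 1, 0).reshape(J, I * K)  # mode-2 unfolding
    X3 = np.moveaxis(X, 2, 0).reshape(K, I * J)  # mode-3 unfolding
    for _ in range(n_iter):
        A *= (X1 @ khatri_rao(B, C)) / (A @ ((B.T @ B) * (C.T @ C)) + eps)
        B *= (X2 @ khatri_rao(A, C)) / (B @ ((A.T @ A) * (C.T @ C)) + eps)
        C *= (X3 @ khatri_rao(A, B)) / (C @ ((A.T @ A) * (B.T @ B)) + eps)
    return A, B, C

In audio applications like the one above, X would typically be a non-negative magnitude spectrogram tensor, e.g. frequency x time x channel.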

Musical Source Separation using Generalised Non-Negative Tensor Factorisation models

by Derry Fitzgerald, Matt Cranitch
"... A shift-invariant non-negative tensor factorisation algorithm for musical source separation is proposed which generalises previous work by allowing each source to have its own parameters rather a fixed set of parameters for all sources. This allows independent control of the number of allowable note ..."
Abstract - Cited by 2 (2 self) - Add to MetaCart
A shift-invariant non-negative tensor factorisation algorithm for musical source separation is proposed which generalises previous work by allowing each source to have its own parameters rather a fixed set of parameters for all sources. This allows independent control of the number of allowable

Extended Non-negative Tensor Factorisation models for Musical Sound Source Separation

by unknown authors
"... Recently, shift invariant tensor factorisation algorithms have been proposed for the purposes of sound source separation of pitched musical instruments. However, existing algorithms require the use of log-frequency spectrograms to allow shift invariance in frequency which causes problems when attemp ..."
Abstract

Unique factorisation of additive induced-hereditary properties

by Alastair Farrugia, R. Bruce Richter
"... An additive hereditary graph property is a set of graphs, closed under isomorphism and under taking subgraphs and disjoint unions. Let P1,...,Pn be additive hereditary graph properties. A graph G has property (P1 ◦ · · · ◦ Pn) if there is a partition (V1,...,Vn) of V (G) into n sets such that, f ..."
Abstract - Cited by 6 (5 self) - Add to MetaCart
, for all i, the induced subgraph G[Vi] is in Pi. A property P is reducible if there are properties Q, R such that P = Q ◦ R; otherwise it is irreducible. Mihók, Semaniˇsin and Vasky [J. Graph Theory 33 (2000), 44–53] gave a factorisation for any additive hereditary property P into a given number dc
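
A standard example of this composition (not taken from the paper): writing O for the additive hereditary property of being edgeless,

    O \circ O = \{\, G : G \text{ is bipartite} \,\}, \qquad O^{\circ k} = \{\, G : G \text{ is } k\text{-colourable} \,\},

since a partition (V_1,\dots,V_k) with every G[V_i] edgeless is exactly a proper k-colouring.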

Matrix Wiener-Hopf factorisation II

by A. D. Rawlins, 1984
"... A direct method is described for effecting the explicit Wiener-Hopf factorisation of a class of (2 x 2}—matrices. The class is determined such that the factorisation problem can be reduced to a matrix Hilbert problem which involves an upper or lower triangular matrix. Then the matrix Hilbert problem ..."
Abstract - Add to MetaCart
A direct method is described for effecting the explicit Wiener-Hopf factorisation of a class of (2 x 2}—matrices. The class is determined such that the factorisation problem can be reduced to a matrix Hilbert problem which involves an upper or lower triangular matrix. Then the matrix Hilbert

General tensor discriminant analysis and Gabor features for gait recognition

by Dacheng Tao, Xuelong Li, Xindong Wu, Stephen J. Maybank - IEEE Trans. Pattern Anal. Mach. Intell., 2007
"... Abstract — The traditional image representations are not suited to conventional classification methods, such as the linear discriminant analysis (LDA), because of the under sample problem (USP): the dimensionality of the feature space is much higher than the number of training samples. Motivated by ..."
Abstract - Cited by 105 (11 self) - Add to MetaCart
by the successes of the two dimensional LDA (2DLDA) for face recognition, we develop a general tensor discriminant analysis (GTDA) as a preprocessing step for LDA. The benefits of GTDA compared with existing preprocessing methods, e.g., principal component analysis (PCA) and 2DLDA, include 1) the USP is reduced
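
The "under sample problem" mentioned above is the familiar small-sample-size issue in LDA (a textbook observation, not specific to this paper): with n training samples of dimension d \gg n and C classes, the within-class scatter matrix

    S_w = \sum_{c=1}^{C} \sum_{x_i \in c} (x_i - \mu_c)(x_i - \mu_c)^{\top} \in \mathbb{R}^{d \times d}

has rank at most n - C, so it is singular and the Fisher criterion involving S_w^{-1} S_b cannot be evaluated without first reducing the dimensionality.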

Tensor decompositions for learning latent variable models

by Animashree Anandkumar, Rong Ge, Daniel Hsu, Sham M. Kakade, Matus Telgarsky, 2014
"... This work considers a computationally and statistically efficient parameter estimation method for a wide class of latent variable models—including Gaussian mixture models, hidden Markov models, and latent Dirichlet allocation—which exploits a certain tensor structure in their low-order observable mo ..."
Abstract - Cited by 83 (7 self) - Add to MetaCart
moments (typically, of second- and third-order). Specifically, parameter estimation is reduced to the problem of extracting a certain (orthog-onal) decomposition of a symmetric tensor derived from the moments; this decomposition can be viewed as a natural generalization of the singular value decomposition
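
As a rough illustration of the kind of orthogonal tensor decomposition referred to above, here is a plain power-iteration-plus-deflation sketch for an exactly orthogonally decomposable symmetric 3-tensor. The paper's robust tensor power method additionally handles whitening of the moment tensors, restarts and perturbation analysis, none of which is reproduced here.

import numpy as np

def tensor_apply(T, v):
    # Contract a symmetric 3rd-order tensor with v in two modes: T(I, v, v).
    return np.einsum('ijk,j,k->i', T, v, v)

def symmetric_tensor_decomposition(T, rank, n_iter=200):
    # Greedy recovery of T = sum_r lam_r * v_r (x) v_r (x) v_r, assuming the
    # v_r are orthonormal; components come out in an initialisation-dependent order.
    T = T.copy()
    lams, vecs = [], []
    for _ in range(rank):
        v = np.random.randn(T.shape[0])
        v /= np.linalg.norm(v)
        for _ in range(n_iter):                 # tensor power iterations
            v = tensor_apply(T, v)
            v /= np.linalg.norm(v)
        lam = np.einsum('ijk,i,j,k->', T, v, v, v)   # eigenvalue T(v, v, v)
        lams.append(lam)
        vecs.append(v)
        T = T - lam * np.einsum('i,j,k->ijk', v, v, v)   # deflate
    return np.array(lams), np.array(vecs)

# Tiny self-check on an exactly orthogonal synthetic example (illustrative only).
if __name__ == '__main__':
    d = 5
    V = np.linalg.qr(np.random.randn(d, d))[0][:, :3]   # 3 orthonormal columns
    w = np.array([3.0, 2.0, 1.0])
    T = sum(w[r] * np.einsum('i,j,k->ijk', V[:, r], V[:, r], V[:, r]) for r in range(3))
    lams, vecs = symmetric_tensor_decomposition(T, rank=3)
    print(np.round(np.sort(lams)[::-1], 3))   # should recover 3.0, 2.0, 1.0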

Discriminant analysis with tensor representation

by Shuicheng Yan, Dong Xu, Qiang Yang, Lei Zhang, Xiaoou Tang, Hong-jiang Zhang - Proc. IEEE Conf. Comput. Vision Pattern Recognit., 2005
"... In this paper, we present a novel approach to solving the supervised dimensionality reduction problem by encoding an image object as a general tensor of 2nd or higher order. First, we propose a Discriminant Tensor Criterion (DTC), whereby multiple interrelated lower-dimensional discriminative subspa ..."
Abstract - Cited by 53 (13 self)
"... ; and 3) the computational cost in the learning stage is reduced to a large extent owing to the reduced data dimensions in generalized eigenvalue decomposition. We provide extensive experiments by encoding face images as 2nd or 3rd order tensors to demonstrate that the proposed DATER algorithm based ..."