Results 1 – 10 of 1,269
Blind Beamforming for Non-Gaussian Signals
 IEE Proceedings-F
, 1993
"... This paper considers an application of blind identification to beamforming. The key point is to use estimates of directional vectors rather than resorting to their hypothesized value. By using estimates of the directional vectors obtained via blind identification, i.e., without knowing the array mani ..."
Cited by 719 (31 self)
estimation of directional vectors, based on joint diagonalization of 4th-order cumulant matrices
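The snippet above refers to joint diagonalization of 4th-order cumulant matrices. As a rough illustration of the object being diagonalized, here is a minimal sketch (my own, not the paper's code) of estimating one 4th-order cumulant matrix slice from real-valued data; the function name and array layout are assumptions.

```python
import numpy as np

def cumulant_matrix(X, k, l):
    """Sample 4th-order cumulant matrix slice Q[i, j] = cum(x_i, x_j, x_k, x_l)
    for real data X of shape (n_signals, n_samples)."""
    X = X - X.mean(axis=1, keepdims=True)    # enforce zero mean
    n, T = X.shape
    R = X @ X.T / T                          # sample covariance E[x x^T]
    # E[x_i x_j x_k x_l] estimated by the sample mean over t
    M = (X * X[k] * X[l]) @ X.T / T
    # cum(i,j,k,l) = E[ijkl] - E[ij]E[kl] - E[ik]E[jl] - E[il]E[jk]
    return M - R * R[k, l] - np.outer(R[:, k], R[:, l]) - np.outer(R[:, l], R[:, k])
```

For Gaussian data all 4th-order cumulants vanish, which is why these matrices carry exactly the non-Gaussian information that blind identification exploits.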
Ensemble Learning For Independent Component Analysis
, 1999
"... In this paper, a recently developed Bayesian method called ensemble learning is applied to independent component analysis (ICA). Ensemble learning is a computationally efficient approximation for exact Bayesian analysis. In general, the posterior probability density function (pdf) is a complex high ..."
Cited by 50 (4 self)
, the posterior pdf is approximated by a diagonal Gaussian pdf. According to the ICA model used in this paper, the measurements are generated by a linear mapping from mutually independent source signals whose distributions are mixtures of Gaussians. The measurements are also assumed to have additive Gaussian
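The generative model described in this snippet — independent sources with mixture-of-Gaussians densities, a linear mapping, additive Gaussian noise — can be sketched in a few lines; all names, dimensions, and parameter values below are illustrative assumptions, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_mog_source(T, means, stds, weights, rng):
    """Draw T i.i.d. samples from a 1-D mixture of Gaussians."""
    comp = rng.choice(len(weights), size=T, p=weights)
    return rng.normal(np.asarray(means)[comp], np.asarray(stds)[comp])

T = 5000
# two independent sources, each distributed as a 2-component mixture of Gaussians
s1 = sample_mog_source(T, [-2.0, 2.0], [0.5, 0.5], [0.5, 0.5], rng)   # bimodal
s2 = sample_mog_source(T, [0.0, 0.0], [0.3, 2.0], [0.7, 0.3], rng)    # heavy-tailed
S = np.vstack([s1, s2])

A = np.array([[1.0, 0.6],
              [0.4, 1.0]])                            # linear mixing matrix
noise_std = 0.1
X = A @ S + noise_std * rng.normal(size=(2, T))       # additive Gaussian noise
```

Inference in the paper then approximates the posterior over A and S by a diagonal Gaussian; the sketch only shows the forward model.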
Distance Distribution of the Diagonal Gaussian Graphs
"... The Gaussian integers Z[i] are the subset of the complex numbers C with integer real and imaginary parts, that is: Z[i] := {x + yi | x, y ∈ Z}. Given any 0 ≠ α ∈ Z[i] we consider Z[i]α which is the ring of the classes of Z[i] modulo the ideal (α) generated by α. Definition 1 Let 0 ≠ α ∈ Z[i], the ..."
[i], then the Diagonal Gaussian graph generated by α, G^8_α = (V, E), is defined as:
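The definition itself is cut off in the excerpt, but the underlying arithmetic — residue classes of Z[i] modulo the ideal (α) — can be sketched. The helper below and its rounding-division choice of representative are my own illustration, not the paper's definition.

```python
def gaussian_mod(z, alpha):
    """Reduce the Gaussian integer z modulo alpha (both given as Python
    complex numbers with integer parts), returning a representative of
    its residue class in Z[i]/(alpha)."""
    n = alpha.real**2 + alpha.imag**2            # norm N(alpha) = a^2 + b^2
    t = z * alpha.conjugate()
    # round each coordinate of z/alpha to the nearest integer quotient q
    q = complex(round(t.real / n), round(t.imag / n))
    return z - q * alpha                         # remainder, with |r| <= |alpha|
```

The quotient ring Z[i]/(α) has exactly N(α) elements, which is the vertex set V of the graphs the paper studies.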
Coil sensitivity encoding for fast MRI. In:
 Proceedings of the ISMRM 6th Annual Meeting,
, 1998
"... New theoretical and practical concepts are presented for considerably enhancing the performance of magnetic resonance imaging (MRI) by means of arrays of multiple receiver coils. Sensitivity encoding (SENSE) is based on the fact that receiver sensitivity generally has an encoding effect complementa ..."
Cited by 193 (3 self)
Assembling sample and image values in vectors, image reconstruction may be rewritten in matrix notation. With such a linear mapping, the propagation of noise from sample values into image values is conveniently described by noise matrices. Each diagonal entry of the image noise matrix X
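The matrix view of noise propagation described in this excerpt can be illustrated as follows, under the standard least-squares reading (my notation, not necessarily the paper's symbols): samples m = E v + n with noise covariance Ψ are reconstructed by weighted least squares, and the propagated image noise covariance is X = (Eᴴ Ψ⁻¹ E)⁻¹.

```python
import numpy as np

rng = np.random.default_rng(1)

n_samples, n_pixels = 12, 4
# complex encoding matrix (coil sensitivities x Fourier encoding); random stand-in
E = rng.normal(size=(n_samples, n_pixels)) \
    + 1j * rng.normal(size=(n_samples, n_pixels))
Psi = np.eye(n_samples) * 0.5                # sample-noise covariance matrix

# weighted least-squares reconstruction matrix F, so that v_hat = F @ m
Pinv = np.linalg.inv(Psi)
F = np.linalg.inv(E.conj().T @ Pinv @ E) @ E.conj().T @ Pinv

# propagated image noise covariance: X = F Psi F^H = (E^H Psi^-1 E)^-1
X = F @ Psi @ F.conj().T
X_direct = np.linalg.inv(E.conj().T @ Pinv @ E)
```

The diagonal entries of X are the per-pixel noise variances, which is the quantity the truncated sentence above is about to describe.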
Behavioral theories and the neurophysiology of reward,
 Annu. Rev. Psychol.
, 2006
"... The functions of rewards are based primarily on their effects on behavior and are less directly governed by the physics and chemistry of input events as in sensory systems. Therefore, the investigation of neural mechanisms underlying reward functions requires behavioral theories that can ..."
Cited by 187 (0 self)
of different food and liquid rewards. Reward neurons should distinguish rewards from punishers. Different neurons in orbitofrontal cortex respond to rewarding and aversive liquids. The omission of reward following a CS moves the contingency toward the diagonal line. Prediction error: just as with behavioral
Walk-Sums and Belief Propagation in Gaussian Graphical Models
 Journal of Machine Learning Research
, 2006
"... We present a new framework based on walks in a graph for analysis and inference in Gaussian graphical models. The key idea is to decompose the correlation between each pair of variables as a sum over all walks between those variables in the graph. The weight of each walk is given by a product of edg ..."
Cited by 101 (16 self)
of edgewise partial correlation coefficients. This representation holds for a large class of Gaussian graphical models which we call walk-summable. We give a precise characterization of this class of models, and relate it to other classes including diagonally dominant, attractive, non-frustrated
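The walk-sum idea in this excerpt — covariance as the series Σ_k R^k over walks weighted by edgewise partial correlations, convergent when the spectral radius of the entrywise |R| is below 1 — can be checked numerically. The 3-node chain and its edge weights below are my example, not the paper's.

```python
import numpy as np

# partial-correlation ("edge weight") matrix R for a 3-node chain graph
R = np.array([[0.0, 0.4, 0.0],
              [0.4, 0.0, 0.4],
              [0.0, 0.4, 0.0]])

# walk-summable iff the spectral radius of |R| (entrywise abs) is below 1
walk_summable = np.max(np.abs(np.linalg.eigvals(np.abs(R)))) < 1

# covariance as the sum over all walks: sum_k R^k = (I - R)^{-1}
Sigma = np.linalg.inv(np.eye(3) - R)
partial = sum(np.linalg.matrix_power(R, k) for k in range(50))  # truncated series
```

Walk-summability is what licenses rearranging the (absolutely convergent) series into the per-walk decomposition the paper analyzes.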
Blind Separation of Instantaneous Mixtures of Nonstationary Sources
 IEEE Trans. Signal Processing
, 2000
"... Most ICA algorithms are based on a model of stationary sources. This paper considers exploiting the (possible) nonstationarity of the sources to achieve separation. We introduce two objective functions based on the likelihood and on mutual information in a simple Gaussian nonstationary model and w ..."
Cited by 167 (12 self)
and we show how they can be optimized, offline or online, by simple yet remarkably efficient algorithms (one is based on a novel joint diagonalization procedure, the other on a Newton-like technique). The paper also includes (limited) numerical experiments and a discussion contrasting non-Gaussian
GAUSSIAN ELIMINATION IS STABLE FOR THE INVERSE OF A DIAGONALLY DOMINANT MATRIX
"... Abstract. Let B ∈ Mn(C) be a row diagonally dominant matrix, i.e., σi|bii| = ∑_{j≠i} |bij|, i = 1, …, n, where 0 ≤ σi < 1, with σ = max_{1≤i≤n} σi. We show that no pivoting is necessary when Gaussian elimination is applied to A = B⁻¹. Moreover, the growth factor for A does not excee ..."
Cited by 1 (0 self)
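The claim can be checked numerically: build a row diagonally dominant matrix B, invert it, and run Gaussian elimination without pivoting on A = B⁻¹ while tracking the growth factor. The helper below is an illustrative sketch of that experiment, not the paper's code, and the loose bound in the check is mine.

```python
import numpy as np

def lu_no_pivot(A):
    """Doolittle LU factorization without pivoting; returns L, U and the
    largest intermediate-entry magnitude seen during elimination."""
    A = A.astype(float).copy()
    n = A.shape[0]
    L = np.eye(n)
    max_entry = np.abs(A).max()
    for k in range(n - 1):
        L[k+1:, k] = A[k+1:, k] / A[k, k]            # multipliers
        A[k+1:, k:] -= np.outer(L[k+1:, k], A[k, k:])  # eliminate column k
        A[k+1:, k] = 0.0
        max_entry = max(max_entry, np.abs(A).max())
    return L, np.triu(A), max_entry

rng = np.random.default_rng(2)
n = 6
B = rng.normal(size=(n, n))
# make B row diagonally dominant: |b_ii| > sum_{j != i} |b_ij|
np.fill_diagonal(B, np.abs(B).sum(axis=1) + 1.0)

A = np.linalg.inv(B)
L, U, max_entry = lu_no_pivot(A)
growth = max_entry / np.abs(A).max()   # growth factor of the elimination
```

A small growth factor here is consistent with the paper's stability claim; a single random trial is of course only a sanity check, not a proof.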
OFF-DIAGONAL BOUNDS OF NON-GAUSSIAN TYPE FOR THE DIRICHLET HEAT KERNEL
"... The paper considers the heat kernel KX(t, x, y) of the operator Δ on a proper Euclidean domain X, with Dirichlet boundary conditions. A general pointwise lower bound for KX, which is valid for t larger than a suitable t(x, y), is proved (the short-time behaviour being well understood). The resulti ..."
). The resulting non-Gaussian bounds describe simultaneously both the case of bounded domains and the case, modelled on the half-space example, of domains which satisfy a twisted infinite internal cone condition. Bounds for the Green’s function are given as well.
The Bucket Box Intersection (BBI) Algorithm For Fast Approximative Evaluation Of Diagonal Mixture Gaussians
 In Proc. ICASSP
, 1996
"... Today, most of the state-of-the-art speech recognizers are based on Hidden Markov modeling. Using semi-continuous or continuous density Hidden Markov Models, the computation of emission probabilities requires the evaluation of mixture Gaussian probability density functions. Since it is very expensiv ..."
Cited by 25 (3 self)
expensive to evaluate all the Gaussians of the mixture density codebook, many recognizers only compute the M most significant Gaussians (M = 1, …, 8). This paper presents an alternative approach to approximate mixture Gaussians with diagonal covariance matrices, based on a binary feature space
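The quantity that BBI accelerates — the mixture likelihood restricted to the M most significant diagonal-covariance Gaussians — can be sketched directly, without the bucket-box tree itself. All names, dimensions, and parameter values below are illustrative assumptions.

```python
import numpy as np

def topM_mixture_loglik(x, means, variances, weights, M):
    """Log-likelihood of x under a diagonal-covariance Gaussian mixture,
    approximated by its M most significant components (the quantity the
    BBI data structure is designed to locate quickly)."""
    # per-component diagonal-Gaussian log densities
    log_det = np.sum(np.log(2 * np.pi * variances), axis=1)
    maha = np.sum((x - means) ** 2 / variances, axis=1)
    log_comp = np.log(weights) - 0.5 * (log_det + maha)
    top = np.sort(log_comp)[-M:]                 # keep the M best Gaussians
    m = top.max()
    return m + np.log(np.sum(np.exp(top - m)))   # numerically stable log-sum-exp

rng = np.random.default_rng(3)
K, D = 16, 4                                     # 16 components, 4-dim features
means = rng.normal(size=(K, D))
variances = rng.uniform(0.5, 1.5, size=(K, D))
weights = np.full(K, 1.0 / K)
x = rng.normal(size=D)

full = topM_mixture_loglik(x, means, variances, weights, K)    # exact mixture
approx = topM_mixture_loglik(x, means, variances, weights, 4)  # top-4 only
```

Because the discarded components only ever remove probability mass, the top-M value is a lower bound on the full mixture log-likelihood that tightens monotonically as M grows.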