Results

### Group Factor Analysis


Abstract—Factor analysis provides linear factors that describe relationships between individual variables of a data set. We extend this classical formulation into linear factors that describe relationships between groups of variables, where each group represents either a set of related variables or a data set. The model also naturally extends canonical correlation analysis to more than two sets, in a way that is more flexible than previous extensions. Our solution is formulated as variational inference of a latent variable model with structural sparsity, and it consists of two hierarchical levels: the higher level models the relationships between the groups, whereas the lower models the observed variables given the higher level. We show that the resulting solution solves the group factor analysis problem accurately, outperforming alternative factor analysis based solutions as well as more straightforward implementations of group factor analysis. The method is demonstrated on two life science data sets, one on brain activation and the other on systems biology, illustrating its applicability to the analysis of different types of high-dimensional data sources.

Index Terms—factor analysis, multi-view learning, probabilistic algorithms, structured sparsity
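As an illustration of the group factor analysis setting this abstract describes, here is a minimal generative sketch (all variable names, dimensions, and the sparsity pattern are hypothetical): two variable groups share one latent factor, which induces cross-group correlation, while group-specific factors act through group-sparse loadings. This shows only the generative side of the model, not the paper's variational inference.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 500, 3                 # samples, latent factors
dims = [10, 15]               # dimensionality of each variable group

# Group-wise activity pattern: factor 0 is shared across groups,
# factors 1 and 2 are specific to one group each (structural sparsity).
active = np.array([[1, 1, 0],   # group 0 uses factors 0 and 1
                   [1, 0, 1]])  # group 1 uses factors 0 and 2

z = rng.standard_normal((n, k))                  # shared latent factors
views = []
for m, d in enumerate(dims):
    W = rng.standard_normal((d, k)) * active[m]  # group-sparse loadings
    views.append(z @ W.T + 0.1 * rng.standard_normal((n, d)))

# Only the shared factor links the two groups, so the cross-group
# covariance block is driven entirely by factor 0.
X = np.concatenate(views, axis=1)
cross_cov = np.cov(X.T)[:dims[0], dims[0]:]
print(cross_cov.shape)  # (10, 15)
```

A full group factor analysis method would infer `z`, the loadings, and the activity pattern from `X` alone; here they are fixed by construction to make the structure visible.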

### Multiple Output Regression with Latent Noise

2016


Abstract—In high-dimensional data, structured noise caused by observed and unobserved factors affecting multiple target variables simultaneously imposes a serious challenge for modeling by masking the often weak signal. Therefore, (1) explaining away the structured noise in multiple-output regression is of paramount importance. Additionally, (2) assumptions about the correlation structure of the regression weights are needed. We note that both can be formulated in a natural way in a latent variable model, in which both the interesting signal and the noise are mediated through the same latent factors. Under this assumption, the signal model then borrows strength from the noise model by encouraging similar effects on correlated targets. We introduce a hyperparameter for the latent signal-to-noise ratio, which turns out to be important for modelling weak signals, and an ordered infinite-dimensional shrinkage prior that resolves the rotational unidentifiability in reduced-rank regression models. Simulations and prediction experiments with metabolite, gene expression, fMRI measurement, and macroeconomic time series data show that our model equals or exceeds the state-of-the-art performance and, in particular, outperforms the standard approach of assuming independent noise and signal models.
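A minimal generative sketch of the assumption described above, with the signal and the structured noise mediated through the same latent factors (all names, dimensions, and the `alpha` signal-to-noise parameter are illustrative assumptions, not the paper's parameterization or inference method):

```python
import numpy as np

rng = np.random.default_rng(1)
n, p, q, k = 200, 5, 8, 2    # samples, inputs, targets, latent factors

Lambda = rng.standard_normal((k, q))   # latent-to-target loadings (shared)
Theta = rng.standard_normal((p, k))    # input-to-latent weights
alpha = 0.3                            # latent signal-to-noise ratio (hypothetical)

X = rng.standard_normal((n, p))
signal = X @ Theta @ Lambda                   # signal mediated by the factors
noise = rng.standard_normal((n, k)) @ Lambda  # structured noise via the SAME factors
Y = alpha * signal + noise + 0.1 * rng.standard_normal((n, q))
print(Y.shape)  # (200, 8)
```

Because `signal` and `noise` share the loadings `Lambda`, targets that are correlated through the noise also receive similar regression effects, which is the borrowing of strength the abstract refers to.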

### Expectation Propagation for Likelihoods Depending on an Inner Product of Two Multivariate Random Variables


We describe how a deterministic Gaussian posterior approximation can be constructed using expectation propagation (EP) for models where the likelihood function depends on an inner product of two multivariate random variables. The family of applicable models includes a wide variety of important linear latent variable models used in statistical machine learning, such as principal component and factor analysis, their linear extensions, and errors-in-variables regression. The EP computations are facilitated by an integral transformation of the Dirac delta function, which allows transforming the multidimensional integrals over the two multivariate random variables into an analytically tractable form, up to one-dimensional analytically intractable integrals that can be efficiently computed numerically. We study the resulting posterior approximations in sparse principal component analysis with Gaussian and probit likelihoods. Comparisons to Gibbs sampling and variational inference are presented.
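To make the model family concrete, the following toy sketch draws binary observations from a probit likelihood on the inner product of two random vectors, the kind of likelihood the EP method above targets (names and dimensions are hypothetical; the EP approximation itself is not implemented here):

```python
import math
import numpy as np

rng = np.random.default_rng(2)
d, n = 4, 1000

# Two multivariate random variables whose inner product enters the likelihood
w = rng.standard_normal(d)        # e.g. a factor-loading vector
Z = rng.standard_normal((n, d))   # e.g. latent component scores

inner = Z @ w                     # inner products w^T z_i
# Probit link: p(y_i = 1) = Phi(w^T z_i), via the standard normal CDF
probs = np.array([0.5 * (1.0 + math.erf(t / math.sqrt(2))) for t in inner])
y = (rng.random(n) < probs).astype(int)

print(y.shape)  # (1000,)
```

Posterior inference for `w` and `Z` jointly is hard precisely because the likelihood couples them through `inner`; the paper's EP construction reduces the resulting multidimensional integrals to one-dimensional ones.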