Results 1 – 2 of 2
Deep Directed Generative Autoencoders
Cited by 1 (1 self)

Abstract: For discrete data, the likelihood P(x) can be rewritten exactly and parametrized ...
NICE: Nonlinear Independent Components Estimation
Abstract: We propose a deep learning framework for modeling complex high-dimensional densities via Nonlinear Independent Component Estimation (NICE). It is based on the idea that a good representation is one in which the data has a distribution that is easy to model. For this purpose, a nonlinear deterministic transformation of the data is learned that maps it to a latent space so as to make the transformed data conform to a factorized distribution, i.e., resulting in independent latent variables. We parametrize this transformation so that computing the determinant of the Jacobian and inverse Jacobian is trivial, yet we maintain the ability to learn complex nonlinear transformations, via a composition of simple building blocks, each based on a deep neural network. The training criterion is simply the exact log-likelihood, which is tractable, and unbiased ancestral sampling is also easy. We show that this approach yields good generative models on four image datasets and can be used for inpainting.
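The abstract's core building block can be illustrated with a minimal sketch of an additive coupling layer, the kind of transformation NICE composes: half the input passes through unchanged while the other half is shifted by a learned function of the first half. This makes the map exactly invertible with Jacobian determinant 1, so the exact log-likelihood needs no Jacobian correction. The tiny MLP `m` and all names below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical tiny MLP standing in for the learned coupling function m.
W1 = rng.normal(size=(2, 8)); b1 = np.zeros(8)
W2 = rng.normal(size=(8, 2)); b2 = np.zeros(2)

def m(h):
    return np.tanh(h @ W1 + b1) @ W2 + b2

def coupling_forward(x):
    # Split the input: the first half is copied through unchanged,
    # the second half is shifted by m(first half).
    x1, x2 = x[:, :2], x[:, 2:]
    return np.concatenate([x1, x2 + m(x1)], axis=1)

def coupling_inverse(y):
    # Exact inverse: subtract the same shift, recomputed from y1 = x1.
    y1, y2 = y[:, :2], y[:, 2:]
    return np.concatenate([y1, y2 - m(y1)], axis=1)

x = rng.normal(size=(4, 4))
y = coupling_forward(x)
x_rec = coupling_inverse(y)

# The layer is volume-preserving (|det J| = 1), so under a factorized
# prior the log-likelihood is just the sum of per-dimension prior terms.
print(np.allclose(x, x_rec))  # True
```

Because `m` can be an arbitrarily deep network without affecting invertibility or the Jacobian determinant, stacking such layers (alternating which half is transformed) yields the complex yet tractable transformations the abstract describes.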