Results 1 - 10 of 9,123
Variational algorithms for approximate Bayesian inference, 2003
Cited by 440 (9 self)
"... The Bayesian framework for machine learning allows for the incorporation of prior knowledge in a coherent way, avoids overfitting problems, and provides a principled basis for selecting between alternative models. Unfortunately the computations required are usually intractable. This thesis presents a unified variational Bayesian (VB) framework which approximates these computations in models with latent variables using a lower bound on the marginal likelihood. Chapter 1 presents background material on Bayesian inference, graphical models, and propagation algorithms. Chapter 2 forms ..."
Bayesian inference
"... 2. Introduction to Bayesian analysis 3. Asymptotic approach for Bayesian inference ..."
Bayesian Inference
"... The Bayesian interpretation of probability is one of two broad categories of interpretations. Bayesian inference updates knowledge about unknowns (parameters) with information from data. The LaplacesDemon package in R enables Bayesian inference, and this vignette provides an introduction to the topic ..."
A family of algorithms for approximate Bayesian inference, 2001
Cited by 366 (11 self)
"... One of the major obstacles to using Bayesian methods for pattern recognition has been its computational expense. This thesis presents an approximation technique that can perform Bayesian inference faster and more accurately than previously possible. This method, "Expectation Propagation," ..."
Bayesian inference . . ., 2007
"... The subject of this work is the parametric inference problem, i.e. how to infer from data the parameters of the data likelihood of a random process whose parametric form is known a priori. The assumption that Bayes' theorem has to be used to add new data samples reduces the problem to the question ..."
"... of the data likelihood. Altogether we get a Bayesian inference procedure that incorporates special prior knowledge if available but has also a sound solution if not, and leaves no hyperparameters unspecified."
Bayesian inference in econometric models using Monte Carlo integration
Econometrica, 1989
Hierarchical Bayesian Inference in the Visual Cortex, 2002
Cited by 300 (2 self)
"... this paper, we propose a Bayesian theory of hierarchical cortical computation based both on (a) the mathematical and computational ideas of computer vision and pattern theory and on (b) recent neurophysiological experimental evidence. We [2] have proposed that Grenander's pattern theory [3] could ..."
BAYESIAN INFERENCE
"... In a paper that has been widely cited within the philosophy of science community, Glymour [1] claims to show that Bayesians cannot learn from old data. His argument contains elementary errors, ones which E. T. Jaynes and others have often warned against. I explain exactly where Glymour went wrong, and ..."
Bayesian Inference
"... Submitted to Neural Computation. This paper argues that many visual scenes are based on a "Manhattan" three-dimensional grid which imposes regularities on the image statistics. We construct a Bayesian model which implements this assumption and estimates the viewer orientation relative to the Manhattan ..."
Bayesian inference for . . ., 2008
ISSN 1476-2986
"... Bayes' theorem is the cornerstone of statistical inference. It provides the tools for dealing with knowledge in an uncertain world, allowing us to explain observed phenomena through the refinement of belief in model parameters. At the heart of this elegant framework lie int ..."
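As a concrete aside, not drawn from any of the listed papers: the "updating knowledge about unknowns with information from data" that several of these abstracts describe can be sketched with a minimal conjugate Beta-Binomial example (the helper names here are illustrative, not from any package above).

```python
# Bayesian updating in its simplest conjugate form: a Beta(a, b) prior on a
# coin's bias, updated with observed flips. Conjugacy makes the posterior
# another Beta distribution, available in closed form with no integration.

def update_beta(a, b, heads, tails):
    """Return the posterior Beta parameters after observing coin flips."""
    return a + heads, b + tails

def beta_mean(a, b):
    """Mean of a Beta(a, b) distribution."""
    return a / (a + b)

# Start from a uniform prior Beta(1, 1) and observe 7 heads, 3 tails.
a, b = update_beta(1, 1, heads=7, tails=3)   # posterior is Beta(8, 4)
print(beta_mean(a, b))                        # posterior mean = 8/12 ≈ 0.667
```

The papers above exist precisely because most models lack such closed-form posteriors, which is what variational bounds, Expectation Propagation, and Monte Carlo integration approximate.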