Results 1–10 of 20
Operations for Learning with Graphical Models
 Journal of Artificial Intelligence Research
, 1994
Abstract

Cited by 276 (13 self)
This paper is a multidisciplinary review of empirical, statistical learning from a graphical model perspective. Well-known examples of graphical models include Bayesian networks, directed graphs representing a Markov chain, and undirected networks representing a Markov field. These graphical models are extended to model data analysis and empirical learning using the notation of plates. Graphical operations for simplifying and manipulating a problem are provided including decomposition, differentiation, and the manipulation of probability models from the exponential family. Two standard algorithm schemas for learning are reviewed in a graphical framework: Gibbs sampling and the expectation maximization algorithm. Using these operations and schemas, some popular algorithms can be synthesized from their graphical specification. This includes versions of linear regression, techniques for feedforward networks, and learning Gaussian and discrete Bayesian networks from data. The paper conclu...
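The Gibbs sampling schema the review covers can be illustrated with a minimal sketch. The example below is my own toy illustration, not code from the paper: for a standard bivariate normal with correlation rho, each full conditional is itself univariate normal, so the sampler simply alternates two Gaussian draws.

```python
import math
import random

def gibbs_bivariate_normal(rho, n_iter=20000, seed=0):
    """Gibbs sampler for a standard bivariate normal with correlation rho.

    Each full conditional is univariate normal, so the sampler alternates
    x | y ~ N(rho*y, 1 - rho^2) and y | x ~ N(rho*x, 1 - rho^2).
    """
    rng = random.Random(seed)
    cond_sd = math.sqrt(1.0 - rho * rho)
    x = y = 0.0
    samples = []
    for _ in range(n_iter):
        x = rng.gauss(rho * y, cond_sd)
        y = rng.gauss(rho * x, cond_sd)
        samples.append((x, y))
    return samples

# Discard a burn-in, then check the sample moments against the target.
samples = gibbs_bivariate_normal(0.8)[2000:]
mean_x = sum(s[0] for s in samples) / len(samples)
corr = sum(s[0] * s[1] for s in samples) / len(samples)  # E[xy] = rho here
```

The same alternating-conditional structure carries over to the larger graphical models the review discusses; only the form of each conditional changes.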
Bayes in the sky: Bayesian inference and model selection in cosmology
 Contemp. Phys
Abstract

Cited by 58 (7 self)
The application of Bayesian methods in cosmology and astrophysics has flourished over the past decade, spurred by data sets of increasing size and complexity. In many respects, Bayesian methods have proven to be vastly superior to more traditional statistical tools, offering the advantage of higher efficiency and of a consistent conceptual basis for dealing with the problem of induction in the presence of uncertainty. This trend is likely to continue in the future, when the way we collect, manipulate and analyse observations and compare them with theoretical models will assume an even more central role in cosmology. This review is an introduction to Bayesian methods in cosmology and astrophysics and recent results in the field. I first present Bayesian probability theory and its conceptual underpinnings, Bayes' Theorem and the role of priors. I discuss the problem of parameter inference and its general solution, along with numerical techniques such as Markov Chain Monte Carlo methods. I then review the theory and application of Bayesian model comparison, discussing the notions of Bayesian evidence and effective model complexity, and how to compute and interpret those quantities. Recent developments in cosmological parameter extraction and Bayesian cosmological model building are summarized, highlighting the challenges that lie ahead.
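The Markov Chain Monte Carlo techniques mentioned above can be sketched with a random-walk Metropolis sampler applied to the simplest inference problem, an unknown Gaussian mean. This is my own toy example, assuming a broad Gaussian prior; it is not taken from the review.

```python
import math
import random

def metropolis_gaussian_mean(data, sigma=1.0, prior_sd=10.0,
                             n_iter=20000, step=0.5, seed=1):
    """Random-walk Metropolis sampler for the posterior of a mean mu,
    given data y_i ~ N(mu, sigma^2) and a broad N(0, prior_sd^2) prior."""
    rng = random.Random(seed)

    def log_post(mu):
        log_lik = -sum((y - mu) ** 2 for y in data) / (2.0 * sigma ** 2)
        log_prior = -(mu ** 2) / (2.0 * prior_sd ** 2)
        return log_lik + log_prior

    mu, lp = 0.0, log_post(0.0)
    chain = []
    for _ in range(n_iter):
        proposal = mu + rng.gauss(0.0, step)
        lp_prop = log_post(proposal)
        # Accept with probability min(1, posterior ratio).
        if lp_prop >= lp or rng.random() < math.exp(lp_prop - lp):
            mu, lp = proposal, lp_prop
        chain.append(mu)
    return chain

rng = random.Random(42)
data = [rng.gauss(3.0, 1.0) for _ in range(50)]  # simulated data, true mean 3
chain = metropolis_gaussian_mean(data)[5000:]    # drop burn-in
posterior_mean = sum(chain) / len(chain)
```

In cosmological applications the log-posterior would instead come from a likelihood over survey data and a parameter vector, but the accept/reject loop is the same.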
Bayesian data analysis
, 2009
Abstract

Cited by 32 (7 self)
Bayesian methods have garnered huge interest in cognitive science as an approach to models of cognition and perception. On the other hand, Bayesian methods for data analysis have not yet made much headway in cognitive science against the institutionalized inertia of 20th century null hypothesis significance testing (NHST). Ironically, specific Bayesian models of cognition and perception may not long endure the ravages of empirical verification, but generic Bayesian methods for data analysis will eventually dominate. It is time that Bayesian data analysis became the norm for empirical methods in cognitive science. This article reviews a fatal flaw of NHST and introduces the reader to some benefits of Bayesian data analysis. The article presents illustrative examples of multiple comparisons in Bayesian ANOVA and Bayesian approaches to statistical power.
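As a small illustration of the kind of analysis the article advocates (my own toy example, not taken from the paper): with a uniform prior, the posterior for a binomial success rate is a Beta distribution, and the probability that the rate exceeds 0.5 can be read off directly, instead of reporting an NHST p-value.

```python
import math

def posterior_prob_above(k, n, threshold=0.5, grid=10000):
    """P(theta > threshold) under the Beta(k+1, n-k+1) posterior that a
    uniform prior yields for a binomial rate theta after k successes in
    n trials, computed by midpoint integration of the Beta density."""
    a, b = k + 1, n - k + 1
    log_norm = math.lgamma(a + b) - math.lgamma(a) - math.lgamma(b)

    def density(t):
        return math.exp(log_norm + (a - 1) * math.log(t)
                        + (b - 1) * math.log(1.0 - t))

    h = 1.0 / grid
    total = above = 0.0
    for i in range(grid):
        t = (i + 0.5) * h
        mass = density(t) * h
        total += mass
        if t > threshold:
            above += mass
    return above / total

# 60 successes in 100 trials: posterior probability the rate exceeds 0.5.
p_above = posterior_prob_above(60, 100)
```

The answer is a direct probability statement about the parameter, which is the contrast with NHST the article draws; the multiple-comparison and power examples in the paper extend the same idea to richer models.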
Bayesian inference on compact binary inspiral gravitational radiation signals in interferometric data
, 2006
Bayesian view of solar neutrino oscillations [hep-ph/0108191]
Abstract

Cited by 8 (0 self)
We present the results of a Bayesian analysis of solar neutrino data in terms of νe → νµ,τ and νe → νs oscillations, where νs is a sterile neutrino. We perform a Rates Analysis of the rates of solar neutrino experiments, including the first SNO CC result, and spectral data of the CHOOZ experiment, and a Global Analysis that also takes into account the Super-Kamiokande day and night electron energy spectra. We show that the Bayesian analysis of solar neutrino data does not suffer any problem from the inclusion of the numerous bins of the CHOOZ and Super-Kamiokande electron energy spectra and allows us to reach the same conclusions on the favored type of neutrino transitions and on the determination of the most favored values of the oscillation parameters in both the Rates and Global Analysis. Our Bayesian analysis shows that νe → νs transitions are strongly disfavored with respect to νe → νµ,τ transitions. In the case of νe → νµ,τ oscillations, the Large Mixing Angle region is favored by the data (86% probability), the LOW region has some small chance (13% probability), the Vacuum Oscillation region is almost excluded (1% probability) and the Small Mixing Angle
Sherpa: a mission-independent data analysis application
 Society of Photo-Optical Instrumentation Engineers (SPIE) Conference Series, volume 4477
, 2001
Automated Detection of the Magnetopause for Space Weather from the IMAGE Satellite
Abstract

Cited by 1 (1 self)
The Radio Plasma Imager (RPI) is a low-power radar on board the IMAGE spacecraft to be launched early in year 2000. The principal science objective of RPI is to characterize the plasma in the Earth's magnetosphere by radio frequency imaging. A key product of RPI is the plasmagram, a map of radio signal strength vs. echo delay-time vs. frequency, on which magnetospheric structures appear as curves of varying intensity. Noise and other emissions will also appear on RPI plasmagrams and when strong enough will obscure the radar echoes. RPI echoes from the Earth's magnetopause will be of particular importance since the magnetopause is the first region that the solar wind impacts before producing geomagnetic storms. To aid in the analysis of RPI plasmagrams and find all echoes from the Earth's magnetopause, a computer program has been developed to automatically detect and enhance the radar echoes. The technique presented is derived within a Bayesian framework and centers on the construction an...
Bayesian Classification (AutoClass): Theory and Results
, 1996
Abstract

Cited by 1 (0 self)
We describe AutoClass, an approach to unsupervised classification based upon the classical mixture model, supplemented by a Bayesian method for determining the optimal classes. We include a moderately detailed exposition of the mathematics behind the AutoClass system. We emphasize that no current unsupervised classification system can produce maximally useful results when operated alone. It is the interaction between domain experts and the machine searching over the model space that generates new knowledge. Both bring unique information and abilities to the database analysis task, and each enhances the other's effectiveness. We illustrate this point with several applications of AutoClass to complex real world databases, and describe the resulting successes and failures.
6.1 Introduction
This chapter is a summary of our experience in using an automatic classification program (AutoClass) to extract useful information from databases. It also gives an outline of the principles that under...
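The classical mixture model underlying AutoClass can be illustrated with a deliberately minimal EM fit. This is my own sketch, assuming two 1-D Gaussian components with fixed unit variances; AutoClass itself handles far more general models and a Bayesian search over the number of classes.

```python
import math
import random

def em_two_gaussians(data, n_iter=200):
    """EM for a two-component 1-D Gaussian mixture with fixed unit
    variances, estimating the mixing weight and the two means."""
    mu = [min(data), max(data)]  # crude deterministic initialisation
    w = 0.5                      # mixing weight of component 0
    for _ in range(n_iter):
        # E-step: responsibility of component 0 for each point.
        resp = []
        for x in data:
            p0 = w * math.exp(-0.5 * (x - mu[0]) ** 2)
            p1 = (1.0 - w) * math.exp(-0.5 * (x - mu[1]) ** 2)
            resp.append(p0 / (p0 + p1))
        # M-step: re-estimate weight and means from the responsibilities.
        n0 = sum(resp)
        w = n0 / len(data)
        mu[0] = sum(r * x for r, x in zip(resp, data)) / n0
        mu[1] = sum((1.0 - r) * x for r, x in zip(resp, data)) / (len(data) - n0)
    return w, sorted(mu)

rng = random.Random(7)
data = ([rng.gauss(0.0, 1.0) for _ in range(200)]
        + [rng.gauss(5.0, 1.0) for _ in range(200)])
w, (m0, m1) = em_two_gaussians(data)
```

The responsibilities computed in the E-step are soft class assignments, which is the sense in which a fitted mixture model performs unsupervised classification.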
Making good sense of quantum probabilities
, 2001
Abstract

Cited by 1 (0 self)
In the Bayesian approach to probability theory, probability quantifies a degree of belief for a single trial, without any a priori connection to limiting frequencies. Despite being prescribed by a fundamental law, probabilities for individual quantum systems can be understood within the Bayesian approach. We argue that the distinction between classical and quantum probabilities lies not in their definition, but in the nature of the information they encode. In the classical world, maximal information about a physical system is complete in the sense of providing definite predictions for all possible questions that can be asked of the system. In the quantum world, maximal information is not complete and cannot be completed. Using this distinction, we show that any Bayesian probability assignment in quantum mechanics must have the form of the quantum probability rule, that maximal information about a quantum system leads to a unique quantum-state assignment, and that quantum theory provides a stronger connection between probability and measured frequency than can be justified classically. Finally we give a Bayesian formulation of quantum-state tomography. There are excellent reasons for interpreting quantum states as states of knowledge. A classic
Bayesian Hypothesis Testing for Planet Detection
Isabelle Braems, Laboratoire d'Etudes des Matériaux Hors-Equilibre (LEMHE)
, 2004
Abstract
The past five years have seen a surge in research and innovative ideas for the imaging of extrasolar planets, particularly terrestrial ones. We expect that within the next decade a space observatory will be launched with the objective of imaging earth-like planets. Because of the limited lifetime of such a mission and the large number of potential targets, integration time is a critical parameter. In fact, integration time is the primary metric in evaluating various design approaches for the high contrast imaging system. In this paper we present a new approach to determining the existence of a planet in an observed system using Bayesian hypothesis testing. Rather than perform photometry, or rely on vision to determine the existence of a planet, this approach evaluates the image plane data statistically under certain assumptions about the prior probability distributions. We show that extremely high confidence can be achieved in substantially shorter integration times than conventional photometric methods.
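The flavour of such a test can be sketched in a toy form (my own illustration, not the authors' method): with a known noise level and an assumed source brightness, the Bayes factor for "planet present" versus "noise only" at a candidate location reduces to a likelihood ratio over the pixel values.

```python
import math
import random

def log_bayes_factor(pixels, signal=1.0, sigma=1.0):
    """Log Bayes factor for H1: pixels ~ N(signal, sigma^2) (source present
    at an assumed brightness) versus H0: pixels ~ N(0, sigma^2) (noise only).
    With two point hypotheses the Bayes factor is a likelihood ratio."""
    log_lik_h1 = -sum((y - signal) ** 2 for y in pixels) / (2.0 * sigma ** 2)
    log_lik_h0 = -sum(y ** 2 for y in pixels) / (2.0 * sigma ** 2)
    return log_lik_h1 - log_lik_h0

rng = random.Random(3)
planet_pixels = [rng.gauss(1.0, 1.0) for _ in range(100)]  # simulated source
empty_pixels = [rng.gauss(0.0, 1.0) for _ in range(100)]   # simulated noise
lbf_planet = log_bayes_factor(planet_pixels)
lbf_empty = log_bayes_factor(empty_pixels)
```

A positive log Bayes factor favours the planet hypothesis; in the paper's setting the hypotheses would instead be marginalized over prior distributions on brightness and position, sharpening the decision per unit of integration time.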