Results 1–10 of 1,203,856
Partially-specified probabilities: decisions and games
, 2009
"... In the Ellsberg paradox, decision makers that are partially informed about the actual probability distribution violate the expected utility paradigm. This paper develops a theory of decision making with a partially specified probability. The paper takes an axiomatic approach using Anscombe-Aumann’s (196 ..."
Cited by 17 (1 self)
Partially-specified probabilities: decisions and games
, 2007
"... In the Ellsberg paradox, decision makers that are partially informed about the actual probability distribution violate the expected utility paradigm. This paper develops a theory of decision making with a partially specified probability. The paper takes an axiomatic approach using Anscombe-Aumann’s (196 ..."
Partially Specified Probabilities: Decisions and Games, mimeo
, 2007
"... A very preliminary draft. Comments are welcome. Abstract. In the Ellsberg paradox, decision makers that are partially informed about the actual probability distribution violate the expected utility paradigm. This paper develops a theory of decision making with a partially specified probability. The paper tak ..."
Cited by 9 (3 self)
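The Ellsberg pattern these abstracts describe can be reproduced numerically with a maxmin (worst-case) expected-utility rule over the set of priors consistent with a partially specified urn. This is a hypothetical illustration of that rule, not the axiomatic model developed in the paper:

```python
# Ellsberg single-urn: 30 red balls, 60 black-or-yellow balls in unknown
# proportion; each bet pays 1 on the named colour(s), 0 otherwise.
# A maxmin decision maker values a bet at its worst-case expected payoff
# over the unknown probability pb of drawing black, pb in [0, 2/3].

def min_eu(win_prob):
    # Worst case over a grid of pb values spanning [0, 2/3].
    return min(win_prob(i / 150) for i in range(101))

bets = {
    "red":             lambda pb: 1 / 3,   # fully specified probability
    "black":           lambda pb: pb,      # ambiguous
    "red_or_yellow":   lambda pb: 1 - pb,  # ambiguous
    "black_or_yellow": lambda pb: 2 / 3,   # fully specified probability
}
values = {name: min_eu(f) for name, f in bets.items()}
# red is strictly preferred to black, yet black_or_yellow to
# red_or_yellow: no single prior rationalizes both preferences.
```

The two strict preferences together contradict expected utility with any single prior, which is exactly the violation the abstract refers to.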
Learning the Kernel Matrix with Semidefinite Programming
, 2002
"... Kernel-based learning algorithms work by embedding the data into a Euclidean space, and then searching for linear relations among the embedded data points. The embedding is performed implicitly, by specifying the inner products between each pair of points in the embedding space. This information ..."
Cited by 780 (22 self)
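The method in this abstract searches over linear combinations of candidate kernel matrices subject to a positive-semidefiniteness constraint. A minimal numpy sketch of why nonnegative combinations are safe to search over (the kernel choices and weights below are illustrative; this is not the semidefinite program itself):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))

# Two candidate Gram matrices: linear kernel and RBF kernel.
K_lin = X @ X.T
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K_rbf = np.exp(-sq / 2.0)

# A nonnegative combination of PSD Gram matrices is again PSD, which is
# what lets the learning problem optimize over the weights mu >= 0.
mu = (0.3, 0.7)
K = mu[0] * K_lin + mu[1] * K_rbf
min_eig = np.linalg.eigvalsh(K).min()   # should be >= 0 up to round-off
```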
Estimating the Support of a High-Dimensional Distribution
, 1999
"... Suppose you are given some dataset drawn from an underlying probability distribution P and you want to estimate a "simple" subset S of input space such that the probability that a test point drawn from P lies outside of S is bounded by some a priori specified ν between 0 and 1. We propo ..."
Cited by 766 (29 self)
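This ν-parameterised support estimator is widely known as the one-class SVM; a minimal sketch using scikit-learn's `OneClassSVM` implementation (the library choice and data are assumptions for illustration, not part of the paper):

```python
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 2))            # sample from the unknown P

# nu is the a-priori bound from the abstract: it upper-bounds the
# fraction of training points left outside the estimated region S.
clf = OneClassSVM(kernel="rbf", nu=0.1, gamma="scale").fit(X)
outside = clf.predict(X) == -1           # -1 means "outside S"
frac_outside = outside.mean()            # should be at most about nu
```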
Using Daily Stock Returns: The Case of Event Studies
 Journal of Financial Economics
, 1985
"... This paper examines properties of daily stock returns and how the particular characteristics of these data affect event study methodologies. Daily data generally present few difficulties for event studies. Standard procedures are typically well-specified even when special daily data characteristics ..."
Cited by 763 (2 self)
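The standard event-study procedure the abstract evaluates can be sketched on synthetic daily returns: fit a market model on an estimation window, then cumulate abnormal returns over the event window. All parameter values and the injected effect below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
T_est, T_evt = 250, 5                    # estimation / event windows (days)
r_m = rng.normal(0.0005, 0.01, T_est + T_evt)                 # market returns
r_i = 0.0002 + 1.2 * r_m + rng.normal(0, 0.02, T_est + T_evt) # stock returns
r_i[T_est:] += 0.03 / T_evt              # hypothetical event effect

# Market-model parameters estimated on the estimation window only.
Xd = np.column_stack([np.ones(T_est), r_m[:T_est]])
alpha, beta = np.linalg.lstsq(Xd, r_i[:T_est], rcond=None)[0]

# Abnormal returns in the event window, then cumulated (CAR).
ar = r_i[T_est:] - (alpha + beta * r_m[T_est:])
car = ar.sum()
```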
Bayes Factors
, 1995
"... In a 1935 paper, and in his book Theory of Probability, Jeffreys developed a methodology for quantifying the evidence in favor of a scientific theory. The centerpiece was a number, now called the Bayes factor, which is the posterior odds of the null hypothesis when the prior probability on the null ..."
Cited by 1766 (74 self)
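As a concrete instance of the quantity the abstract defines, here is the Bayes factor for a coin, comparing a point null against a uniform prior on the bias (a textbook special case, not the review's general treatment):

```python
from math import comb

n, k = 100, 60                     # observe 60 heads in 100 tosses
m0 = comb(n, k) * 0.5 ** n         # marginal likelihood under H0: p = 1/2
m1 = 1.0 / (n + 1)                 # under H1: p ~ Uniform(0,1); the Beta
                                   # integral collapses to 1/(n+1)
bf01 = m0 / m1                     # Bayes factor for H0 against H1
# Posterior odds of H0 = bf01 * prior odds. Here bf01 is close to 1:
# the data barely discriminate between the two hypotheses.
```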
Markov Random Field Models in Computer Vision
, 1994
"... A variety of computer vision problems can be optimally posed as Bayesian labeling in which the solution of a problem is defined as the maximum a posteriori (MAP) probability estimate of the true labeling. The posterior probability is usually derived from a prior model and a likelihood model. The l ..."
Cited by 515 (18 self)
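The MAP-labeling formulation in this abstract can be exercised on a toy Ising-style denoising problem using iterated conditional modes (ICM), a simple greedy MAP approximation; the energy terms and weights below are illustrative assumptions, not the paper's models:

```python
import numpy as np

rng = np.random.default_rng(0)
x = -np.ones((12, 12))                 # ground-truth labels in {-1, +1}
x[3:9, 3:9] = 1.0
y = np.where(rng.random(x.shape) < 0.15, -x, x)   # observation, ~15% flips

lam, beta = 2.0, 1.0                   # likelihood and smoothness weights

def local_energy(z, i, j, s):
    # Posterior energy of assigning label s to pixel (i, j):
    # data term -lam*s*y[i,j] plus prior term -beta*s*(sum of neighbors).
    nb = [z[a, b] for a, b in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
          if 0 <= a < z.shape[0] and 0 <= b < z.shape[1]]
    return -lam * s * y[i, j] - beta * s * sum(nb)

z = y.copy()                           # initialize at the observation
for _ in range(5):                     # ICM sweeps: greedy energy descent
    for i in range(z.shape[0]):
        for j in range(z.shape[1]):
            z[i, j] = min((-1.0, 1.0), key=lambda s: local_energy(z, i, j, s))

errors_before = (y != x).mean()
errors_after = (z != x).mean()
```

ICM only finds a local optimum of the posterior; the survey itself discusses the stronger MAP machinery.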
Graphical models, exponential families, and variational inference
, 2008
"... The formalism of probabilistic graphical models provides a unifying framework for capturing complex dependencies among random variables, and building large-scale multivariate statistical models. Graphical models have become a focus of research in many statistical, computational and mathematical fiel ..."
Cited by 800 (26 self)
"... of probability distributions — are best studied in the general setting. Working with exponential family representations, and exploiting the conjugate duality between the cumulant function and the entropy for exponential families, we develop general variational representations of the problems of computing ..."
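The cumulant-entropy duality the abstract mentions can be checked numerically for the simplest exponential family, a Bernoulli variable (a one-parameter toy case, far from the monograph's generality):

```python
import numpy as np

# Bernoulli in exponential-family form: p(x|t) = exp(t*x - A(t)) for
# x in {0, 1}, with cumulant function A(t) = log(1 + e^t).
A = lambda t: float(np.log1p(np.exp(t)))
t = 0.7

# Forward mapping: the gradient of A is the mean parameter sigmoid(t).
eps = 1e-6
grad_A = (A(t + eps) - A(t - eps)) / (2 * eps)   # numerical dA/dt
mu = 1.0 / (1.0 + np.exp(-t))                    # sigmoid(t)

# Conjugate duality: A(t) = sup_m [ t*m + H(m) ], where H is the
# Bernoulli entropy, i.e. the conjugate A* is the negative entropy.
grid = np.linspace(1e-6, 1 - 1e-6, 100001)
H = -(grid * np.log(grid) + (1 - grid) * np.log(1 - grid))
sup_value = float(np.max(t * grid + H))          # should match A(t)
```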
Exploiting Generative Models in Discriminative Classifiers
 In Advances in Neural Information Processing Systems 11
, 1998
"... Generative probability models such as hidden Markov models provide a principled way of treating missing information and dealing with variable length sequences. On the other hand, discriminative methods such as support vector machines enable us to construct flexible decision boundaries and often resu ..."
Cited by 538 (11 self)
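The construction this abstract describes, the Fisher kernel, can be written down for a one-dimensional Gaussian generative model as a toy stand-in for the paper's hidden Markov models:

```python
# Fisher kernel for the generative model N(mu, 1): the Fisher score is
# U(x) = d/dmu log p(x|mu) = x - mu, the Fisher information is I = 1,
# and the kernel is K(x, x') = U(x) * I^-1 * U(x').
mu = 0.0
U = lambda x: x - mu            # Fisher score of a data point
I_fisher = 1.0                  # Fisher information of N(mu, 1) w.r.t. mu
K = lambda x, xp: U(x) * (1.0 / I_fisher) * U(xp)
# The scores U(x) give fixed-length feature vectors even when x is a
# variable-length sequence, which is what lets a discriminative
# classifier such as an SVM consume the generative model's structure.
```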