Results 1–10 of 931
Bayesian density estimation and inference using mixtures. J. Amer. Statist. Assoc., 1995.
Cited by 653 (18 self)
Abstract: We describe and illustrate Bayesian inference in models for density estimation using mixtures of Dirichlet processes. These models provide natural settings for density estimation and are exemplified by special cases where data are modeled as a sample from mixtures of normal distributions. Efficient simulation methods are used to approximate various prior, posterior, and predictive distributions. This allows for direct inference on a variety of practical issues, including problems of local versus global smoothing, uncertainty about density estimates, assessment of modality, and inference on the number of components. Also, convergence results are established for a general class of normal mixture models.
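The prior underlying this model can be illustrated with a short sketch: drawing a sample from a Dirichlet process mixture of normals via a truncated stick-breaking representation. The base measure, the concentration parameter, and the fixed component scale below are illustrative assumptions, not choices from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_dp_mixture(n, alpha=1.0, truncation=50):
    """Draw n points from a Dirichlet process mixture of normals,
    using a truncated stick-breaking representation of the DP.
    Assumed base measure G0: component means ~ N(0, 3^2); component
    standard deviation fixed at 1 (both illustrative choices)."""
    # Stick-breaking: v_k ~ Beta(1, alpha), w_k = v_k * prod_{j<k}(1 - v_j)
    v = rng.beta(1.0, alpha, size=truncation)
    w = v * np.cumprod(np.concatenate(([1.0], 1.0 - v[:-1])))
    w /= w.sum()                                   # renormalise after truncation
    means = rng.normal(0.0, 3.0, size=truncation)  # atoms drawn from G0
    comp = rng.choice(truncation, size=n, p=w)     # component label per point
    return rng.normal(means[comp], 1.0)

x = sample_dp_mixture(500)
print(x.shape)
```

Posterior simulation for such models (the paper's subject) layers MCMC over this representation; the sketch only shows the generative side.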
Multicast-Based Inference of Network-Internal Characteristics: Accuracy of Packet Loss Estimation. IEEE Transactions on Information Theory, 1998.
Cited by 323 (40 self)
Abstract: We explore the use of end-to-end multicast traffic as measurement probes to infer network-internal characteristics. In an earlier paper [2] we developed a Maximum Likelihood Estimator for packet loss rates on individual links based on losses observed by multicast receivers. This technique exploits the inherent correlation between such observations to infer the performance of paths between branch points in the multicast tree spanning the probe source and its receivers. We evaluate through analysis and simulation the accuracy of our estimator under a variety of network conditions. In particular, we report on the error between inferred loss rates and actual loss rates as we vary the network topology, propagation delay, packet drop policy, background traffic mix, and probe traffic type. In all but one case, estimated losses and probe losses agree to within 2 percent on average. We feel this accuracy is enough to reliably identify congested links in a wide-area internetwork.
Keywords: Internet performance, end-to-end measurements, Maximum Likelihood Estimator, tomography.
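The correlation idea can be seen in the simplest topology, a two-receiver tree: if the shared link passes probes with probability a and the leaf links with b1, b2 (independently), then P(rcv1)P(rcv2)/P(both) = a, so the shared-link pass rate is identifiable from receiver observations alone. The Monte Carlo check below uses made-up pass rates; it is the two-leaf special case, not the paper's general-tree MLE.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two-leaf multicast tree: shared link (pass prob a), then two leaf
# links (pass probs b1, b2).  All values are illustrative assumptions.
a, b1, b2 = 0.95, 0.90, 0.85
n = 200_000

shared = rng.random(n) < a              # probe survives the shared link
rcv1 = shared & (rng.random(n) < b1)    # probe reaches receiver 1
rcv2 = shared & (rng.random(n) < b2)    # probe reaches receiver 2

p1, p2 = rcv1.mean(), rcv2.mean()       # marginal reception rates
p12 = (rcv1 & rcv2).mean()              # joint reception rate
a_hat = p1 * p2 / p12                   # estimates a: (a*b1)(a*b2)/(a*b1*b2)

print(round(a_hat, 3))  # should be close to the true a = 0.95
```

The identity holds because leaf-link losses are conditionally independent given passage through the shared link, which is exactly the correlation structure the estimator exploits.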
Longest increasing subsequences: from patience sorting to the Baik-Deift-Johansson theorem. Bull. Amer. Math. Soc. (N.S.), 1999.
Cited by 183 (2 self)
Abstract: We describe a simple one-person card game, patience sorting. Its analysis leads to a broad circle of ideas linking Young tableaux with the longest increasing subsequence of a random permutation via the Schensted correspondence. A recent highlight of this area is the work of Baik-Deift-Johansson, which yields limiting probability laws via hard analysis of Toeplitz determinants.
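The card game itself is easy to mechanize: deal each card onto the leftmost pile whose top card is larger, starting a new pile when none qualifies; the number of piles at the end equals the length of the longest increasing subsequence. A minimal sketch of this greedy rule:

```python
from bisect import bisect_left

def lis_length(seq):
    """Longest increasing subsequence length via patience sorting:
    each value goes on the leftmost pile whose top exceeds it, so the
    pile tops stay sorted and a binary search finds the pile."""
    tops = []  # top card of each pile, left to right (always increasing)
    for x in seq:
        i = bisect_left(tops, x)
        if i == len(tops):
            tops.append(x)   # no pile can take x: start a new pile
        else:
            tops[i] = x      # place x on pile i, lowering its top
    return len(tops)

print(lis_length([3, 1, 4, 1, 5, 9, 2, 6]))  # 4 (e.g. 1, 4, 5, 9)
```

With `bisect_left`, equal values start no new pile, so the count is for strictly increasing subsequences; the search makes the whole game run in O(n log n).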
General state space Markov chains and MCMC algorithms. Probability Surveys, 2004.
Cited by 177 (35 self)
Abstract: This paper surveys various results about Markov chains on general (non-countable) state spaces. It begins with an introduction to Markov chain Monte Carlo (MCMC) algorithms, which provide the motivation and context for the theory that follows. Then, sufficient conditions for geometric and uniform ergodicity are presented, along with quantitative bounds on the rate of convergence to stationarity. Many of these results are proved using direct coupling constructions based on minorisation and drift conditions. Necessary and sufficient conditions for Central Limit Theorems (CLTs) are also presented, in some cases proved via the Poisson Equation or direct regeneration constructions. Finally, optimal scaling and weak convergence results for Metropolis-Hastings algorithms are discussed. None of the results presented is new, though many of the proofs are. We also describe some open problems.
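The algorithms the survey takes as its starting point can be sketched with the simplest case, random-walk Metropolis-Hastings on the real line; the target, step size, and burn-in below are illustrative assumptions, not quantities from the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

def metropolis_hastings(log_target, x0, n_steps, step=1.0):
    """Random-walk Metropolis-Hastings: propose x' = x + N(0, step^2),
    accept with probability min(1, pi(x')/pi(x)).  The resulting chain
    has the (unnormalised) density exp(log_target) as its stationary law."""
    x = x0
    samples = np.empty(n_steps)
    for t in range(n_steps):
        prop = x + step * rng.normal()
        # symmetric proposal, so the Hastings ratio is just pi(x')/pi(x)
        if np.log(rng.random()) < log_target(prop) - log_target(x):
            x = prop
        samples[t] = x
    return samples

# Target: standard normal, log density -z^2/2 up to an additive constant.
chain = metropolis_hastings(lambda z: -0.5 * z * z, x0=0.0, n_steps=50_000)
post = chain[10_000:]  # discard burn-in
print(round(float(post.mean()), 2), round(float(post.std()), 2))
```

Geometric ergodicity, convergence bounds, and CLTs of the kind the survey presents are statements about how fast such a chain's law approaches the target and how averages over `post` fluctuate.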
The objective method: Probabilistic combinatorial optimization and local weak convergence, 2003.
The practical implementation of Bayesian model selection. Manuscript available at http://gsbwww.uchicago.edu/fac/robert.mcculloch/research/papers/index.html, 2001.
Cited by 132 (3 self)
Abstract: In principle, the Bayesian approach to model selection is straightforward. Prior probability distributions are used to describe the uncertainty surrounding all unknowns. After observing the data, the posterior distribution provides a coherent post-data summary of the remaining uncertainty that is relevant for model selection. However, the practical implementation of this approach often requires carefully tailored priors and novel posterior calculation methods. In this article, we illustrate some of the fundamental practical issues that arise for two different model selection problems: the variable selection problem for the linear model and the CART model selection problem.
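The variable-selection problem mentioned in the abstract can be illustrated with a crude stand-in: exhaustive search over predictor subsets scored by BIC, a large-sample surrogate for the marginal likelihoods that a full Bayesian treatment would compute with tailored priors. The simulated data and coefficients below are assumptions for illustration only.

```python
import itertools
import numpy as np

rng = np.random.default_rng(3)

# Simulated linear-model data: only columns 0 and 2 truly enter the model.
n, p = 200, 4
X = rng.normal(size=(n, p))
y = 2.0 * X[:, 0] - 1.5 * X[:, 2] + rng.normal(size=n)

def bic(subset):
    """BIC of the least-squares fit on the given columns (plus intercept):
    n*log(RSS/n) + k*log(n), lower is better."""
    Z = np.column_stack([np.ones(n)] + [X[:, j] for j in subset])
    beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
    rss = float(np.sum((y - Z @ beta) ** 2))
    return n * np.log(rss / n) + Z.shape[1] * np.log(n)

# Enumerate all 2^p candidate models and keep the best-scoring subset.
subsets = [s for r in range(p + 1) for s in itertools.combinations(range(p), r)]
best = min(subsets, key=bic)
print(best)  # the true predictors 0 and 2 should be selected
```

Exhaustive enumeration is only feasible for small p; the MCMC-based posterior search methods the article discusses address exactly the case where 2^p models cannot be enumerated.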
Stability and Performance Analysis of Networks Supporting Elastic Services. IEEE/ACM Transactions on Networking, 2001.
Cited by 118 (6 self)
Abstract: We consider the stability and performance of a model for networks supporting services that adapt their transmission to the available bandwidth. Not unlike real networks, in our model connection arrivals are stochastic, each connection has a random amount of data to send, and the number of ongoing connections in the system changes over time. Consequently, the bandwidth allocated to, or throughput achieved by, a given connection may change during its lifetime as feedback control mechanisms react to network loads. Ideally, if there were a fixed number of ongoing connections, such feedback mechanisms would reach an equilibrium bandwidth allocation typically characterized in terms of its "fairness" to users, e.g., max-min or proportionally fair. In this paper we prove the stability of such networks when the offered load on each link does not exceed its capacity. We use simulation to investigate performance, in terms of average connection delays, for various fairness criteria. Finally, we pose an architectural problem in TCP/IP's decoupling of the transport and network layers from the point of view of guaranteeing connection-level stability, which we claim may explain congestion phenomena on the Internet.
Index Terms: ABR service, bandwidth allocation, Lyapunov functions, performance analysis, proportional fairness, rate control, stability, TCP/IP, weighted max-min fairness.
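One of the equilibrium allocations named in the abstract, max-min fairness, has a standard constructive definition via progressive filling: all flows' rates grow at the same speed, and when a link fills up, the flows crossing it are frozen at their current rate. The sketch below implements that textbook construction (not code from the paper); the two-link topology is an illustrative assumption.

```python
def max_min_fair(capacity, routes):
    """Max-min fair rates via progressive filling.
    capacity: dict link -> capacity; routes: list of link-lists, one per flow."""
    n = len(routes)
    rates = [0.0] * n
    frozen = [False] * n
    while not all(frozen):
        # Common increment: the smallest equal share of residual capacity
        # over links that still carry at least one unfrozen flow.
        inc = min(
            (capacity[l] - sum(rates[f] for f in range(n) if l in routes[f]))
            / sum(1 for f in range(n) if not frozen[f] and l in routes[f])
            for l in capacity
            if any(not frozen[f] and l in routes[f] for f in range(n))
        )
        for f in range(n):
            if not frozen[f]:
                rates[f] += inc
        # Freeze every flow crossing a link that is now saturated.
        for l in capacity:
            if sum(rates[f] for f in range(n) if l in routes[f]) >= capacity[l] - 1e-9:
                for f in range(n):
                    if l in routes[f]:
                        frozen[f] = True
    return rates

# Link L1 (capacity 1) carries flows 0 and 1; link L2 (capacity 2) flows 0 and 2.
# Flows 0 and 1 split L1 equally (0.5 each); flow 2 takes L2's remainder (1.5).
print(max_min_fair({"L1": 1.0, "L2": 2.0}, [["L1", "L2"], ["L1"], ["L2"]]))
```

The paper's feedback mechanisms are assumed to converge to such an allocation instantaneously between arrivals and departures; its stability result is about the stochastic connection-level dynamics on top of that.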
Learning near-optimal policies with Bellman-residual minimization based fitted policy iteration and a single sample path. Machine Learning Journal, 71:89-129, 2008.