Results 1 - 10 of 9,956
The Marginalized Likelihood Ratio Test For Detecting Abrupt Changes
- IEEE Trans. Automatic Control, 1996
"... The generalized likelihood ratio (GLR) test is a widely used method for detecting abrupt changes in linear systems and signals. In this paper the marginalized likelihood ratio (MLR) test is introduced for eliminating three shortcomings of GLR, while preserving its applicability and generality. First ..."
Cited by 31 (3 self)
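As a rough sketch of the statistic this entry is about (the notation k, ν, and y_{1:N} is assumed here, not taken from the paper), the GLR test for a jump of unknown size ν at a candidate time k compares maximized likelihoods,
\[
\mathrm{GLR}(k) \;=\; 2\Big[\sup_{\nu}\,\log p\big(y_{1:N}\mid \text{jump } \nu \text{ at } k\big) \;-\; \log p\big(y_{1:N}\mid \text{no jump}\big)\Big],
\]
and signals a change when \(\max_k \mathrm{GLR}(k)\) exceeds a threshold. The marginalized variant replaces the maximization over ν with an integration against a prior, which is the general idea behind removing the nuisance jump magnitude.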
Stochastic volatility: likelihood inference and comparison with ARCH models
- Review of Economic Studies, 1998
"... In this paper, Markov chain Monte Carlo sampling methods are exploited to provide a unified, practical likelihood-based framework for the analysis of stochastic volatility models. A highly effective method is developed that samples all the unobserved volatilities at once using an approximating offse ..."
"... -nested likelihood ratios and Bayes factors is also investigated. These methods are used to compare the fit of stochastic volatility and GARCH models. All the procedures are illustrated in detail. ..."
Cited by 592 (40 self)
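For context on what the "unobserved volatilities" are, the canonical log-normal stochastic volatility model used in this literature (notation assumed here) is
\[
y_t = e^{h_t/2}\,\varepsilon_t,\qquad h_{t+1} = \mu + \phi\,(h_t - \mu) + \sigma_\eta\,\eta_t,\qquad \varepsilon_t,\,\eta_t \sim \mathcal{N}(0,1),
\]
where the latent log-volatility series \(h_1,\dots,h_T\) is what the MCMC scheme samples.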
PAML: a program package for phylogenetic analysis by maximum likelihood
- Comput. Appl. Biosci. 13:555–556, 1997
"... PAML, currently in version 1.2, is a package of programs for phylogenetic analyses of DNA and protein sequences using the method of maximum likelihood (ML). The programs can be used for (i) maximum likelihood estimation of evolutionary parameters such as branch lengths in a phylogenetic tree, the tr ..."
Abstract
-
Cited by 1459 (17 self)
- Add to MetaCart
, the transition/transversion rate ratio, the shape parameter of the gamma distribution for variable evolutionary rates at sites, and rate parameters for different genes; (ii) likelihood ratio test of hypotheses concerning sequence evolution, such as rate con-stancy and independence among sites and rate constancy
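The likelihood ratio tests mentioned in (ii) follow the generic recipe (sketched here, not quoted from the paper): fit a constrained and an unconstrained model by maximum likelihood and compare twice the log-likelihood difference with a chi-square distribution,
\[
2\Delta\ell \;=\; 2\,(\ell_1 - \ell_0) \;\sim\; \chi^2_{df}\ \text{(approximately)},
\]
with degrees of freedom equal to the number of extra free parameters in the richer model (e.g. testing a molecular clock against unconstrained branch rates).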
Speaker verification using Adapted Gaussian mixture models
- Digital Signal Processing, 2000
"... In this paper we describe the major elements of MIT Lincoln Laboratory’s Gaussian mixture model (GMM)-based speaker verification system used successfully in several NIST Speaker Recognition Evaluations (SREs). The system is built around the likelihood ratio test for verification, using simple but ef ..."
Cited by 1010 (42 self)
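The verification score in this kind of GMM system is typically a log-likelihood ratio between a target model and a background model (symbols assumed here, not quoted from the paper),
\[
\Lambda(X) \;=\; \log p\big(X \mid \lambda_{\text{target}}\big) \;-\; \log p\big(X \mid \lambda_{\text{background}}\big),
\]
with the claimed identity accepted when \(\Lambda(X)\) exceeds a decision threshold.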
Gaussian processes for machine learning
, 2003
"... We give a basic introduction to Gaussian Process regression models. We focus on understanding the role of the stochastic process and how it is used to define a distribution over functions. We present the simple equations for incorporating training data and examine how to learn the hyperparameters us ..."
Abstract
-
Cited by 720 (2 self)
- Add to MetaCart
using the marginal likelihood. We explain the practical advantages of Gaussian Process and end with conclusions and a look at the current trends in GP work.
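The marginal likelihood referred to here is, for GP regression with Gaussian noise of variance \(\sigma_n^2\) (standard form, notation assumed),
\[
\log p(\mathbf{y}\mid X,\theta) \;=\; -\tfrac{1}{2}\,\mathbf{y}^{\top}\big(K_\theta + \sigma_n^2 I\big)^{-1}\mathbf{y} \;-\; \tfrac{1}{2}\log\big|K_\theta + \sigma_n^2 I\big| \;-\; \tfrac{n}{2}\log 2\pi,
\]
which is maximized with respect to the kernel hyperparameters \(\theta\) and the noise variance.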
Statistical Analysis of Cointegration Vectors
- Journal of Economic Dynamics and Control, 1988
"... We consider a nonstationary vector autoregressive process which is integrated of order 1, and generated by i.i.d. Gaussian errors. We then derive the maximum likelihood estimator of the space of cointegration vectors and the likelihood ratio test of the hypothesis that it has a given number of dimen ..."
Cited by 2749 (12 self)
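The likelihood ratio test in this line of work is usually reported as the trace statistic (sketched here from the standard formulation, not quoted from the paper): for the hypothesis of at most r cointegration vectors in a p-dimensional system,
\[
\lambda_{\text{trace}}(r) \;=\; -\,T \sum_{i=r+1}^{p} \ln\big(1 - \hat{\lambda}_i\big),
\]
where \(\hat{\lambda}_{r+1},\dots,\hat{\lambda}_p\) are the smallest eigenvalues from the reduced-rank regression and T is the sample size.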
Accurate Methods for the Statistics of Surprise and Coincidence
- Computational Linguistics, 1993
"... Much work has been done on the statistical analysis of text. In some cases reported in the literature, inappropriate statistical methods have been used, and statistical significance of results have not been addressed. In particular, asymptotic normality assumptions have often been used unjustifiably ..."
Abstract
-
Cited by 1057 (1 self)
- Add to MetaCart
unjustifiably, leading to flawed results.This assumption of normal distribution limits the ability to analyze rare events. Unfortunately rare events do make up a large fraction of real text.However, more applicable methods based on likelihood ratio tests are available that yield good results with relatively
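The likelihood ratio statistic widely used following this paper is, for a contingency table of observed counts \(O_{ij}\) and expected counts \(E_{ij}\) under independence,
\[
G^2 \;=\; 2 \sum_{i,j} O_{ij}\,\ln\frac{O_{ij}}{E_{ij}},
\]
with the convention \(0\ln 0 = 0\). Unlike Pearson's chi-square, it behaves reasonably when expected counts are small, which is the rare-event regime the abstract describes.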
Graphical models, exponential families, and variational inference
, 2008
"... The formalism of probabilistic graphical models provides a unifying framework for capturing complex dependencies among random variables, and building large-scale multivariate statistical models. Graphical models have become a focus of research in many statistical, computational and mathematical fiel ..."
"... likelihoods, marginal probabilities and most probable configurations. We describe how a wide variety of algorithms — among them sum-product, cluster variational methods, expectation-propagation, mean field methods, max-product and linear programming relaxation, as well as conic programming relaxations — can ..."
Cited by 819 (28 self)
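A compact way to state the variational theme of this monograph (standard exponential-family notation, assumed here): the log-partition function has the dual representation
\[
A(\theta) \;=\; \sup_{\mu \in \mathcal{M}} \big\{ \langle \theta, \mu \rangle \;-\; A^{*}(\mu) \big\},
\]
where \(\mathcal{M}\) is the set of realizable mean parameters and \(A^{*}\) is the conjugate dual of A; the algorithms listed in the excerpt can be read as different tractable approximations to \(\mathcal{M}\) and \(A^{*}\).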
Normalization for cDNA microarray data: a robust composite method addressing single and multiple slide systematic variation
, 2002
"... There are many sources of systematic variation in cDNA microarray experiments which affect the measured gene expression levels (e.g. differences in labeling efficiency between the two fluorescent dyes). The term normalization refers to the process of removing such variation. A constant adjustment is ..."
Abstract
-
Cited by 718 (9 self)
- Add to MetaCart
is often used to force the distribution of the intensity log ratios to have a median of zero for each slide. However, such global normalization approaches are not adequate in situations where dye biases can depend on spot overall intensity and/or spatial location within the array. This article proposes
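In the usual two-color notation (assumed here, not quoted from the paper), each spot yields a log ratio and a log intensity,
\[
M = \log_2\!\frac{R}{G},\qquad A = \tfrac{1}{2}\log_2(R\,G),
\]
and normalization subtracts an estimated bias from M: a single constant (e.g. the slide-wide median of M) for global normalization, or an intensity-dependent fit \(\hat{c}(A)\) (e.g. a loess curve) for the location- and intensity-dependent corrections the excerpt alludes to, giving \(M' = M - \hat{c}(A)\).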