CiteSeerX
Results 1 - 10 of 6,869

The Information Criterion

by Masume Ghahramani
"... The Akaike information criterion, AIC, is widely used for model selection. Using the AIC as the estimator of asymptotic unbias for the second term Kullbake-Leibler risk considers the divergence between the true model and offered models. However, it is an inconsistent estimator. A proposed approach t ..."
Abstract - Add to MetaCart
The Akaike information criterion, AIC, is widely used for model selection. Using the AIC as the estimator of asymptotic unbias for the second term Kullbake-Leibler risk considers the divergence between the true model and offered models. However, it is an inconsistent estimator. A proposed approach
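
For reference, the standard definition of the criterion discussed here (a textbook fact, not taken from this abstract): for a candidate model with k free parameters and maximized likelihood \hat{L},

    \mathrm{AIC} = -2 \ln \hat{L} + 2k

Minimizing AIC over candidates approximately minimizes the expected Kullback-Leibler divergence to the true model; the inconsistency noted above means that even as the sample size grows, AIC keeps a nonvanishing probability of selecting an overparameterized model.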

Information Criterion

by Adam Butler, Ruth M. Doherty
"... Model averaging to combine simulations of future global vegetation carbon stocks ..."
Abstract - Add to MetaCart
Model averaging to combine simulations of future global vegetation carbon stocks

Model Selection and Model Averaging in Phylogenetics: Advantages of Akaike Information Criterion and Bayesian Approaches Over Likelihood Ratio Tests

by David Posada, Thomas R. Buckley, 2004
"... Model selection is a topic of special relevance in molecular phylogenetics that affects many, if not all, stages of phylogenetic inference. Here we discuss some fundamental concepts and techniques of model selection in the context of phylogenetics. We start by reviewing different aspects of the sel ..."
Abstract - Cited by 407 (8 self)
"... selection in phylogenetics, and that approaches like the Akaike Information Criterion (AIC) and Bayesian methods offer important advantages. In particular, the latter two methods are able to simultaneously compare multiple nested or nonnested models, assess model selection uncertainty, and allow ..."
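
As background on how AIC supports the multi-model comparison and averaging described above (standard formulas, not quoted from the paper): with \mathrm{AIC}_i the value for candidate model i and \mathrm{AIC}_{\min} the smallest value,

    \Delta_i = \mathrm{AIC}_i - \mathrm{AIC}_{\min}, \qquad w_i = \frac{\exp(-\Delta_i/2)}{\sum_j \exp(-\Delta_j/2)}

The Akaike weights w_i act like model probabilities: they quantify selection uncertainty, serve as weights for model-averaged estimates, and apply equally to nested and nonnested candidates.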

The Residual Information Criterion, Corrected

by Chenlei Leng
"... Shi and Tsai (JRSSB, 2002) proposed an interesting residual information criterion (RIC) for model selection in regression. Their RIC was motivated by the principle of minimizing the Kullback-Leibler discrepancy between the residual likelihoods of the true and candidate model. We show, however, under ..."
Abstract - Add to MetaCart
Shi and Tsai (JRSSB, 2002) proposed an interesting residual information criterion (RIC) for model selection in regression. Their RIC was motivated by the principle of minimizing the Kullback-Leibler discrepancy between the residual likelihoods of the true and candidate model. We show, however

Speaker, Environment And Channel Change Detection And Clustering Via The Bayesian Information Criterion

by Scott Shaobing Chen, P. S. Gopalakrishnan, 1998
"... In this paper, we are interested in detecting changes in speaker identity, environmental condition and channel condition; we call this the problem of acoustic change detection. The input audio stream can be modeled as a Gaussian process in the cepstral space. We present a maximum likelihood approach ..."
Abstract - Cited by 272 (2 self)
"... approach to detect turns of a Gaussian process; the decision of a turn is based on the Bayesian Information Criterion (BIC), a model selection criterion well known in the statistics literature. The BIC can also be applied as a termination criterion in hierarchical methods for clustering of audio ..."
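
A minimal sketch of the ΔBIC change-point test this line of work popularized (the function name, interface, and tuning factor lam are illustrative assumptions, not the paper's code): model the whole window with one full-covariance Gaussian versus two Gaussians split at frame i, and penalize the extra parameters.

    import numpy as np

    def delta_bic(frames, i, lam=1.0):
        # frames: (n, d) array of feature vectors (e.g. cepstra);
        # i: candidate change point, with i and n - i both comfortably > d
        # so the segment covariances are nonsingular.
        n, d = frames.shape

        def half_n_logdet(x):
            # (len(x)/2) * log|Sigma| for the ML covariance of segment x
            cov = np.cov(x, rowvar=False, bias=True)
            return 0.5 * len(x) * np.linalg.slogdet(cov)[1]

        # Penalty for the extra mean (d) and covariance (d(d+1)/2)
        # parameters introduced by the two-Gaussian hypothesis.
        penalty = 0.5 * (d + d * (d + 1) / 2) * np.log(n)

        # Positive values favor a change point at frame i.
        return (half_n_logdet(frames)
                - half_n_logdet(frames[:i])
                - half_n_logdet(frames[i:])
                - lam * penalty)

Scanning i over the window and declaring a change at the argmax when the value is positive gives the basic detector; applying the same comparison to pairs of clusters yields the BIC termination rule for hierarchical clustering mentioned in the snippet.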

The Focussed Information Criterion

by Gerda Claeskens, Nils Lid Hjort
"... Abstract. A variety of model selection criteria have been developed, of general and specific types. Most of these aim at selecting a single model with good overall properties, e.g. formulated via average prediction quality or shortest estimated overall distance to the in some sense true model. The A ..."
Abstract - Cited by 1 (1 self)
"... the precision of any submodel-based estimator. The framework is that of large-sample likelihood inference. Using an unbiased estimate of limiting risk, we propose a focussed information criterion for model selection, the FIC. We investigate and discuss properties of the method, establish some connections ..."
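
Schematically (a paraphrase of the idea in this abstract, not the paper's notation): for a user-chosen focus parameter \mu and each candidate submodel S, the FIC estimates the limiting risk of the submodel estimator \hat{\mu}_S,

    \mathrm{FIC}(S) \approx \widehat{\mathrm{mse}}(\hat{\mu}_S) = \widehat{\mathrm{bias}}^2(\hat{\mu}_S) + \widehat{\mathrm{var}}(\hat{\mu}_S)

and one selects the submodel minimizing this estimate. Because the bias-variance trade-off depends on \mu, different focus parameters can legitimately lead to different selected models.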

Kernel-Based Information Criterion

by Somayeh Danafar, Kenji Fukumizu, Faustino Gomez
"... This paper introduces Kernel-based Information Criterion (KIC) for model selection in regression analysis. The kernel-based complexity measure in KIC efficiently computes the interdependency between parameters of the model using a novel variable-wise variance and yields selection of better, more rob ..."

Kullback-Leibler Information Criterion

by Raffaella Giacomini, Barbara Rossi, 2007
"... We propose new methods for analyzing the relative performance of two competing, misspecified models in the presence of possible data instability. The main idea is to develop a measure of the relative “local performance ” for the two models, and to investigate its stability over time by means of stat ..."
Abstract - Add to MetaCart
of statistical tests. The models ’ performance can be evaluated using either in-sample or out-of-sample criteria. In the former case, we suggest using the local Kullback-Leibler information criterion, whereas in the latter, we consider the local out-of-sample forecast loss, for a general loss function. We
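
For reference (the standard notion underlying the Kullback-Leibler information criterion, stated generically rather than in the paper's local form): the divergence from a candidate density g to the true density f is

    \mathrm{KL}(f \,\|\, g) = \mathbb{E}_f\!\left[\log \frac{f(Y)}{g(Y)}\right]

so the relative KLIC of two misspecified models g_1 and g_2 reduces to \mathbb{E}_f[\log g_1(Y)] - \mathbb{E}_f[\log g_2(Y)], which can be estimated by the sample average of log-likelihood differences.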

Subspace information criterion for model selection

by Masashi Sugiyama, Hidemitsu Ogawa - Neural Computation, 2001
"... The problem of model selection is considerably important for acquiring higher levels of generalization capability in supervised learning. In this paper, we propose a new criterion for model selection called the subspace information criterion (SIC), which is a generalization of Mallows ’ C L. It is a ..."
Abstract - Cited by 58 (31 self)
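
As background (Mallows' criterion that SIC generalizes, in its standard form for a linear estimator \hat{y} = Ly of n observations with noise variance \sigma^2; a known result, not taken from this abstract):

    C_L = \|y - Ly\|^2 + 2\sigma^2\,\mathrm{tr}(L) - n\sigma^2

This is an unbiased estimate of the prediction risk \mathbb{E}\|Ly - \mathbb{E}[y]\|^2; for least squares on p regressors, \mathrm{tr}(L) = p and it reduces to the familiar C_p. SIC, per the abstract, extends this unbiased-risk-estimation idea.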

An Information Criterion for Likelihood Selection

by A. Yuan, B. Clarke - IEEE Trans. Inform. Theory, 1999
"... For a given source distribution, we establish properties of the conditional density achieving the rate distortion function lower bound as the distortion parameter varies. In the limit as the distortion tolerated goes to zero, the conditional density achieving the rate distortion function lower bound ..."
Abstract - Cited by 7 (1 self)
"... it is impossible to identify a 'true' parametric family on the basis of physical modeling, our results provide both data compression and channel coding justification for using the conditional density achieving the rate distortion function lower bound as a likelihood. Index Terms -- Mutual information, rate ..."
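
For reference (the standard definition behind the abstract's construction; generic notation, not the paper's): for source X, reconstruction Y, and distortion measure d, the rate distortion function is

    R(D) = \min_{p(y \mid x)\,:\; \mathbb{E}[d(X,Y)] \le D} I(X; Y)

and the conditional density p(y | x) achieving this minimum is the object the abstract proposes to use as a likelihood; the zero-distortion limit D \to 0 is the regime in which its properties are studied.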