Results 1 - 10 of 655
Kullback-Leibler Information Criterion
, 2007
"... We propose new methods for analyzing the relative performance of two competing, misspecified models in the presence of possible data instability. The main idea is to develop a measure of the relative “local performance ” for the two models, and to investigate its stability over time by means of stat ..."
Abstract
- Add to MetaCart
of statistical tests. The models ’ performance can be evaluated using either in-sample or out-of-sample criteria. In the former case, we suggest using the local Kullback-Leibler information criterion, whereas in the latter, we consider the local out-of-sample forecast loss, for a general loss function. We
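A minimal sketch of the "local relative performance" idea the snippet describes: by standard arguments, the mean difference of per-observation log-likelihoods of two fitted models estimates the difference of their Kullback-Leibler discrepancies from the truth, and computing it over a rolling window makes the comparison local in time. The function name and window choice here are illustrative, not the authors':

```python
import numpy as np

def local_loglik_ratio(loglik_a, loglik_b, window=50):
    """Rolling-window mean of per-observation log-likelihood differences.

    loglik_a, loglik_b: 1-D arrays of pointwise log-likelihoods from two
    fitted (possibly misspecified) models. The full-sample mean estimates
    the difference of the models' KL discrepancies from the truth; a
    rolling window makes that comparison local, so drift in the resulting
    curve signals instability in relative performance over time.
    """
    d = np.asarray(loglik_a, dtype=float) - np.asarray(loglik_b, dtype=float)
    kernel = np.ones(window) / window
    return np.convolve(d, kernel, mode="valid")
```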
Image Recognition Using Kullback-Leibler Information Discrimination
"... Abstract. The problem of automatic image recognition based on the minimum information discrimination principle is formulated and solved. Color histograms comparison in the Kullback–Leibler information metric is proposed. It’s combined with method of directed enumeration alternatives as opposed to co ..."
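A minimal sketch of histogram matching by minimum information discrimination, assuming normalized color histograms as feature vectors; the directed-enumeration speedup mentioned in the snippet is not reproduced here, only the brute-force KL nearest-neighbor baseline:

```python
import numpy as np

def kl_divergence(p, q, eps=1e-10):
    """Kullback-Leibler information D(p || q) for discrete histograms.

    eps-smoothing keeps empty bins from producing log(0) or division by zero.
    """
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))

def nearest_by_kl(query_hist, reference_hists):
    """Index of the reference color histogram with minimum KL information
    discrimination from the query (brute-force nearest neighbor)."""
    return int(np.argmin([kl_divergence(query_hist, h) for h in reference_hists]))
```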
Bootstrap estimate of Kullback-Leibler information for model selection
- Statistica Sinica
, 1997
"... Estimation of Kullback-Leibler amount of information is a crucial part of deriving a statistical model selection procedure which is based on likelihood principle like AIC. To discriminate nested models, we have to estimate it up to the order of constant while the Kullback-Leibler information itself ..."
Cited by 27 (0 self)
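The snippet's idea, sketched under assumptions: instead of AIC's fixed parameter-count penalty, the optimism (bias) of the maximized log-likelihood is estimated by refitting on bootstrap resamples, in the spirit of EIC-type criteria. The `fit` and `loglik` hooks are hypothetical, and the paper's exact estimator may differ:

```python
import numpy as np

def bootstrap_bias_criterion(data, fit, loglik, n_boot=200, seed=None):
    """AIC-like criterion with a bootstrap estimate of the optimism of the
    maximized log-likelihood, in place of AIC's parameter-count penalty.

    data: 1-D NumPy array of observations.
    fit(data) -> parameter estimate; loglik(theta, data) -> total log-likelihood.
    """
    rng = np.random.default_rng(seed)
    theta_hat = fit(data)
    n = len(data)
    biases = []
    for _ in range(n_boot):
        boot = data[rng.integers(0, n, size=n)]   # resample with replacement
        theta_b = fit(boot)
        # Optimism: fit of the refitted model on its own resample minus
        # its fit on the original sample.
        biases.append(loglik(theta_b, boot) - loglik(theta_b, data))
    return -2.0 * (loglik(theta_hat, data) - np.mean(biases))
```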
GRADE ESTIMATION OF KULLBACK-LEIBLER INFORMATION NUMBER
"... Abstract. An estimator of the Kullback-Leibler information number by using its representation as a functional of the grade density is introduced. Its strong consistency is proved under the mild conditions on the grade density. The same approach is used to study the entropy measure of bivariate depen ..."
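The grade-density representation the snippet refers to follows from the substitution u = G(x): with densities f and g, G the distribution function of g, and grade density r, the Kullback-Leibler information number becomes a functional of r alone, so any estimator of the grade density yields a plug-in estimator of K:

```latex
K(f, g) \;=\; \int f(x)\,\log\frac{f(x)}{g(x)}\,dx
        \;=\; \int_0^1 r(u)\,\log r(u)\,du,
\qquad
r(u) \;=\; \frac{f\!\bigl(G^{-1}(u)\bigr)}{g\!\bigl(G^{-1}(u)\bigr)} .
```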
Maximum likelihood estimator and Kullback-Leibler information in misspecified Markov chain models
- Teor. Veroyatnost. i Primenen
"... Suppose we have specified a parametric model for the transition distribution of a Markov chain, but that the true transition distribution does not belong to the model. Then the maximum likelihood estimator estimates the parameter which maximizes the Kullback--Leibler information between the true tra ..."
Cited by 6 (4 self)
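In the standard formulation of this result (the snippet's phrasing is truncated), the MLE in a misspecified model converges to the pseudo-true parameter, which maximizes the expected log-likelihood under the true chain, or equivalently minimizes the Kullback-Leibler discrepancy between the true and modeled transition distributions. The notation below (stationary law π of the true chain, true transition density p) is assumed:

```latex
\theta^{*}
 \;=\; \arg\max_{\theta}\; \mathbb{E}_{\pi}\bigl[\log p_{\theta}(X_1 \mid X_0)\bigr]
 \;=\; \arg\min_{\theta}\; \mathbb{E}_{\pi}\Bigl[\operatorname{KL}\bigl(p(\cdot \mid X_0)\,\big\|\,p_{\theta}(\cdot \mid X_0)\bigr)\Bigr].
```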
Automatic Selection of Parameters in Spline Regression via Kullback-Leibler Information
, 1993
"... Based on Kullback-Leibler information we propose a data-driven selector, called GAIC (c) , for choosing parameters of regression splines in nonparametric regression via a stepwise forward/backward knot placement and deletion strategy [1] . This criterion unifies the commonly used information cr ..."
Cited by 2 (2 self)
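A sketch of a GAIC(c)-style forward knot search, assuming (as the name suggests, though the snippet does not spell it out) that GAIC(c) is a generalized information criterion of the form -2 log-likelihood + c × (number of parameters), with c = 2 recovering AIC and c = log n recovering BIC; `design` is a hypothetical spline-basis builder and the backward deletion pass is omitted:

```python
import numpy as np

def gaic(rss, n, k, c=2.0):
    """GAIC(c) under a Gaussian working likelihood, up to a constant:
    n*log(RSS/n) stands in for -2*loglik, penalized by c per parameter.
    Assumption: c = 2 recovers AIC and c = log(n) recovers BIC."""
    return n * np.log(rss / n) + c * k

def forward_knot_selection(x, y, candidate_knots, design, c=2.0):
    """Greedy forward knot placement for a regression spline: repeatedly
    add the candidate knot that lowers GAIC(c) the most; stop when none
    does. `design(x, knots)` is a hypothetical hook returning the spline
    basis matrix for the given knot set."""
    knots, n = [], len(y)
    best = gaic(float(np.sum((y - y.mean()) ** 2)), n, 1, c)  # intercept-only
    improved = True
    while improved and candidate_knots:
        improved = False
        scores = []
        for t in candidate_knots:
            X = design(x, sorted(knots + [t]))
            beta, *_ = np.linalg.lstsq(X, y, rcond=None)
            rss = float(np.sum((y - X @ beta) ** 2))
            scores.append((gaic(rss, n, X.shape[1], c), t))
        score, t = min(scores)
        if score < best:
            best, improved = score, True
            knots.append(t)
            candidate_knots = [u for u in candidate_knots if u != t]
    return sorted(knots), best
```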
STATE SPACE TIME SERIES CLUSTERING USING DISCREPANCIES BASED ON THE KULLBACK-LEIBLER INFORMATION AND THE MAHALANOBIS DISTANCE
, 2012
"... State space time series clustering using discrepancies based on the Kullback-Leibler information and the Mahalanobis distance ..."
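The snippet shows only the title, but the connection between the two discrepancies it names is visible in the standard identity for the KL information between two d-dimensional Gaussians, whose middle term is the squared Mahalanobis distance between the means (the paper's exact discrepancies are not shown in the snippet):

```latex
\operatorname{KL}\bigl(\mathcal{N}(\mu_1,\Sigma_1)\,\big\|\,\mathcal{N}(\mu_2,\Sigma_2)\bigr)
 \;=\; \tfrac{1}{2}\Bigl[\operatorname{tr}\bigl(\Sigma_2^{-1}\Sigma_1\bigr)
 + (\mu_2-\mu_1)^{\top}\Sigma_2^{-1}(\mu_2-\mu_1)
 - d + \log\frac{\det\Sigma_2}{\det\Sigma_1}\Bigr].
```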
Kullback-Leibler Information in Multidimensional Adaptive Testing: Theory and Application
"... Abstract Built on multidimensional item response theory (MIRT), multidimensional adaptive testing (MAT) can, in principle, provide a promising choice to ensuring efficient estimation of each ability dimension in a multidimensional vector. Currently, two item selection procedures have been developed ..."
Abstract
- Add to MetaCart
developed for MAT, one based on Fisher information embedded within a Bayesian framework, and the other powered by Kullback-Leibler (KL) information. It is well-known that in unidimensional IRT that the second derivative of KL information (also termed "global information") is Fisher information
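The relation the snippet cites can be stated for a dichotomous item with response probability P_j(θ): the item's KL information between a true ability θ0 and a candidate θ is the expected log-likelihood ratio, and its second derivative at θ = θ0 recovers the item's Fisher information I_j(θ0):

```latex
K_j(\theta_0 \,\|\, \theta)
 \;=\; P_j(\theta_0)\log\frac{P_j(\theta_0)}{P_j(\theta)}
 + \bigl(1 - P_j(\theta_0)\bigr)\log\frac{1 - P_j(\theta_0)}{1 - P_j(\theta)},
\qquad
\left.\frac{\partial^{2}}{\partial\theta^{2}}\,K_j(\theta_0 \,\|\, \theta)\right|_{\theta=\theta_0}
 \;=\; I_j(\theta_0).
```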
Parsimonious Estimation of Multiplicative Interaction in Analysis of Variance using Kullback-Leibler Information
- Journal of Statistical Planning and Inference
, 1999
"... Many standard methods for modeling interaction in two way ANOVA require mn interaction parameters, where m and n are the number of rows and columns in the table. By viewing the interaction parameters as a matrix and performing a singular value decomposition, one arrives at the Additive Main Effec ..."
Abstract
-
Cited by 4 (1 self)
- Add to MetaCart
Many standard methods for modeling interaction in two way ANOVA require mn interaction parameters, where m and n are the number of rows and columns in the table. By viewing the interaction parameters as a matrix and performing a singular value decomposition, one arrives at the Additive Main Effects and Multiplicative Interaction (AMMI) model which is commonly used in agriculture. By using only those interaction components with the largest singular values, one can produce an estimate of interaction that requires far fewer than mn parameters while retaining most of the explanatory power of standard methods. The central inference problems of estimating the parameters and determining the number of interaction components has been difficult except in "ideal" situations (equal cell sizes, equal variance, etc.). The Bayesian methodology developed in this paper applies for unequal sample sizes and heteroscedastic data, and may be easily generalized to more complicated data structures...
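A least-squares sketch of the AMMI construction the abstract describes (balanced, homoscedastic case only; the paper's Bayesian treatment of unequal sample sizes and heteroscedasticity is not reproduced): subtract the additive main effects from the table of cell means, then keep the leading singular components of the residual matrix as the parsimonious interaction estimate:

```python
import numpy as np

def ammi_interaction(cell_means, n_components=1):
    """Rank-k interaction estimate via truncated SVD (AMMI-style).

    cell_means: m x n table of cell means (balanced, homoscedastic case).
    Remove the grand mean and additive row/column main effects, then keep
    only the leading singular components of the interaction residuals.
    """
    Z = np.asarray(cell_means, dtype=float)
    grand = Z.mean()
    row = Z.mean(axis=1, keepdims=True) - grand    # row main effects
    col = Z.mean(axis=0, keepdims=True) - grand    # column main effects
    resid = Z - grand - row - col                  # classical interaction residuals
    U, s, Vt = np.linalg.svd(resid, full_matrices=False)
    k = n_components
    return (U[:, :k] * s[:k]) @ Vt[:k, :]          # far fewer than mn parameters
```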