Results 1 - 9 of 9
J.P.W.: Registration of cervical MRI using multifeature mutual information. IEEE Transactions on Medical Imaging, 2009
Cited by 7 (0 self)
Abstract:
Radiation therapy for cervical cancer can benefit from image registration in several ways, for example by studying the motion of organs, or by (partially) automating the delineation of the target volume and other structures of interest. In this paper, the registration of cervical data is addressed using mutual information (MI) of not only image intensity, but also features that describe local image structure. Three aspects of the registration are addressed to make this approach feasible. Firstly, instead of relying on a histogram-based estimation of mutual information, which poses problems for a larger number of features, a graph-based implementation of α-mutual information (α-MI) is employed. Secondly, the analytical derivative of α-MI is derived. This makes it possible to use a stochastic gradient descent method to solve the registration problem, which is substantially faster than non-derivative-based methods. Thirdly, the feature space is reduced by means of a principal component analysis, which also decreases the registration time. The proposed technique is compared to a standard approach, based on the mutual information of image intensity only. Experiments are performed on 93 T2-weighted MR clinical data sets acquired from 19 patients with cervical cancer. Several characteristics of the proposed algorithm are studied on a subset of 19 image pairs (one pair per patient). On the remaining data (36 image pairs, one or two pairs per patient) the median overlap is shown to improve significantly compared to standard MI: from 0.85 to 0.86 for the clinical target volume (CTV, p = 2·10^-2), from 0.75 to 0.81 for the bladder (p = 8·10^-6), and from 0.76 to 0.77 for the rectum (p = 2·10^-4). The registration error is improved at important tissue interfaces, such as that of the bladder with the CTV, and the interface of the rectum with the uterus and cervix.
Index Terms — nonrigid registration, cervical cancer, radiation therapy, Shannon mutual information, α-mutual information, local image structure, kNN graphs
GLOBAL PERFORMANCE PREDICTION FOR DIVERGENCE-BASED IMAGE REGISTRATION CRITERIA
Cited by 6 (1 self)
Abstract:
Divergence measures find application in many areas of statistics, signal processing and machine learning, motivating the need for good estimators of divergence measures. While several estimators of divergence measures have been proposed in the literature, the performance of these estimators is not known. We propose a simple plug-in estimator for divergence measures based on kNN density estimation. Based on the properties of kNN density estimates, we derive the bias, variance and mean square error of the estimator in terms of the sample size, the dimension of the samples and the underlying probability distribution. Based on these results, we specify the optimal choice of tuning parameters for minimum mean square error. We also present results on convergence in distribution of the proposed estimator. These results establish a basis for analyzing the performance of image registration methods that maximize divergence.

Index Terms — divergence estimation, performance characterization, plug-in estimators, kNN density estimators
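The kNN plug-in divergence estimators this entry describes can be illustrated with a minimal sketch. The function below is a generic k-nearest-neighbor estimate of the Kullback–Leibler divergence in the style of Wang, Kulkarni and Verdú, not the specific estimator analyzed in the paper; the name `knn_kl_divergence` and its parameters are illustrative only.

```python
import numpy as np
from scipy.spatial import cKDTree

def knn_kl_divergence(x, y, k=5):
    """kNN plug-in estimate of KL(p || q) from samples x ~ p and y ~ q.

    Compares, for each x_i, its k-th nearest-neighbor distance within x
    (rho) to its k-th nearest-neighbor distance among y (nu).
    """
    n, d = x.shape
    m = y.shape[0]
    rho, _ = cKDTree(x).query(x, k=k + 1)   # k+1 because the first hit is x_i itself
    nu, _ = cKDTree(y).query(x, k=k)        # k-th NN of each x_i among the y sample
    return (d / n) * np.sum(np.log(nu[:, -1] / rho[:, -1])) + np.log(m / (n - 1))
```

For two unit-variance Gaussians whose means differ by 1, the true KL divergence is 0.5; the estimate approaches this as the sample size grows.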
k-nearest neighbor estimation of entropies with confidence, 2011
Cited by 5 (0 self)
Abstract:
We analyze a k-nearest neighbor (kNN) class of plug-in estimators for estimating Shannon entropy and Rényi entropy. Based on the statistical properties of kNN balls, we derive explicit rates for the bias and variance of these plug-in estimators in terms of the sample size, the dimension of the samples and the underlying probability distribution. In addition, we establish a central limit theorem for the plug-in estimator that allows us to specify confidence intervals on the entropy functionals. As an application, we use our theory in anomaly detection problems to specify thresholds for achieving desired false alarm rates.
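A minimal sketch of the kind of kNN entropy estimator analyzed in this entry, using the classical Kozachenko–Leonenko form for Shannon entropy. This is an illustrative stand-in rather than the paper's exact estimator; `knn_entropy` and its parameters are assumed names.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma, gammaln

def knn_entropy(x, k=3):
    """Kozachenko-Leonenko kNN estimate of Shannon entropy, in nats."""
    n, d = x.shape
    tree = cKDTree(x)
    # query k+1 neighbors: the nearest hit is the point itself at distance 0
    eps, _ = tree.query(x, k=k + 1)
    eps_k = eps[:, -1]                       # distance to the k-th true neighbor
    # log volume of the d-dimensional unit ball: pi^(d/2) / Gamma(d/2 + 1)
    log_c_d = (d / 2) * np.log(np.pi) - gammaln(d / 2 + 1)
    return digamma(n) - digamma(k) + log_c_d + d * np.mean(np.log(eps_k))
```

For a standard 1-D Gaussian the true entropy is 0.5·log(2πe) ≈ 1.419 nats, which the estimate should approach for moderate sample sizes.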
Estimation of nonlinear functionals of densities with confidence, 2012
Cited by 4 (0 self)
Abstract:
This paper introduces a class of k-nearest neighbor (kNN) estimators called bipartite plug-in (BPI) estimators for estimating integrals of nonlinear functions of a probability density, such as Shannon entropy and Rényi entropy. The density is assumed to be smooth, have bounded support, and be uniformly bounded from below on this set. Unlike previous kNN estimators of nonlinear density functionals, the proposed estimator uses data-splitting and boundary correction to achieve lower mean square error. Specifically, we assume that T i.i.d. samples Xi ∈ R^d from the density are split into two pieces of cardinality M and N respectively, with M samples used for computing a k-nearest-neighbor density estimate and the remaining N samples used for empirical estimation of the integral of the density functional. By studying the statistical properties of kNN balls, explicit rates for the bias and variance of the BPI estimator are derived in terms of the sample size, the dimension of the samples and the underlying probability distribution. Based on these results, it is possible to specify the optimal choice of the tuning parameters M/T and k for maximizing the rate of decrease of the mean square error (MSE). The resultant optimized BPI estimator converges faster and achieves lower mean squared error than previous kNN entropy estimators. In addition, a central limit theorem is established for the BPI estimator that allows us to specify tight asymptotic confidence intervals.
Color Invariant Object Recognition Using Entropic Graphs, 2005
Cited by 1 (0 self)
Abstract:
We present an object recognition approach using higher-order color invariant features with an entropy-based similarity measure. Entropic graphs offer an unparameterized alternative to common entropy estimation techniques, such as a histogram or assuming a probability distribution. An entropic graph estimates entropy from a spanning graph structure of the sample data. We extract color invariant features from object images that are robust to changes in illumination intensity, viewpoint, and shading. The Henze–Penrose similarity measure is used to estimate the similarity of two images. Our method is evaluated on the ALOI collection, a large collection of object images consisting of 1000 objects recorded under various imaging circumstances. The proposed method is shown to be effective under a wide variety of imaging conditions.
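The Henze–Penrose measure used in this entry can be estimated from the Friedman–Rafsky statistic: build a minimum spanning tree over the two pooled samples and count the edges joining points from different samples. The sketch below uses one common normalization of that statistic and is not the paper's implementation; the function name is assumed.

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.sparse.csgraph import minimum_spanning_tree

def henze_penrose_divergence(x, y):
    """MST (Friedman-Rafsky) estimate of the Henze-Penrose divergence.

    Near 0 when x and y come from the same distribution, near 1 when the
    two samples are well separated.
    """
    m, n = len(x), len(y)
    z = np.vstack([x, y])
    labels = np.r_[np.zeros(m), np.ones(n)]
    # Euclidean minimum spanning tree over the pooled sample
    mst = minimum_spanning_tree(squareform(pdist(z))).tocoo()
    # count MST edges whose endpoints come from different samples
    cross = np.sum(labels[mst.row] != labels[mst.col])
    return 1.0 - cross * (m + n) / (2.0 * m * n)
```

The complementary quantity (one minus the value above) behaves as a similarity, which is how the Henze–Penrose affinity is used for image matching in this entry.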
Empirical estimation of entropy functionals with
Abstract:
Nonparametric estimation of functionals of a density from a finite number of samples is an important tool in domains such as statistics, signal processing and machine learning. While several estimators have been proposed in the literature, the performance of these estimators is not known. We propose a kNN class of plug-in estimators for estimating nonlinear functionals of a density, such as entropy, mutual information and support set dimension. The plug-in estimators are designed to automatically incorporate boundary corrections for densities with finite support. Based on the statistical properties of kNN density estimators, we derive the bias and variance of the plug-in estimator in terms of the sample size, the dimension of the samples and the underlying probability distribution. We also establish a central limit theorem for the plug-in estimators. Based on these results, we specify the optimal choice of tuning parameters for minimum mean square error. The theory is illustrated by applications to problems such as intrinsic dimension estimation and structure discovery in high dimensional data.
Empirical estimation of entropy, 2011
Abstract:
This paper introduces a class of k-nearest neighbor (kNN) estimators called bipartite plug-in (BPI) estimators for estimating integrals of nonlinear functions of a probability density, such as Shannon entropy and Rényi entropy. The density is assumed to be smooth, have bounded support, and be uniformly bounded from below on this set. Unlike previous kNN estimators of nonlinear density functionals, the proposed estimator uses data-splitting and boundary correction to achieve lower mean square error. Specifically, we assume that T i.i.d. samples Xi ∈ R^d from the density are split into two pieces of cardinality M and N respectively, with M samples used for computing a k-nearest-neighbor density estimate and the remaining N samples used for empirical estimation of the integral of the density functional. By studying the statistical properties of kNN balls, explicit rates for the bias and variance of the BPI estimator are derived in terms of the sample size, the dimension of the samples and the underlying probability distribution. Based on these results, it is possible to specify the optimal choice of the tuning parameters M/T and k for maximizing the rate of decrease of the mean square error (MSE). The resultant optimized BPI estimator converges faster and achieves lower mean squared error than previous kNN entropy estimators. In addition, a central limit theorem is established for the BPI estimator that allows us to specify tight asymptotic confidence intervals.
PERFORMANCE-DRIVEN ENTROPIC INFORMATION FUSION
Abstract:
Advances in technology have resulted in the acquisition and subsequent fusion of data from multiple sensors of possibly different modalities. Fusing data acquired from different sensors occurs near the front end of sensing systems and therefore can become a critical bottleneck, so it is crucial to quantify the performance of sensor fusion. Information fusion involves estimating and optimizing an information criterion over a transformation that maps data from one sensor to another. It is crucial to the task of fusion to estimate divergence to a high degree of accuracy and to quantify the error in the estimate. To this end, we propose a class of plug-in estimators based on k-nearest neighbor (kNN) graphs for estimating divergence. For this class of estimators, we derive a large sample theory for the bias and variance and develop a joint central limit theorem for the distribution of the estimators over the domain of the transformation space. In this paper, we apply our theory to two applications: (i) detection of anomalies in wireless sensor networks and (ii) fusion of hyperspectral images of geographic regions using intrinsic dimension.

Index Terms — Information fusion, dimension estimation, entropy estimation, kNN density estimation, plug-in estimation, central limit theorem, confidence intervals