CiteSeerX
Results 1 - 10 of 9,644

Alignment by Maximization of Mutual Information

by Paul A. Viola , 1995
Abstract - Cited by 1011 (13 self)
Abstract not found

Word Association Norms, Mutual Information, and Lexicography

by Kenneth Ward Church, Patrick Hanks , 1990
"... This paper will propose an objective measure based on the information theoretic notion of mutual information, for estimating word association norms from computer readable corpora. (The standard method of obtaining word association norms, testing a few thousand subjects on a few hundred words, is b ..."
Abstract - Cited by 1144 (11 self)
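The association measure Church and Hanks describe compares the joint probability of two words against the product of their marginals. A minimal sketch of that idea, under simplifying assumptions not in the paper (whitespace tokenization, adjacent-word pairs rather than a multi-word window, no smoothing):

```python
import math
from collections import Counter

def pmi_scores(tokens):
    """Pointwise mutual information for adjacent word pairs.

    PMI(x, y) = log2( P(x, y) / (P(x) * P(y)) ), with all probabilities
    estimated from raw corpus frequencies.
    """
    unigrams = Counter(tokens)
    bigrams = Counter(zip(tokens, tokens[1:]))
    n_uni = sum(unigrams.values())
    n_bi = sum(bigrams.values())
    scores = {}
    for (x, y), n_xy in bigrams.items():
        p_xy = n_xy / n_bi
        p_x = unigrams[x] / n_uni
        p_y = unigrams[y] / n_uni
        scores[(x, y)] = math.log2(p_xy / (p_x * p_y))
    return scores

# Toy corpus, purely illustrative
corpus = "strong tea strong coffee strong tea powerful computer".split()
scores = pmi_scores(corpus)
```

Pairs that co-occur more often than their individual frequencies would predict score above zero; the paper's association ratio additionally uses a window of nearby words, which this sketch omits.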

Multimodality Image Registration by Maximization of Mutual Information

by Frederik Maes, André Collignon, Dirk Vandermeulen, Guy Marchal, Paul Suetens - IEEE TRANSACTIONS ON MEDICAL IMAGING , 1997
"... A new approach to the problem of multimodality medical image registration is proposed, using a basic concept from information theory, mutual information (MI), or relative entropy, as a new matching criterion. The method presented in this paper applies MI to measure the statistical dependence or in ..."
Abstract - Cited by 791 (10 self)

Multi-Modal Volume Registration by Maximization of Mutual Information

by William M. Wells, III, Paul Viola, Ron Kikinis , 1996
"... A new information-theoretic approach is presented for finding the registration of volumetric medical images of differing modalities. Registration is achieved by adjustment of the relative pose until the mutual information between images is maximized. In our derivation of the registration procedure, ..."
Abstract - Cited by 458 (23 self)
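The criterion the last two abstracts describe is: adjust the relative pose until the mutual information between the images is maximized. A toy 1-D illustration of that idea, with an exhaustive integer-shift search standing in for the papers' continuous 3-D pose optimization (the signals and search strategy here are illustrative only, not the published methods):

```python
import math
from collections import Counter

def mutual_information(a, b):
    """MI (in bits) between two equal-length intensity sequences,
    estimated from their joint histogram."""
    n = len(a)
    joint = Counter(zip(a, b))
    pa, pb = Counter(a), Counter(b)
    return sum((c / n) * math.log2(c * n / (pa[x] * pb[y]))
               for (x, y), c in joint.items())

def best_shift(fixed, moving, max_shift):
    """1-D registration toy: slide `moving` over `fixed` and keep the
    integer shift whose overlapping samples maximize MI."""
    best, best_mi = 0, float("-inf")
    for s in range(-max_shift, max_shift + 1):
        lo, hi = max(0, s), min(len(fixed), len(moving) + s)
        f = fixed[lo:hi]
        m = moving[lo - s:hi - s]
        if len(f) < 2:
            continue
        mi = mutual_information(f, m)
        if mi > best_mi:
            best, best_mi = s, mi
    return best

# `moving` is `fixed` translated left by two samples (illustrative data)
fixed = [0, 0, 0, 1, 2, 3, 0, 0, 0, 0]
moving = [0, 1, 2, 3, 0, 0, 0, 0, 0, 0]
```

MI needs no assumption about how intensities in one modality map to intensities in the other, which is why it works across modalities where correlation-based criteria fail.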

Estimation of Entropy and Mutual Information

by Liam Paninski , 2003
Abstract - Cited by 237 (8 self)
Abstract not found

Feature selection based on mutual information: Criteria of max-dependency, max-relevance, and min-redundancy

by Hanchuan Peng, Fuhui Long, Chris Ding - IEEE TRANS. PATTERN ANALYSIS AND MACHINE INTELLIGENCE , 2005
"... Feature selection is an important problem for pattern classification systems. We study how to select good features according to the maximal statistical dependency criterion based on mutual information. Because of the difficulty in directly implementing the maximal dependency condition, we first der ..."
Abstract - Cited by 571 (8 self)
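The mRMR criterion named in the title scores a candidate feature by its relevance I(feature; class) minus its redundancy, the average mutual information with already-selected features, and selects greedily. A minimal sketch for discrete features (the data layout and function names are illustrative, not from the paper):

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """I(X; Y) in bits, from empirical joint frequencies."""
    n = len(xs)
    joint = Counter(zip(xs, ys))
    px, py = Counter(xs), Counter(ys)
    return sum((nxy / n) * math.log2(nxy * n / (px[x] * py[y]))
               for (x, y), nxy in joint.items())

def mrmr_select(features, labels, k):
    """Greedy mRMR: at each step pick the feature maximizing
    relevance I(f; labels) minus mean redundancy with the
    already-selected set."""
    selected = []
    remaining = list(features)
    while remaining and len(selected) < k:
        def score(name):
            relevance = mutual_information(features[name], labels)
            if not selected:
                return relevance
            redundancy = sum(mutual_information(features[name], features[s])
                             for s in selected) / len(selected)
            return relevance - redundancy
        best = max(remaining, key=score)
        selected.append(best)
        remaining.remove(best)
    return selected
```

The redundancy term is what distinguishes mRMR from plain relevance ranking: a second copy of an already-selected feature scores zero, so diverse features are preferred.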

Using mutual information for selecting features in supervised neural net learning

by Roberto Battiti - IEEE TRANSACTIONS ON NEURAL NETWORKS , 1994
"... This paper investigates the application of the mutual information criterion to evaluate a set of candidate features and to select an informative subset to be used as input data for a neural network classifier. Because the mutual information measures arbitrary dependencies between random variables, it is ..."
Abstract - Cited by 358 (1 self)

Mutual information

by unknown authors
"... a native speaker in every minute of spoken discourse. Collocation is what makes native speakers' speech idiomatic, fluent and natural. It is also what often renders second language (L2) learners' speech awkward, unnatural and even odd. Indeed, it has been established that L2 learners have problems with collocation in their written and spoken language (Granger, 1998; Howarth, 1998; Nesselhauf, 2003, 2005). Some have argued that L2 learners rely on creativity and make “overliberal assumptions about ..."

Mutual-information-based registration of medical images: a survey

by Josien P. W. Pluim, J. B. Antoine Maintz, Max A. Viergever - IEEE TRANSACTIONS ON MEDICAL IMAGING , 2003
"... An overview is presented of the medical image processing literature on mutual-information-based registration. The aim of the survey is threefold: an introduction for those new to the field, an overview for those working in the field, and a reference for those searching for literature on a specific ..."
Abstract - Cited by 302 (3 self)

Distribution of mutual information

by Marcus Hutter - Advances in Neural Information Processing Systems 14: Proceedings of the 2002 Conference , 2002
"... expectation and variance of mutual information. The mutual information of two random variables i and j with joint probabilities {πij} is commonly used in learning Bayesian nets as well as in many other fields. The chances πij are usually estimated by the empirical sampling frequency nij/n leading to ..."
Abstract - Cited by 50 (12 self)
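The estimate the abstract refers to replaces the unknown chances πij with the sample frequencies nij/n and plugs them into the mutual information formula. A minimal sketch of that plug-in estimator (in nats), taking a contingency table of counts:

```python
import math

def empirical_mi(counts):
    """Plug-in estimate of mutual information (in nats) from a
    contingency table of counts nij, using pi_ij ≈ nij / n."""
    n = sum(sum(row) for row in counts)
    row_tot = [sum(row) for row in counts]
    col_tot = [sum(col) for col in zip(*counts)]
    mi = 0.0
    for i, row in enumerate(counts):
        for j, nij in enumerate(row):
            if nij:  # zero-count cells contribute nothing
                mi += (nij / n) * math.log(nij * n / (row_tot[i] * col_tot[j]))
    return mi
```

An independent table such as [[1, 1], [1, 1]] gives 0, while a perfectly dependent one such as [[2, 0], [0, 2]] gives ln 2. Hutter's point is that this plug-in value is itself a random quantity whose expectation and variance can be computed, rather than being treated as exact.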
Developed at and hosted by The College of Information Sciences and Technology

© 2007-2019 The Pennsylvania State University