
## Multimodality (2001)

### Citations

4673 | A computational approach to edge detection
- Canny
- 1986
Citation Context: ...d in order to eliminate distraction from other red areas. The lip region is extracted using morphological operations on the binary image [4]. Edge information is extracted using a Canny edge detector [5]. To combine the edge information with hue color information, we use the machinery of the Markov random field (MRF). The reason is twofold. First, extraction of lip features recovers the true image fr... |

3777 | Introduction to Statistical Pattern Recognition
- Fukunaga
- 1972
Citation Context: ...in the training phase. This technique produces robust speaker models by maximizing the separation between classes. Polynomial classifiers have been used for pattern classification for many years [12][13], and have excellent properties as classifiers. Because of the Weierstrass approximation theorem, polynomials are universal approximators for the Bayes classifier [12]. The basic structure of our clas... |

791 | Multimodality image registration by maximization of mutual information
- Maes, Collignon, et al.
- 1997
Citation Context: ...Renyi entropy of order q, R_q, was one of the first generalized entropies to gain widespread attention. 13 It is expressed as:

$R_q(X) = \frac{1}{1-q} \ln \sum_{i=1}^{N} p_i^q, \quad q \in \mathbb{R} \setminus \{0, 1\}.$ (6)

As can be verified with L'Hôpital's Rule, R_q specializes to H_S in the limit q → 1. Quadratic Renyi entropy (Renyi entropy of order q = 2) has been used in image registration. 2,10 Like Shannon entr... |
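The cited context defines the Renyi entropy R_q and notes its q → 1 Shannon limit. A minimal Python sketch of that numerical check (illustrative function names, not from any of the cited papers):

```python
import math

def renyi_entropy(p, q):
    """Renyi entropy of order q (eq. 6), natural log."""
    assert q not in (0.0, 1.0)
    return math.log(sum(pi ** q for pi in p)) / (1.0 - q)

def shannon_entropy(p):
    """Shannon entropy H_S, the q -> 1 limit of R_q."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

p = [0.5, 0.25, 0.25]
# As q -> 1, R_q approaches H_S (L'Hopital's rule).
print(renyi_entropy(p, 1.0001), shannon_entropy(p))  # nearly equal
```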

559 | Possible Generalizations of Boltzmann-Gibbs Statistics - Tsallis - 1988 |

545 | A Survey of Medical Image Registration
- Maintz, Viergever
- 1998
Citation Context: ...mation, Iα-information, and Mα-information. 9 One of the most successful of these metrics, Iα-information, is written as 9:

$I_\alpha(X, Y) = \frac{1}{\alpha(\alpha - 1)} \left( \sum_{x,y} \frac{p(x, y)^\alpha}{(p(x)\,p(y))^{\alpha - 1}} - 1 \right).$ (5)

Papers using Renyi entropy and Renyi divergence have also appeared. 2,10-12 The Renyi entropy of order q, R_q, was one of the first generalized entropies to gain widespread attenti... |

457 | Multi-modal volume registration by maximization of mutual information
- Wells, Viola, et al.
- 1996
Citation Context: ...herefore, mutual information using R_q can be defined analogously to that derived from H_S. A normalized variant, I_q^R, is given as:

$I_q^R(X, Y) = \frac{R_q(X) + R_q(Y)}{R_q(X, Y)}.$ (7)

(Proc. of SPIE Vol. 5032, p. 1091.) The Havrda-Charvat entropy is another early, generalized entropy measure. 14 C. Tsallis introduced a functionally equivalent entropy into the physics literature, where it became important ... |

445 | An overlap invariant entropy measure of 3D medical image alignment
- Studholme, DLG, et al.
Citation Context: ...itivity). The Havrda-Charvat-Tsallis entropy of type q, henceforth simply called the Tsallis entropy, is denoted as S_q, and is expressed as 15:

$S_q = \frac{1}{1-q} \sum_{i=1}^{N} p_i^q \left( 1 - p_i^{1-q} \right), \quad q \in \mathbb{R} \setminus \{0, 1\}.$ (8)

The Tsallis measure exhibits pseudoadditivity: for independent random variables X and Y, the joint entropy of X and Y is the sum of the marginals and some function of the marginals. In ... |
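The pseudoadditivity property mentioned in the context is easy to verify numerically. A small Python sketch (toy distributions, illustrative names, not from the cited work):

```python
def tsallis_entropy(p, q):
    """Havrda-Charvat-Tsallis entropy S_q of a discrete distribution."""
    return (sum(pi ** q for pi in p) - 1.0) / (1.0 - q)

# Pseudoadditivity: for independent X and Y,
#   S_q(X, Y) = S_q(X) + S_q(Y) + (1 - q) * S_q(X) * S_q(Y).
px, py, q = [0.5, 0.5], [0.25, 0.75], 2.0
joint = [a * b for a in px for b in py]  # independent joint distribution
lhs = tsallis_entropy(joint, q)
sx, sy = tsallis_entropy(px, q), tsallis_entropy(py, q)
rhs = sx + sy + (1 - q) * sx * sy
print(lhs, rhs)  # equal
```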

437 | XM2VTSDB: The extended M2VTS database
- Messer, Matas, et al.
- 1999
Citation Context: ...ng a log probability. Thus, combining the classifier output from the audio and visual modalities is a simple matter of adding the class scores. 4. EXPERIMENTS 4.1. XM2VTS Database The XM2VTS database [16] is a large multimodal database created for automatic person recognition. In total, the database is composed of audio-only speech recordings, audio-visual speech recordings, and frontal and profile vi... |

333 | Probability Theory - Renyi - 1970 |

291 | Computational vision and regularization theory
- Poggio, Torre, et al.
- 1985
Citation Context: ...RF). The reason is twofold. First, extraction of lip features recovers the true image from the noisy observed image. It is, therefore, an inverse problem with many possible solutions and is ill posed [6]. This problem can be solved by the use of regularization methods employed in the MRF framework. Second, the MRF formulation allows us to embed many features of interest by simply adding appropriate t... |

272 | Design and construction of a realistic digital brain phantom - Collins, Zijdenbos, et al. - 1998 |

269 | Speaker Recognition: A Tutorial
- Campbell
- 1997
Citation Context: ...gure 4: Lip segmentation. Figure 5: Lip feature extraction. 3. CLASSIFICATION 3.1. Polynomial Classifier Many classification methods are currently being applied to the problem of speaker verification [8]. Traditionally, statistical methods are used to model the speaker's speech data from the feature extraction phase. Two of the most popular approaches are the Hidden Markov Model (HMM) [9] and the Gau... |

142 | Quantification method of classification processes: concept of structural α-entropy - Havrda, Charvat - 1967 |

110 | Image registration by maximization of combined mutual information and gradient information - Pluim, Maintz, et al. - 2000 |

92 | Comparative evaluation of multiresolution optimization strategies for multimodality image registration by maximization of mutual information - Vandermeulen, Maes, et al. - 1999 |

78 | The use of cohort normalized scores for speaker verification
- Rosenberg, DeLong, et al.
- 1992
Citation Context: ...erification [8]. Traditionally, statistical methods are used to model the speaker's speech data from the feature extraction phase. Two of the most popular approaches are the Hidden Markov Model (HMM) [9] and the Gaussian Mixture Model (GMM) [10]. More recently, discriminative classification techniques employing artificial neural networks, such as neural tree networks (NTN) [11], have been applied to ... |

77 | Measures of Information and their Applications. - Kapur - 1994 |

67 | Rigid registration of 3D ultrasound with MR images : a new approach combining intensity and gradient information
- Roche, Pennec, et al.
- 2001
Citation Context: ...een P(X, Y) and P(X)P(Y), and this value is known as mutual information, denoted as I(X, Y):

$D\big(P(X, Y) \,\|\, P(X)P(Y)\big) = I(X, Y) = \sum_{x,y} p(x, y) \ln \frac{p(x, y)}{p(x)\,p(y)} = H(X) + H(Y) - H(X, Y).$ (3)

From (3), it is seen that mutual information indicates the degree to which X and Y are independent, with I(X, Y) = 0 denoting independence. In the context of biomedical image registrati... |
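The identity I(X, Y) = H(X) + H(Y) − H(X, Y), with I = 0 under independence, can be checked on a toy joint distribution. A minimal Python sketch (illustrative names, not from the cited paper):

```python
import math

def entropy(p):
    """Shannon entropy (natural log) of a flat list of probabilities."""
    return -sum(v * math.log(v) for v in p if v > 0)

def mutual_information(joint):
    """I(X, Y) from a joint distribution given as a 2-D list of lists."""
    px = [sum(row) for row in joint]             # marginal of X
    py = [sum(col) for col in zip(*joint)]       # marginal of Y
    return entropy(px) + entropy(py) - entropy([v for row in joint for v in row])

indep = [[0.25, 0.25], [0.25, 0.25]]  # independent -> I(X, Y) = 0
corr = [[0.5, 0.0], [0.0, 0.5]]       # maximally correlated -> I(X, Y) = H(X)
print(mutual_information(indep))  # ~0
print(mutual_information(corr))   # ~ln 2 ~ 0.693
```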

57 | Comparing models for audiovisual fusion in a noisy-vowel recognition task
- Teissier, Robert-Ribes, et al.
- 1999
Citation Context: ...uth geometry (e.g., height and width of the mouth opening). A model-based approach uses parameterized models of the speech articulators. The various methods of combining the modalities are as follows [2]. With the direct identification model, the classifier uses the multimodal data directly. With separate identification, or late integration, there is a separate classifier for each modality. The resul... |

48 | Speaker recognition using neural networks and conventional classifiers
- Farrell, Mammone, et al.
- 1994
Citation Context: ...dden Markov Model (HMM) [9] and the Gaussian Mixture Model (GMM) [10]. More recently, discriminative classification techniques employing artificial neural networks, such as neural tree networks (NTN) [11], have been applied to the problem. In order to provide the best performance for speaker verification systems, the latter methods include out-of-class data in the training phase. This technique produc... |

45 | Video Demystified: A Handbook for the Digital Engineer, Second Edition, HighText Interactive, Inc.
- Jack
- 1996
Citation Context: ...r itself. To separate the chromatic and luminance components, various transformed color spaces can be employed, such as the normalized RGB space (we denote it as rgb in the following), YCbCr, and HSV [3]. To analyze the statistics of each color model, we build histograms of color components. We construct histograms for the entire image and for the extracted lip region bounded within the estimated bou... |

41 | Automatic speaker recognition using Gaussian mixture speaker models
- Reynolds
- 1995
Citation Context: ...al methods are used to model the speaker's speech data from the feature extraction phase. Two of the most popular approaches are the Hidden Markov Model (HMM) [9] and the Gaussian Mixture Model (GMM) [10]. More recently, discriminative classification techniques employing artificial neural networks, such as neural tree networks (NTN) [11], have been applied to the problem. In order to provide the best ... |

38 | Visual Speech and Speaker Recognition
- Luettin
- 1997
Citation Context: ...lips in the video sequence and then perform the feature extraction. Subsequently, for a multimodal system, the two domains must be integrated, or fused. There are several methods for lip localization [1]. Deformable templates use geometric shapes that are allowed to deform and move in order to minimize an energy function. Template matching traditionally employs correlation to locate facial features. ... |

34 | Pattern Classification
- Schurmann
- 1996
Citation Context: ...data in the training phase. This technique produces robust speaker models by maximizing the separation between classes. Polynomial classifiers have been used for pattern classification for many years [12][13], and have excellent properties as classifiers. Because of the Weierstrass approximation theorem, polynomials are universal approximators for the Bayes classifier [12]. The basic structure of our ... |

32 | Interpolation artefacts in mutual information-based image registration - Pluim, Maintz, et al. |

25 | Towards multidimensional radiotherapy (MD-CRT): biological imaging and biological conformality. Int J Radiat Oncol Biol Phys
- CC, Humm, et al.
Citation Context: ...crease of uncertainty of a random variable A by the knowledge of another variable B. The entropies are calculated as:

$H(A) = -\sum_a p_A(a) \log_2 p_A(a)$ (2)

$H(A, B) = -\sum_{a,b} p_{AB}(a, b) \log_2 p_{AB}(a, b)$ (3)

Working with images, the probability density function is estimated by histogram, usually very easy to obtain. The optimum geometrical transformation T that registers two images will maxi... |

22 | Polynomial classifier techniques for speaker verification (ICASSP)
- Campbell, Assaleh
- 1999
Citation Context: ...polynomial expansion, w is of length 455, resulting in only 909 flops per transaction, and a model size of 1820 bytes for a floating point representation. An efficient method for training is given in [14]. 3.2. Multimodal Fusion A late integration approach is used to fuse the audio and visual modalities. It is necessary that the classifier outputs represent class probabilities. As demonstrated in [15]... |

18 | Robust and fast medical registration of 3d-multimodality data sets
- Capek, Mroz, et al.
Citation Context: ...For two discrete random variables X and Y with joint distribution p ≡ p(x, y) = p_ij, i = 1, 2, …, N, j = 1, 2, …, M, the joint entropy of X and Y is given as:

$H_S(X, Y) = -\sum_{i=1}^{N} \sum_{j=1}^{M} p_{ij} \ln p_{ij}.$ (2)

An important property of H_S is additivity: If X and Y are independent random variables, then p(x, y) = p(x)p(y), and H_S(X, Y) = H_S(X) + H_S(Y). If X and Y are maximally correlated, then H... |
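The additivity property noted in the context, H_S(X, Y) = H_S(X) + H_S(Y) for independent variables, can be demonstrated numerically. An illustrative Python sketch (toy distributions, names are assumptions):

```python
import math

def shannon_entropy(p):
    """H_S with natural log, for a flat list of probabilities (eq. 2)."""
    return -sum(v * math.log(v) for v in p if v > 0)

px = [0.2, 0.8]
py = [0.5, 0.3, 0.2]
joint = [a * b for a in px for b in py]  # p(x, y) = p(x) p(y) under independence
# Additivity: H_S(X, Y) = H_S(X) + H_S(Y) for independent X, Y.
print(shannon_entropy(joint), shannon_entropy(px) + shannon_entropy(py))  # equal
```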

17 | f-information measures in medical image registration
- Pluim, Maintz, et al.
- 2001
Citation Context: ...6 Therefore, the generalized mutual information of X and Y based on Tsallis entropy, denoted as I_q^S(X, Y), is given as 16:

$I_q^S(X, Y) = S_q(X) + S_q(Y) + (1-q)\,S_q(X)\,S_q(Y) - S_q(X, Y).$ (9)

A normalized measure based on (9) can also be derived. It has been shown that the upper bound on I_q^S(X, Y) is given by 16:

$I_q^{S,\max,\Phi}(X, Y) = \Phi + (1-q)\,S_q(X)\,S_q(Y), \quad \Phi \in \{S_q(X), S_q(Y)\}$... |
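Like ordinary mutual information, the generalized measure in eq. (9) vanishes for statistically independent variables. A small Python sketch on toy joint distributions (illustrative names, not from the cited paper):

```python
def tsallis_entropy(p, q):
    """S_q for a flat list of probabilities."""
    return (sum(v ** q for v in p) - 1.0) / (1.0 - q)

def tsallis_mi(joint, q):
    """Generalized mutual information I_q^S (eq. 9) from a 2-D joint distribution."""
    px = [sum(row) for row in joint]
    py = [sum(col) for col in zip(*joint)]
    flat = [v for row in joint for v in row]
    sx, sy = tsallis_entropy(px, q), tsallis_entropy(py, q)
    return sx + sy + (1 - q) * sx * sy - tsallis_entropy(flat, q)

q = 2.0
indep = [[0.25, 0.25], [0.25, 0.25]]  # independent -> I_q^S = 0 (pseudoadditivity)
corr = [[0.5, 0.0], [0.0, 0.5]]       # maximally correlated -> I_q^S > 0
print(tsallis_mi(indep, q), tsallis_mi(corr, q))
```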

16 | Information-theoretical considerations on estimation problems - Arimoto - 1971 |

15 | An information divergence measure for ISAR image registration
- Hamza, He, et al.
- 2001
Citation Context: ...

$I_q^{S,\max,\Phi}(X, Y) = \Phi + (1-q)\,S_q(X)\,S_q(Y), \quad \Phi \in \{S_q(X), S_q(Y)\}.$ (10)

Thus, normalized mutual information based on nonadditive Tsallis entropy, Ĩ_q^S, is expressed as:

$\tilde{I}_q^S(X, Y) = \frac{I_q^S(X, Y)}{I_q^{S,\max,\Phi}(X, Y)} \in [0, 1].$ (11)

The properties of these metrics can be predicted to some extent by analyzing their curves as functions of relative independence. In Figure 1, metrics were computed from 8-bit joint dis... |

14 | Lip feature extraction towards an automatic speechreading system
- Zhang, Mersereau
- 2000
Citation Context: ...lip region bounded within the estimated boundary. From experiments on various video sequences taken under different test conditions and for different test subjects we have the following observations [4]: i) Color components (r,g,b), (Cb,Cr) and (H) exhibit peaks in their histograms. This indicates that the feature distribution of the lip region is narrow and implies that the color for the lip region... |

14 | Information gain within nonextensive thermostatistics - Borland, Plastino, et al. - 1998 |

13 | Non-Rayleigh Statistics of Ultrasonic Backscattered Signals - Narayanan, Shanker - 1994 |

12 | Registration Methodology: Concepts and Algorithms (chapter 3)
- Hill, Batchelor
- 2001
Citation Context: ...nnon-Boltzmann-Gibbs (henceforth, Shannon) definition of entropy, denoted as H_S. Let X denote a discrete random variable with distribution p ≡ p(x) = (p_1, p_2, …, p_N)^T. Then

$H_S(X) = -\sum_{i=1}^{N} p_i \ln p_i.$ (1)

* Further author information: (Send correspondence to Mark Wachowiak) Email: mwach@imaging.robarts.ca. Telephone: +1 (519) 663-5777. Address: Imaging Laboratories, Robarts Research Institute, London,... |

11 | Image registration with minimal spanning tree algorithm
- Ma, Hero, et al.
- 2000
Citation Context: ... 16:

$I_q^{S,\max,\Phi}(X, Y) = \Phi + (1-q)\,S_q(X)\,S_q(Y), \quad \Phi \in \{S_q(X), S_q(Y)\}.$ (10)

Thus, normalized mutual information based on nonadditive Tsallis entropy, Ĩ_q^S, is expressed as:

$\tilde{I}_q^S(X, Y) = \frac{I_q^S(X, Y)}{I_q^{S,\max,\Phi}(X, Y)} \in [0, 1].$ (11)

The properties of these metrics can be predicted to some extent by analyzing their curves as functions of relative independence. In Figure 1,... |

10 | The role of image registration - Toga, Thompson - 2001 |

9 | Feature coincidence trees for registration of ultrasound breast images
- Neemuchwala, Hero, et al.
- 2001
Citation Context: ...some extent in terms of properties of the Tsallis entropy. Recall that information based on the Tsallis entropy is given as 16:

$I_q^S(X, Y) = S_q(X) + S_q(Y) + (1-q)\,S_q(X)\,S_q(Y) - S_q(X, Y).$ (12)

If X and Y are statistically independent, then $S_q(X, Y) = S_q(X) + S_q(Y) + (1-q)\,S_q(X)\,S_q(Y)$. The ratio of the Tsallis information to the upper bound, Ĩ_q^S, 16 measure... |

9 | Mapping histology to metabolism: Coregistration of stained whole-brain sections to premortem PET in Alzheimer’s disease - Mega, Chen, et al. - 1997 |

9 | Improving treatment planning accuracy through multimodality imaging. International journal radiation oncology - Sailer - 1996 |

7 | A Confidence-Based Approach to the Labeling Problem
- Chou, Brown, et al.
- 1987
Citation Context: ...rion is used to formulate what the best labeling should be.

$p(x \mid y) \propto p(y \mid x)\,p(x)$

$p(y \mid x) \propto \exp\left( -\sum_i \frac{(y_i - \mu_{x_i})^2}{2\sigma_{x_i}^2} \right)$

$p(x) \propto \exp\left( -\frac{1}{T} \sum_{c \in C} V_c(x) \right)$

The HCF algorithm [7] allows one to reduce the estimation problem to the minimization of an energy function:

$U(y \mid x) = \lambda \sum_i \frac{(y_i - \mu_{x_i})^2}{2\sigma_{x_i}^2} + \sum_{c \in C} V_c(x)$

Figure 4 illustrates the results of segmentation. The geomet... |
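For a single pixel, the MAP labeling described in the context reduces to picking the label that minimizes the data term of the energy (the clique potentials V_c couple neighboring sites and are omitted here). A hypothetical Python sketch with made-up class statistics:

```python
def unary_energy(y_i, mu, sigma):
    """Data term of U(y|x): squared deviation of the observation from the class mean."""
    return (y_i - mu) ** 2 / (2 * sigma ** 2)

# Hypothetical two-class labeling of one observed value: choose the label
# minimizing energy, i.e., maximizing p(x|y) ∝ p(y|x) p(x) under a uniform prior.
obs = 0.8
classes = {"lip": (0.9, 0.1), "skin": (0.3, 0.2)}  # (mean, std) per label, made up
label = min(classes, key=lambda c: unary_energy(obs, *classes[c]))
print(label)  # 'lip'
```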

5 | MRI simulation-based evaluation of image processing and classification methods - Kwan, Evans, et al. - 1999 |

4 | A computationally scalable speaker recognition system
- Campbell, Broun
- 2000
Citation Context: ...[14]. 3.2. Multimodal Fusion A late integration approach is used to fuse the audio and visual modalities. It is necessary that the classifier outputs represent class probabilities. As demonstrated in [15], the polynomial classifier discriminant function can be expressed as

$d'(\mathbf{x}) = \prod_{k=1}^{M} \frac{p(\omega_j \mid x_k)}{p(\omega_j)},$

where ω_j is class j. Two simplifications are performed. First, we consider the lo... |
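The fusion rule the context describes, adding per-class log-probability scores from the two modalities, can be sketched as follows (hypothetical scores, illustrative names):

```python
import math

def fuse_scores(audio_log_scores, visual_log_scores):
    """Late integration: with log-probability class scores, fusion is addition
    (a product of per-modality scores in the probability domain)."""
    return [a + v for a, v in zip(audio_log_scores, visual_log_scores)]

# Hypothetical per-class log scores from the audio and visual classifiers.
audio = [math.log(0.7), math.log(0.3)]
visual = [math.log(0.6), math.log(0.4)]
fused = fuse_scores(audio, visual)
best = max(range(len(fused)), key=fused.__getitem__)
print(best)  # class 0 wins
```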

4 | The R-norm Information Measure - Boekee, Van der Lubbe - 1980 |

4 | Three-dimensional conformal therapy or standard irradiation in localized carcinoma of prostate: preliminary results of a nonrandomized comparison
- Perez, Michalski, et al.
- 2000
Citation Context: ...ase of complete independence p_A(a)·p_B(b). 8 This concept is related to the entropy of the random variables by the equation:

$I(A, B) = H(A) + H(B) - H(A, B) = H(A) - H(A \mid B) = H(B) - H(B \mid A)$ (1)

where I(A,B) represents the mutual information measurement between A and B, H(A) and H(B) are the entropies of these random variables, H(A,B) their joint entropy, and H(A|B) and H(B|A) the conditiona... |

3 | Evaluation Protocol for the XM2VTS Database (IDIAP-Com 98-05)
- Maitre
- 1998
Citation Context: ...of 16 bits. The video is captured at a color sampling resolution of 4:2:0, and it is compressed at the fixed ratio of 5:1 in the DV format. The evaluation protocol for the XM2VTS database is given in [17]. There are two preferred configurations for training the system, determining parameters, and testing the performance. Configuration I provides for good expert training, but poor fusion training. Conf... |

2 | Preliminary report of toxicity following 3D radiation therapy for prostate cancer on 3DOG/RTOG 9406
- Michalski, Winter, et al.
- 2000
Citation Context: ...e decrease of uncertainty of a random variable A by the knowledge of another variable B. The entropies are calculated as:

$H(A) = -\sum_a p_A(a) \log_2 p_A(a)$ (2)

$H(A, B) = -\sum_{a,b} p_{AB}(a, b) \log_2 p_{AB}(a, b)$ (3)

Working with images, the probability density function is estimated by histogram, usually very easy to obtain. The optimum geometrical transformation T that registers two images will ... |

2 | Application of magnetic resonance imaging and threedimensional treatment planning in the treatment of orbital lymphoma - Rudoltz, Ayyangar, et al. - 1993 |

2 | A Comparison of clinical target volumes determined by CT and MRI for the radiotherapy planning of base of skull meningiomas - Khoo, Adams, et al. - 2000 |

2 | Short communication: CT-MRI image fusion for 3D conformal prostate radiotherapy: use in patients with altered pelvic anatomy - Lau, Kagawa, et al. - 1996 |

2 | Possibilities of an open magnetic resonance scanner integration in therapy simulation and three-dimensional radiotherapy planning - Schubert, Wenz, et al. - 1999 |

1 | Trigonometric entropies, Jensen difference divergence measures, and error bounds - Sant’anna, Taneja - 1985 |

1 | Elmaghraby, “Generalized mutual information as a similarity metric for multimodal biomedical image registration - Wachowiak, Smolíková, et al. - 2002 |