Results

### Low-Complexity Approaches to . . . Dissemination

2006

Abstract

In this thesis we consider practical ways of disseminating information from multiple senders to multiple receivers in an optimal or provably close-to-optimal fashion. The basis for our discussion of optimal transmission of information is mostly information-theoretic, but the methods that we apply to do so in a low-complexity fashion draw from a number of different engineering disciplines. The three canonical multiple-input, multiple-output problems we focus our attention upon are:

- The Slepian-Wolf problem, where multiple correlated sources must be distributedly compressed and recovered with a common receiver.
- The discrete memoryless multiple access problem, where multiple senders communicate across a common channel to a single receiver.
- The deterministic broadcast channel problem, where multiple messages are sent from a common sender to multiple receivers through a deterministic medium.

Chapter 1 serves as an introduction and provides models, definitions, and a discussion of barriers between theory and practice for the three canonical data dissemination . . .
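For the first of these problems, the classical Slepian-Wolf result characterizes the achievable rate region for two correlated sources in terms of conditional and joint entropies: R1 ≥ H(X|Y), R2 ≥ H(Y|X), and R1 + R2 ≥ H(X,Y). The sketch below, using a hypothetical joint distribution chosen only for illustration (not taken from the thesis), shows how membership in that region can be checked numerically:

```python
import math
from collections import Counter

def entropy(probs):
    """Shannon entropy in bits of a probability mass function."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical joint PMF over two correlated binary sources (X, Y),
# chosen for illustration: X and Y agree with probability 0.8.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

def marginal(joint, axis):
    """Marginalize the joint PMF onto coordinate `axis` (0 for X, 1 for Y)."""
    m = Counter()
    for xy, p in joint.items():
        m[xy[axis]] += p
    return m

H_XY = entropy(joint.values())
H_X = entropy(marginal(joint, 0).values())
H_Y = entropy(marginal(joint, 1).values())
H_X_given_Y = H_XY - H_Y   # chain rule: H(X|Y) = H(X,Y) - H(Y)
H_Y_given_X = H_XY - H_X

def achievable(r1, r2):
    """Check whether the rate pair (r1, r2) lies in the Slepian-Wolf region."""
    return (r1 >= H_X_given_Y and
            r2 >= H_Y_given_X and
            r1 + r2 >= H_XY)
```

For this distribution H(X,Y) ≈ 1.72 bits, so the rate pair (1, 1) is achievable even though each encoder compresses its source without seeing the other, while (0.5, 0.5) falls outside the region.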

### Text Classification with Kernels on the Multinomial Manifold

Abstract

Text Classification with Kernels on the Multinomial Manifold. Support Vector Machines (SVMs) have been very successful in text classification. However, the intrinsic geometric structure of text data has been ignored by standard kernels commonly used in SVMs. It is natural to assume that the documents are on the multinomial manifold, which is the simplex of multinomial models furnished with the Riemannian structure induced by the Fisher information metric. We prove that the Negative Geodesic Distance (NGD) on the multinomial manifold is conditionally positive definite (cpd), thus can be used as a kernel in SVMs. Experiments show the NGD kernel on the multinomial manifold to be effective for text classification, significantly outperforming standard kernels on the ambient Euclidean space.
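The geodesic distance between two multinomial distributions under the Fisher information metric has the closed form d(θ, θ') = 2 arccos(Σᵢ √(θᵢ θ'ᵢ)), and the NGD kernel is simply its negation. A minimal sketch, assuming plain term-frequency normalization as the embedding onto the simplex (the paper's exact preprocessing may differ):

```python
import numpy as np

def to_multinomial(counts):
    """Map a term-count vector to a point on the multinomial simplex."""
    counts = np.asarray(counts, dtype=float)
    return counts / counts.sum()

def ngd_kernel(theta1, theta2):
    """Negative Geodesic Distance kernel on the multinomial manifold.

    Under the Fisher information metric, the geodesic distance is
    d(theta, theta') = 2 * arccos(sum_i sqrt(theta_i * theta'_i)),
    i.e. twice the arccosine of the Bhattacharyya coefficient.
    """
    bc = np.sqrt(theta1 * theta2).sum()   # Bhattacharyya coefficient in [0, 1]
    bc = np.clip(bc, 0.0, 1.0)            # guard against floating-point round-off
    return -2.0 * np.arccos(bc)

# Identical distributions are at zero geodesic distance, so the kernel is 0;
# distributions with disjoint support are maximally far apart (distance pi).
p = to_multinomial([3, 1, 0, 2])
```

Because the NGD kernel is only conditionally positive definite rather than positive definite, it works with SVMs (whose optimization is invariant to the cpd/pd distinction) but not with every kernel method.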