### Table 2

"... 4.3 Video Information Fidelity. We recently proposed a model that describes the statistics of natural video sequences, towards the development of an information-theoretic quality metric for video signals [28]. Translational motion of local image patches was combined ..."

### Table 4

"... Thus, in classical information-theoretic terms, K can be ..."

2000

Cited by 8

### Table 1. Summary of three families of information-theoretic measures of diversity.

### Table 1. Information-theoretic analysis of grouping cues. (a) shows the results for individual features; (b) shows the results when pairs of intra- and inter-region cues are combined. The first column gives the amount of information each feature carries about the class label. The second column gives the residual information each feature retains when conditioned on the model output. The marginal entropy of the class label is 1.0 bits.

2003

"... In PAGE 4: ... The distributions are normalized and the marginal entropy of Y is 1.0 bits. The first column of Table 1(a) shows the results for individual features. We also combine each pair of inter- and intra-features together to evaluate the overall power of contour, texture, and brightness cues. ... In PAGE 4: ... We also combine each pair of inter- and intra-features together to evaluate the overall power of contour, texture, and brightness cues. These results are listed in the first column of Table 1(b). From this analysis of mutual information we find that the presence of boundary contours is the most informative grouping cue. ... In PAGE 5: ... The residual information is measured by the mutual information of Y and X conditioned on Z. The results are listed in the second columns of Table 1. We observe that there is little residual information left in the features, which indicates that the linear classifier fits the data well. ... ..."

Cited by 35
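The snippet above measures how informative each grouping cue is via the mutual information I(Y; X) between cue and class label, and the residual information I(Y; X | Z) left after conditioning on the classifier output. A minimal sketch of those two estimates from discrete samples (toy data and function names are my own, not from the paper):

```python
# Illustrative only: empirical mutual information and conditional mutual
# information for discrete (already discretized) variables, in bits.
from collections import Counter
from math import log2

def entropy(samples):
    """Shannon entropy (bits) of the empirical distribution of `samples`."""
    n = len(samples)
    return -sum(c / n * log2(c / n) for c in Counter(samples).values())

def mutual_information(xs, ys):
    """I(X; Y) = H(X) + H(Y) - H(X, Y), estimated from paired samples."""
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

def conditional_mi(xs, ys, zs):
    """I(X; Y | Z) = H(X, Z) + H(Y, Z) - H(X, Y, Z) - H(Z)."""
    return (entropy(list(zip(xs, zs))) + entropy(list(zip(ys, zs)))
            - entropy(list(zip(xs, ys, zs))) - entropy(zs))

# Hypothetical samples: the cue X perfectly predicts the label Y, and the
# model output Z reproduces X, so no residual information remains.
X = [0, 0, 1, 1]
Y = [0, 0, 1, 1]
Z = [0, 0, 1, 1]
print(mutual_information(X, Y))  # 1.0 bit: fully informative cue
print(conditional_mi(X, Y, Z))   # 0.0 bits: no residual information
```

With a balanced binary label the marginal entropy H(Y) is 1.0 bit, matching the normalization described in the snippet.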

### Table 1: This table summarizes the known results about learning monomials and k-DNF formulas under various models of noise. The upper bounds correspond to polynomial-time algorithms that can tolerate the given noise rate, and the lower bounds correspond to information-theoretic proofs that no algorithm can tolerate the given noise rate.

1995

"... In PAGE 6: ... In this case he shows that a large amount of noise can be handled where the irrelevant attributes are affected by arbitrary adversarial noise and the relevant attributes are affected by random noise independently of one another. In Table 1 we summarize the results (from previous work and this paper) about PAC learning monomials and k-DNF formulas from the noise oracles discussed in Section 3. ... ..."

Cited by 30

### Table 2: Information-theoretic measures (normalised)

### Table 6.1: A possible classification of emergent intelligence, using an information-theoretic

2002

### Table 3.6: Features eliminated by information-theoretic feature selection.

### Table 3. Shannon and Davio expansions and their information measures. Columns: Type; Rule of expansion; Information-theoretic measures.

2000

"... In PAGE 2: ... Given a node assigned the pair (x, ω), a function f is characterized by the entropy H(f), and the resulting successors are distinguished by the conditional entropy H_ω(f|x). A step in the decomposition of the function f with respect to variable x and expansion type ω is described in information-theoretic terms as follows: I_ω(f, x) = H(f) − H_ω(f|x), (2). Table 3 shows the information-theoretic measures of the S, pD and nD expansions for a switching function f with respect to variable x. Example 1. ... ..."

Cited by 1
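The measure I(f, x) = H(f) − H_ω(f|x) in the snippet above can be made concrete for the Shannon (S) expansion, where H(f|x) is the cofactor-weighted entropy of f restricted to x = 0 and x = 1. A small sketch under that reading (the truth-table encoding and function names are my own, not the paper's notation):

```python
# Illustrative only: entropy-based information measure of a Shannon expansion
# step, I(f, x) = H(f) - H(f|x), for a switching function given as a truth table.
from collections import Counter
from math import log2

def entropy(values):
    """Shannon entropy (bits) of the empirical distribution of `values`."""
    n = len(values)
    return -sum(c / n * log2(c / n) for c in Counter(values).values())

def shannon_info_measure(truth_table, var):
    """I(f, x) = H(f) - H(f|x) for variable index `var`.

    `truth_table` maps input tuples of 0/1 to function values; H(f|x) is the
    average entropy of the two cofactors f|x=0 and f|x=1.
    """
    outputs = list(truth_table.values())
    h_f = entropy(outputs)
    h_cond = 0.0
    for v in (0, 1):
        cofactor = [f for inp, f in truth_table.items() if inp[var] == v]
        h_cond += len(cofactor) / len(outputs) * entropy(cofactor)
    return h_f - h_cond

# f(x1, x2) = x1 AND x2; expanding on x1 resolves about 0.311 of its 0.811 bits.
f = {(0, 0): 0, (0, 1): 0, (1, 0): 0, (1, 1): 1}
print(round(shannon_info_measure(f, 0), 3))  # → 0.311
```

A variable with a larger I(f, x) removes more uncertainty about f, which is why such measures are used to order variables during decomposition.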

### Table 4. Analogues of Shannon and Davio expansions in GF(4) and their information measures. Columns: Type; Rule of expansion; Information-theoretic measures.

2000

"... In PAGE 2: ... 3.1 Information-theoretic notations of the 4-S, 4-pD and 4-nD expansions. We propose the information measures of the expansion types in GF(4), grouped in Table 4, where J_i(x), i = 0, ..., k − 1, are the characteristic functions: J_i(x) = 1 if x = i and J_i(x) = 0 otherwise. It was shown in [10] that the 4-S expansion of a 4-valued logic function is equivalent to the S expansion of the function. ... ..."

Cited by 1
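The characteristic functions J_i(x) in the snippet above are the 4-valued analogue of a literal: for each value of x, exactly one J_i fires. A minimal sketch (function name is mine, not the paper's):

```python
# Illustrative only: characteristic functions J_i(x) = 1 if x == i else 0
# for a k-valued variable, with k = 4 as in GF(4).
def characteristic(i, x):
    """J_i(x): 1 when the multi-valued variable x equals i, else 0."""
    return 1 if x == i else 0

# For every x in {0, 1, 2, 3}, exactly one J_i(x) is 1, so the four
# characteristic functions partition the domain of a 4-valued variable.
for x in range(4):
    row = [characteristic(i, x) for i in range(4)]
    assert sum(row) == 1
    print(x, row)
```

This partition property is what lets a GF(4) expansion select one cofactor per variable value, in direct analogy with the two cofactors of a binary Shannon expansion.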