### Table 1: This table summarizes the known results on learning monomials and k-DNF formulas under various models of noise. The upper bounds correspond to polynomial-time algorithms that can tolerate the given noise rate; the lower bounds correspond to information-theoretic proofs that no algorithm can tolerate it.

1995

"... In PAGE 6: ... In this case he shows that a large amount of noise can be handled when the irrelevant attributes are affected by arbitrary adversarial noise and the relevant attributes are affected by random noise independently of one another. In Table 1 we summarize the results (from previous work and this paper) about PAC learning monomials and k-DNF formulas from the noise oracles discussed in Section 3.... ..."

Cited by 30

### Table 4). Thus, in classical information-theoretic terms, K can be

2000

Cited by 8

### Table 1. Summary of three families of information-theoretic measures of diversity.

### Table 2: Information theoretical measures (normalised)

### Table 6.1: A possible classification of emergent intelligence, using an information-theoretic

2002

### Table 3.6: Features eliminated by information-theoretic feature selection.

### Table 4. Fractions of n stored in Suffix Lists and Hashing with Open Addressing.

2001

"... In PAGE 8: ... If we properly bias the integral limits we can be sure to compute a lower bound $\log\binom{n}{r} \geq \int_{n-r+1}^{n} \log(x)\,dx - \int_{2}^{r+1} \log(x)\,dx$. Maximizing $r$ with respect to this equation yields an information-theoretic upper bound. Table 4 compares suffix lists with hashing and open addressing. The constants for suffix lists are chosen so that $2c_1 + c_2 \leq 1/10$, which means that if $r$ elements can be treated, we set aside $r/10$ bits to speed up internal computations.... ..."
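The integral bound quoted in this excerpt can be checked numerically. A minimal sketch, assuming natural logarithms; the function names and the test values are illustrative, not from the paper:

```python
import math

def log_binom_lower_bound(n, r):
    """Lower bound on ln C(n, r) via integrals of ln x.

    Since ln C(n, r) = sum_{k=n-r+1}^{n} ln k - sum_{k=2}^{r} ln k
    and ln is increasing, bounding the first sum below and the second
    above by integrals gives
        ln C(n, r) >= int_{n-r+1}^{n} ln x dx - int_{2}^{r+1} ln x dx,
    evaluated with the antiderivative F(x) = x*ln(x) - x.
    """
    F = lambda x: x * math.log(x) - x
    return (F(n) - F(n - r + 1)) - (F(r + 1) - F(2))

def log_binom_exact(n, r):
    # Exact ln C(n, r) via the log-gamma function, for comparison.
    return math.lgamma(n + 1) - math.lgamma(r + 1) - math.lgamma(n - r + 1)

n, r = 1000, 100
assert log_binom_lower_bound(n, r) <= log_binom_exact(n, r)
```

The bound is a valid lower bound for any admissible pair (n, r), and for n = 1000, r = 100 it is within a few nats of the exact value.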

Cited by 3

### Table 3. Shannon and Davio expansions and their information measures (columns: Type, Rule of Expansion, Information-theoretic measures)

2000

"... In PAGE 2: ... Given a node assigned the pair $(x, \omega)$, a function $f$ is characterized by the entropy $H(f)$, and the resulting successors are distinguished by the conditional entropy $H_{\omega}(f|x)$. A step in the decomposition of the function $f$ with respect to variable $x$ and expansion type $\omega$ is described in terms of information theory as follows: $I_{\omega}(f, x) = H(f) - H_{\omega}(f|x)$. (2) Table 3 shows information-theoretic measures of the $S$, $pD$ and $nD$ expansions for a switching function $f$ with respect to variable $x$. Example 1.... ..."
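The measure in Eq. (2) of this excerpt is easy to reproduce for a concrete switching function. A hedged sketch for the Shannon-type split on one variable, with the 3-input majority function chosen purely as an example (it is not from the paper):

```python
import itertools
import math
from collections import Counter

def entropy(values):
    """Shannon entropy (bits) of the empirical distribution of `values`."""
    counts = Counter(values)
    total = len(values)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Example switching function: 3-input majority.
def f(bits):
    return int(sum(bits) >= 2)

inputs = list(itertools.product((0, 1), repeat=3))

# H(f): entropy of f over all input assignments (uniform distribution).
H_f = entropy([f(a) for a in inputs])

# H(f|x): conditional entropy given the splitting variable x = bits[0];
# each cofactor covers half of the input space, hence the 0.5 weights.
H_f_given_x = sum(
    0.5 * entropy([f(a) for a in inputs if a[0] == v]) for v in (0, 1)
)

# Information measure of the expansion step: I(f, x) = H(f) - H(f|x).
I_fx = H_f - H_f_given_x
```

For the majority function this yields $H(f) = 1$ bit and $I(f, x) \approx 0.189$ bits; in an entropy-driven decomposition, the variable with the largest $I$ would be preferred for the split.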

Cited by 1

### Table 4. Analogues of the Shannon and Davio expansions in GF(4) and their information measures (columns: Type, Rule of Expansion, Information-theoretic measures)

2000

"... In PAGE 2: ... 3.1 Information-theoretic notations of the $4\text{-}S$, $4\text{-}pD$ and $q\text{-}4\text{-}nD$ expansions. We propose the information measures of the expansion types in GF(4), grouped in Table 4, where $J_i(x)$, $i = 0, \ldots, k-1$, are the characteristic functions; $J_i(x) = 1$ if $x = i$ and $J_i(x) = 0$ otherwise. It was shown in [10] that the $4\text{-}S$ expansion of a 4-valued logic function is equivalent to the $S$ expansion of the function.... ..."

Cited by 1