### Table 1: Properties of movies in chosen target set. Ratings is the total number of ratings (popularity), mean is the average rating (likability), and entropy is the standard information-theoretic entropy of the ratings distribution. Recall that ratings are provided on a 5-point scale.

"... In PAGE 5: ... In terms of ratings properties, this selection of items represents a wide range of popularity (number of ratings), entropy (a measure of the variance of ratings), and likability (mean rating). Table 1 displays the properties of items in the target set. ..."
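The entropy described in this caption can be computed directly from a list of raw ratings. A minimal sketch (the `ratings_entropy` helper and the sample lists are illustrative, not taken from the paper):

```python
from collections import Counter
from math import log2

def ratings_entropy(ratings):
    """Shannon entropy (in bits) of a ratings distribution on a 5-point scale."""
    counts = Counter(ratings)
    n = len(ratings)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# A uniform spread over all five ratings maximizes entropy at log2(5) ≈ 2.32 bits;
# unanimous ratings give entropy 0 (no uncertainty about the next rating).
print(ratings_entropy([1, 2, 3, 4, 5]))  # ≈ 2.3219
```

Higher entropy thus corresponds to more disagreement among raters, complementing the popularity (count) and likability (mean) columns.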

### Table 4). Thus, in classical information-theoretic terms, K can be

2000

Cited by 8

### Table 1. Summary of three families of information-theoretic measures of diversity.

### Table 4. Running time on a set of 25 alignments randomly selected from the catalytic site data set. The Jensen-Shannon divergence takes several orders of magnitude less time than Rate4Site and provides competitive performance. All information-theoretic methods have similar running times.

2007

"... In PAGE 5: ... However, JSD and the other information-theoretic methods have a significant advantage over R4S when considering run time. Table 4 gives (processor) running time statistics for several methods on a benchmark set of 25 randomly chosen alignments from the CSA data set. R4S took over 2.... In PAGE 7: ... Our evaluation demonstrates that methods such as JSD and RE that incorporate a background amino acid distribution are preferable to SE (Figure 1). R4S also provides similar improvement over SE, but is quite slow in comparison to the information-theoretic methods (Table 4). The speed of JSD would allow researchers to modify alignments and re-predict functional sites on the fly.... ..."
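The JSD scoring idea in this excerpt compares an alignment column's residue distribution against a background distribution. A hedged sketch over a toy 4-symbol alphabet (the distributions and the equal 1/2 mixture weights are illustrative assumptions, not the paper's exact parameterization):

```python
from math import log2

def entropy(p):
    """Shannon entropy (bits) of a probability vector."""
    return -sum(x * log2(x) for x in p if x > 0)

def jsd(p, q):
    """Jensen-Shannon divergence with equal weights: H(m) - (H(p) + H(q)) / 2."""
    m = [(a + b) / 2 for a, b in zip(p, q)]
    return entropy(m) - (entropy(p) + entropy(q)) / 2

# A conserved column (skewed toward one symbol) scores higher against a
# uniform background than a near-uniform, unconserved column does.
column = [0.7, 0.1, 0.1, 0.1]
background = [0.25, 0.25, 0.25, 0.25]
print(jsd(column, background))
```

Because the score per column reduces to a handful of logarithms, its speed advantage over phylogeny-based methods like Rate4Site is unsurprising.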

### Table 2: Information theoretical measures (normalised)

### Table 1: Table of data describing who is and who is not sunburned, with attribute values (adapted from Winston, 1992). Consider the data in Table 1. Applying simple information-theoretic heuristics, the IDT in Figure 1 can be generated. Sunburned and non-sunburned individuals fall neatly into the same class at the leaves. This IDT can be used to generate four rules in a rule-based knowledge base, one for each path in the IDT. (Note that there is no inconsistency between K1 and K2.)
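The "simple information-theoretic heuristics" mentioned here are typically the ID3-style information-gain criterion: pick the attribute whose split most reduces class entropy. A minimal sketch (the rows, attribute names, and values below are illustrative stand-ins, not Winston's actual table):

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy (bits) of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, attr):
    """Entropy reduction from splitting on one attribute (ID3-style heuristic)."""
    n = len(rows)
    split = {}
    for row, y in zip(rows, labels):
        split.setdefault(row[attr], []).append(y)
    return entropy(labels) - sum(len(ys) / n * entropy(ys) for ys in split.values())

# Toy rows loosely modeled on the sunburn example (values are illustrative).
rows = [{"hair": "blonde", "lotion": "no"},
        {"hair": "blonde", "lotion": "yes"},
        {"hair": "red", "lotion": "no"},
        {"hair": "brown", "lotion": "no"}]
labels = ["burned", "none", "burned", "none"]
print(information_gain(rows, labels, "hair"))
print(information_gain(rows, labels, "lotion"))
```

Recursively splitting on the highest-gain attribute yields a tree whose leaves are pure classes, and each root-to-leaf path reads off as one rule.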

"... In PAGE 6: ...EDAGs) [22]. EDAGs can be interpreted as a set of rules with exceptions. The idea is that the root of the exception structure contains a default conclusion unless one of the children of the root contains a 'concept' which leads to a conclusion which overrides the default conclusion. For instance, consider the four rules extracted from Table 1, i.e.... ..."

### Table 6.1: A possible classification of emergent intelligence, using an information-theoretic

2002

### Table 3.6: Features eliminated by information-theoretic feature selection.

### Table 3. Shannon and Davio expansions and their information measures. Columns: Type, Rule of Expansion, Information-theoretic measures.

2000

"... In PAGE 2: ... Given a node assigned by the pair (x, ω), a function f is characterized by the entropy H(f), and the resulting successors are distinguished by the conditional entropy H_ω(f|x). A step in decomposition of the function f with respect to variable x and expansion type ω is described in terms of information theory as follows: I_ω(f, x) = H(f) − H_ω(f|x). (2) Table 3 shows the information-theoretic measures of the Shannon, positive Davio, and negative Davio expansions for a switching function f with respect to variable x. Example 1.... ..."
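The information measure in this excerpt, I(f, x) = H(f) − H(f|x) for a given expansion type, can be illustrated for the Shannon expansion, where the conditional entropy averages the entropies of the two cofactors f|x=0 and f|x=1. A toy sketch (the truth table is invented for illustration; the Davio expansions use different successor functions and are not shown):

```python
from collections import Counter
from math import log2

def entropy(values):
    """Shannon entropy (bits) of a list of observed output values."""
    n = len(values)
    return -sum((c / n) * log2(c / n) for c in Counter(values).values())

def expansion_info(truth_table, var):
    """I(f, x) = H(f) - H(f|x) for the Shannon expansion of f on input `var`.
    truth_table maps input bit-tuples to 0/1 outputs."""
    outputs = list(truth_table.values())
    # Conditional entropy: weighted average over the cofactors f|x=0 and f|x=1.
    cond = 0.0
    for bit in (0, 1):
        cofactor = [y for ins, y in truth_table.items() if ins[var] == bit]
        cond += len(cofactor) / len(outputs) * entropy(cofactor)
    return entropy(outputs) - cond

# f(x0, x1) = x0 AND x1: expanding on either input reduces uncertainty about f.
f = {(0, 0): 0, (0, 1): 0, (1, 0): 0, (1, 1): 1}
print(expansion_info(f, 0))
```

A decomposition procedure would evaluate this measure for each candidate variable and expansion type at a node and expand on the most informative choice.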

Cited by 1