### Table 1: Algorithm for planning in low-dimensional belief space.

in Abstract

"... In PAGE 4: ... Our conversion algorithm is a variant of the Augmented MDP, or Coastal Navigation algorithm [9], using belief features instead of entropy. Table 1 outlines the steps of this... ..."

### Table 6: Illumination results when considering a previous gender classifier. This table must be compared with Table 2. In this particular case, AR 07 is especially improved in low-dimensional spaces.

### Table 1: Unsupervised distance metric learning methods. This group of methods essentially learns a low-dimensional embedding of the original feature space, and can be categorized along two dimensions: preserving global structure vs. preserving local structure, and linear vs. nonlinear.

2007
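As a concrete instance of the "linear, global-structure-preserving" cell of the taxonomy above, PCA projects centred data onto its top principal directions. This is a minimal sketch (the data here is synthetic; the specific methods surveyed in the table are not reproduced):

```python
import numpy as np

def pca_embed(X, k):
    """Linear, global-structure-preserving embedding (PCA): project the
    centred data onto the top-k principal directions."""
    Xc = X - X.mean(axis=0)
    # Eigendecomposition of the sample covariance matrix.
    cov = Xc.T @ Xc / (len(X) - 1)
    vals, vecs = np.linalg.eigh(cov)          # ascending eigenvalues
    top = vecs[:, np.argsort(vals)[::-1][:k]]  # top-k directions
    return Xc @ top

# Synthetic 5-D data embedded into 2-D.
X = np.random.default_rng(1).normal(size=(30, 5))
Z = pca_embed(X, k=2)
```

Nonlinear, locality-preserving methods (e.g. LLE-style embeddings) replace the global covariance with neighbourhood-based reconstructions, which is the other axis of the categorization.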

### Table 2. Various methods and algorithms mentioned in Section 3.1 and their ability to effectively address the issues mentioned in the same section (Incremental Updates, Performance in Text Classification Tasks, High Dimensionality, Low Computational Cost, Concept Drift, Dynamic Feature Space).

"... In PAGE 7: ...7 complexity for training the filtering models, updating them, and providing recommendations. In Table 2, we summarize the basic characteristics of the aforementioned systems in terms of the issues discussed in this section. Table 2.... ..."

### Table 3: LOOCV error rates in the original space

in Oleg Okun

"... In PAGE 5: ... First, we performed two-class discrimination in the original, high-dimensional space of 822 genes. Error rates for three NN classifiers are shown in Table 3 when using leave-one-out cross-validation (LOOCV). These results will serve for a comparison with those obtained in the low-dimensional, gene-selection-induced space.... ..."
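The LOOCV protocol described in this snippet can be sketched as follows for a 1-NN classifier (the 822-gene data and the three specific NN classifiers from the paper are not reproduced; the toy arrays below are hypothetical):

```python
import numpy as np

def loocv_1nn_error(X, y):
    """Leave-one-out cross-validation error rate for a 1-NN classifier.

    Each sample is held out in turn and classified by its nearest
    neighbour (Euclidean distance) among the remaining samples.
    """
    n = len(y)
    errors = 0
    for i in range(n):
        # Distances from the held-out sample to all samples.
        d = np.linalg.norm(X - X[i], axis=1)
        d[i] = np.inf  # exclude the held-out sample itself
        if y[np.argmin(d)] != y[i]:
            errors += 1
    return errors / n

# Toy two-class data (hypothetical, not the gene-expression data).
X = np.array([[0.0, 0.0], [0.1, 0.2], [5.0, 5.0], [5.1, 4.9]])
y = np.array([0, 0, 1, 1])
print(loocv_1nn_error(X, y))  # → 0.0
```

LOOCV is a natural choice here because gene-expression datasets typically have very few samples, so holding out one sample at a time wastes no training data.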

### Table 1: Computation times using the optimal ordering algorithm (columns: Visualization Algorithm, Complexity, Dataset, Data Number, Dimensionality, Time).

2004

"... In PAGE 7: ... Therefore, we can do the optimal search only with low-dimensionality datasets. To get a quantitative understanding of this issue, we performed a few experiments for different visualizations, and the results obtained are presented in Table 1. We realized that even in a low-dimensional data space - around 10 dimensions - the computational overhead could be significant.... ..."

Cited by 14

### Table 2: Classification accuracies (%) of various dimensional spaces.

2007

"... In PAGE 14: ...thers. A similar observation was also made in previous studies, e.g. in [35]. Therefore, in this experiment the 1-NN classifier is used. Table 2 shows the classification accuracy results for different dimensions of feature spaces on the ORL data sets. The accuracy of GLLE is 95.... ..."

### Table 2. The proportion of square modular matrices with a low-dimensional kernel.

1999

Cited by 19

### Table 1. Average MAEs for both neighborhood dimensions (columns: high-dimensional, low-dimensional).

"... In PAGE 9: ... Figure 3 includes the Mean Absolute Errors for high (ib) and low (svd-ib) dimensions, as observed for each of the 5 data splits of the data set. These error values are then averaged, and Table 1 records the final results for both implementations. From both the preceding figure and table, we can conclude that applying Item-based Filtering on the low-rank neighborhood provides a clear improvement over the higher-dimension neighborhood.... ..."

Cited by 1
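A minimal sketch of the svd-ib idea from the snippet above: represent items in a truncated-SVD space and compute item-item similarities there instead of in the full rating space. The rating matrix, rank, and helper names below are hypothetical; the paper's actual svd-ib implementation is not shown in the excerpt:

```python
import numpy as np

def item_factors(R, k):
    """Project items into a rank-k latent space via truncated SVD.

    R is a users x items rating matrix; the result has one
    k-dimensional row per item.
    """
    U, s, Vt = np.linalg.svd(R, full_matrices=False)
    # Item representation: rows of V scaled by the top-k singular values.
    return (np.diag(s[:k]) @ Vt[:k]).T

def mae(predicted, actual):
    """Mean Absolute Error between predicted and actual ratings."""
    return np.mean(np.abs(predicted - actual))

# Hypothetical 4-user x 3-item rating matrix.
R = np.array([[5., 4., 1.],
              [4., 5., 2.],
              [1., 2., 5.],
              [2., 1., 4.]])
F = item_factors(R, k=2)  # 3 items in a 2-D latent space
# Cosine similarities between items, computed in the low-rank space.
Fn = F / np.linalg.norm(F, axis=1, keepdims=True)
sim = Fn @ Fn.T
```

Neighbourhoods formed from `sim` would then drive the usual item-based prediction step, and MAE over held-out ratings (as in the 5 data splits mentioned) compares the low-rank and full-dimensional variants.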

### Table 2. (a) Stress and (b) intrinsic dimensionality of reduced datasets

"... In PAGE 9: ... Fig. 2. Distance distribution histograms for Deviation metrics and (a) vector model, (b) FastMap. The stress, summarised in Table 2a, is quite low for both LSI and random projection; however, in the case of FastMap the deviations are not well preserved. Looking at the distance distribution histograms of the original and FastMap-reduced space in Figure 2, one can observe that the distances are highly reduced.... In PAGE 9: ... The question of whether the change affects only the dissimilarity threshold will be partly solved in the next section. In Table 2b, we can observe high intrinsic dimensions for both LSI variants and especially for random projection, whilst the intrinsic dimension for FastMap is surprisingly low. Additional tests on real data structures are required for FastMap, to verify the indexability of reduced data.... ..."
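The stress measure discussed in this snippet quantifies how well pairwise distances survive dimensionality reduction (0 means perfectly preserved). A sketch, using a random projection as the reduction; the data, dimensions, and scaling are illustrative assumptions, not the paper's setup:

```python
import numpy as np
from itertools import combinations

def stress(X, Y):
    """Stress between the pairwise distances of the original data X and
    its low-dimensional embedding Y (0 = distances perfectly preserved)."""
    d_orig = np.array([np.linalg.norm(a - b) for a, b in combinations(X, 2)])
    d_red = np.array([np.linalg.norm(a - b) for a, b in combinations(Y, 2)])
    return np.sqrt(np.sum((d_orig - d_red) ** 2) / np.sum(d_orig ** 2))

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 100))            # 100-D synthetic data
# Random projection to 20 dimensions, scaled Johnson-Lindenstrauss style
# so that expected distances are preserved.
P = rng.normal(size=(100, 20)) / np.sqrt(20)
Y = X @ P
s = stress(X, Y)
print(round(s, 3))
```

A low stress (as the snippet reports for LSI and random projection) means distance-based thresholds transfer from the original space; FastMap's shrunken distances would instead show up as a large stress.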