### Table 5. Robustness of algorithms

"... In PAGE 11: ....3. Sensitivity analysis We present a statistical analysis of many runs of our al- gorithms, each run with a different seed to the random num- ber generator. Table5 summarizes the results of this exper- iment. In row 1, we present the statistics for the Lazarus count of the total order found by Algorithm A1.... ..."

### Table 3. Breakdown of the Performance of the Robust Algorithm

### Table 4. Breakdown of the Performance of the Sped-Up Robust Algorithm

"... In PAGE 10: ...2.3 Performance The matching performance of the sped-up robust algorithm is shown in Table4 . Interestingly, the performance is actually better than the robust version.... ..."

### Table 10: Robustness of Algorithm-B for an AVERAGE Query on the synthetic database

"... In PAGE 16: ... Each test was repeated 10 times and the results averaged. Relative errors are tabulated in Table10 It is observed that the relative errors are stable and thus the function is reliable. Algorithm-C on the solar data set Algorithm-C was tested using a 50-term Fourier series.... ..."

### Table 11: Robustness of Algorithm-C for an AVERAGE query

"... In PAGE 17: ... We then repeated the tests with the sliding windows. The results are tabulated in Table11 . One can truly observe that the relative errors are very small and stable.... ..."

### Table 1 A robust FRLS algorithm.

"... In PAGE 2: ... This method works very well in all of the simulations that have been done. In Table1 , we give a robust FRLS algorithm with a com- plexity of O(7L). One other important part of the algorithm is the estimate of the scale factor s.... ..."

### Table 1: Listing of the proposed robust filtering algorithm in prediction form

"... In PAGE 3: ... This problem can be seen to be the robust version of (27) in the same way that (4){(5) is the robust version of (1). Now (31) can be written more com- pactly in the form (4){(5) with the identi cations: x ; colfx i ; ^ x iji ;; u i g;; b ; y i+1 ; H i+1 F i ^ x iji A ; H i+1 M i i E f;;i E g;;i b ; ;H i+1 M i i E f;;i ^ x iji ;; Q ; (P ;1 iji Q ;1 i ) W ; R ;1 i+1 ;; H ; H i+1 M i ;; E a ; E f;;i E g;;i E b ; ;E f;;i ^ x iji ;; ; i ;; A ; H i+1 F i G i This leads, after some algebra, to the equations shown in Table1 where we de ned 2 l;;i = kM T i H T i+1 R ;1 i+1 H i+1 M i k (32) The major step in the algorithm of Table 1 is step 3, which consists of recursions that are very similar in nature to the prediction form of the Kalman lter. The main di erence is that the new recursions operate on modi ed parameters rather than on the given nominal values.... In PAGE 3: ... This problem can be seen to be the robust version of (27) in the same way that (4){(5) is the robust version of (1). Now (31) can be written more com- pactly in the form (4){(5) with the identi cations: x ; colfx i ; ^ x iji ;; u i g;; b ; y i+1 ; H i+1 F i ^ x iji A ; H i+1 M i i E f;;i E g;;i b ; ;H i+1 M i i E f;;i ^ x iji ;; Q ; (P ;1 iji Q ;1 i ) W ; R ;1 i+1 ;; H ; H i+1 M i ;; E a ; E f;;i E g;;i E b ; ;E f;;i ^ x iji ;; ; i ;; A ; H i+1 F i G i This leads, after some algebra, to the equations shown in Table 1 where we de ned 2 l;;i = kM T i H T i+1 R ;1 i+1 H i+1 M i k (32) The major step in the algorithm of Table1 is step 3, which consists of recursions that are very similar in nature to the prediction form of the Kalman lter. The main di erence is that the new recursions operate on modi ed parameters rather than on the given nominal values.... In PAGE 3: ... 
Likewise, in the case E T f;;i E g;;i =0,we obtain the same sim- pli cations for f b G i ;; b F i g while b Q i becomes b Q i = ; Q ;1 i + ^ i E T g;;i E g;;i ;1 : In both cases, the recursion for P i becomes a standard Riccati recursion. In the work [7], we have further pre- sented alternativeequivalent implementations of the robust lter of Table1 in information form and in time- and... In PAGE 3: ...Table 1: Listingoftheproposed robust ltering algorithm in prediction form. Observe further that the algorithm of Table1 requires, at each iteration i, the minimization of G( )over ( l;;i ;; 1). It turns out that a reasonable approximation that avoids these repeated minimizations is to choose ^ i = (1 + ) l;;i : (33) That is, we set ^ i at a multiple of the lower bound | if the lower bound is zero, we set ^ i to zero and replace ^ ;1... In PAGE 4: ... 4. STEADY-STATE RESULTS Wenow examine the steady-state performance of the lter of Table1 when the model parameters are constant, say fF;;G;;H;;M;;E f ;;E g ;;Q;;Rg: [Only the contraction i is allowed to vary with time.] In particular we shall estab- lish that, under certain detectability and stabilizability assumptions, the steady-state lter is stable and that, in addition, for quadratically-stable models it guarantees a bounded error variance.... ..."
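The lower bound (32) and the approximation (33) quoted above can be computed directly for given model matrices. The sketch below assumes small dense matrices, an invertible R, and takes the spectral norm (an assumption, since the snippet does not specify which norm is meant):

```python
import numpy as np

def lambda_lower_bound(M, H, R):
    """lambda_{l,i} = || M^T H^T R^{-1} H M ||  (eq. 32),
    taken here as the spectral norm (largest singular value)."""
    A = M.T @ H.T @ np.linalg.inv(R) @ H @ M
    return np.linalg.norm(A, 2)

def lambda_hat(M, H, R, alpha=0.5):
    """Approximation (33): set lambda_hat at a multiple (1 + alpha)
    of the lower bound, avoiding repeated minimizations of G(lambda)."""
    return (1.0 + alpha) * lambda_lower_bound(M, H, R)

# Toy dimensions: here M^T H^T R^{-1} H M = 0.5 I, so the bound is 0.5.
M = np.eye(2)
H = np.eye(2)
R = 2.0 * np.eye(2)
```

The hypothetical `alpha` parameter stands in for the unspecified multiple in (33); any positive value keeps `lambda_hat` strictly above the lower bound, as the algorithm requires.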

### Table 2: Listing of the robust filtering algorithm in prediction form

2001

"... In PAGE 7: ...7 in some useful special cases. Table2 summarizes the prediction form of the robust algorithm. Assumed uncertain model.... ..."

Cited by 9

### Table 1: FLOP count for each of the key steps of the robust Algorithm 4.2.

2005

"... In PAGE 18: ... In consideration of this, the paper will profile computational requirement via FLOP load per it- eration, and for this purpose a detailed audit of the FLOP count for the key stages of the robust EM Algorithm 4.2 are provided in Table1 (Recall m is the number of inputs, p is the number of outputs, n is the model state dimension, and N is the data length) where the FLOP counts provided are for the computation of all N quantities such as P 1=2 tjt where necessary. Assuming the typical case of state... ..."

Cited by 5

### Table 1. The simulation study to test the robustness of the EM algorithm

2006

"... In PAGE 3: ... Four isoforms can be generated from this gene: the isoform with all exons, the isoform without exon 2, the isoform without exon 9, and the isoform without exon 2 and 9 (see Table 1). We set certain fixed probabilities for these isoforms (see Table1 ). We generated a simulated expressed sequence using the following three-step procedure: (i) randomly sam- ple the four isoforms, to generate a full-length mRNA, (ii) randomly sample the empirical distribution of the length of ESTs (taken from human UniGene data, see Supplement- ary Data), to decide the length of the simulated expressed sequence and (iii) randomly truncate the simulated mRNA according to the length obtained from the previous step, to make the simulated expressed sequence.... In PAGE 7: ... (A) Simulation studies using fixed probabilities of four isoforms. The probabilities are listed in Table1 . (B) A simulation study using randomized probabilities of four isoforms.... ..."